For the past two weeks, I’ve been using a new camera to secretly snap photos and record videos of strangers in parks, on trains, inside stores and at restaurants. (I promise it was all in the name of journalism.) I wasn’t hiding the camera, but I was wearing it, and no one noticed.
I was testing the recently released $300 Ray-Ban Meta glasses that Mark Zuckerberg’s social networking empire made in collaboration with the iconic eyewear maker.
The high-tech glasses include a camera for shooting photos and videos, and an array of speakers and microphones for listening to music and talking on the phone.
The glasses, Meta says, can help you “live in the moment” while sharing what you see with the world. You can livestream a concert on Instagram while watching the performance, for instance, as opposed to holding up a phone.
That’s a humble goal, but it is part of a broader ambition in Silicon Valley to shift computing away from smartphone and computer screens and toward our faces.
Meta, Apple, and Magic Leap have all been hyping mixed-reality headsets that use cameras to allow their software to interact with objects in the real world.
On Tuesday, Mr. Zuckerberg posted a video on Instagram demonstrating how the smart glasses could use A.I. to scan a shirt and help him pick out a pair of matching pants. Wearable face computers, the companies say, could eventually change the way we live and work.
For Apple, which is preparing to release its first high-tech goggles, the $3,500 Vision Pro headset, next year, a pair of smart glasses that look nice and accomplish interesting tasks is the end goal.
For the past seven years, headsets have remained unpopular, largely because they are bulky and aesthetically off-putting. The minimalist design of the Ray-Ban Meta glasses represents how smart glasses might look one day if they succeed (though past lightweight wearables, such as Google Glass from a decade ago and the Spectacles sunglasses released by Snap in 2016, were flops).
Sleek, lightweight and satisfyingly hip, the Meta glasses blend effortlessly into everyday life. No one — not even my editor, who was aware I was writing this column — could tell them apart from ordinary glasses, and everyone was blissfully unaware of being photographed.
After wearing the Ray-Ban Meta glasses practically nonstop this month, I was relieved to remove them. While I was impressed with the comfortable, stylish design of the glasses, I felt bothered by the implications for our privacy.
I’m also concerned about how smart glasses may broadly affect our ability to focus. Even when I wasn’t using any of the features, I felt distracted while wearing them. But the main problem is that the glasses don’t do much we can’t already do with phones.
Meta said in a statement that privacy was top of mind as it designed the glasses. “We know if we’re going to normalize smart glasses in everyday life, privacy has to come first and be integrated into everything we do,” the company said.
I wore the glasses and took hundreds of photos and videos while doing all sorts of activities in my daily life — working, cooking, hiking, rock climbing, driving a car and riding a scooter — to assess how smart glasses might affect us going forward. Here’s how that went.