Ray-Ban Meta Smart Glasses have received substantial upgrades. Meta has introduced AI-powered camera capabilities, now in beta, that make its camera-equipped glasses considerably smarter, as Mark Zuckerberg highlighted in a recent Instagram video. The update also adds video calling features and a new frame design.
Ray-Ban Meta Glasses: A new chapter in AI wearables
The smart glasses now use the camera to provide visual insights: the built-in AI assistant can recognize objects, offer details about landmarks, and translate foreign languages for travelers.
Further additions to the Ray-Ban Meta Smart Glasses include improved Apple Music integration, multimodal AI capabilities, and deeper connectivity with WhatsApp and Messenger, which together let wearers broadcast what they see directly from their glasses.
The concept of multimodal AI enables the device’s AI assistant to process various types of input—such as images, audio, video, and text—all at once. This capability means the glasses can execute voice commands while simultaneously analyzing visuals in real time.
First teased shortly after launch, multimodal AI support is now rolling out to all users of the smart glasses. Meta is pairing familiar features with new ones, such as live video sharing through popular messaging platforms. Broadcasting live video via WhatsApp and Messenger is entirely hands-free, letting users share their immediate view without reaching for a device and adding a new dimension to video communication.
Meta wants to change how you see the world with AI
Priced at $329, the smart glasses feature a built-in ultra-wide 12 megapixel (MP) camera and Meta AI with Vision, which powers augmented reality (AR)-style functions such as translating foreign-language text simply by looking at it and identifying landmarks, both showcased by Mark Zuckerberg in an Instagram update earlier this year.
Live view sharing has practical uses ranging from showing product options at the supermarket to sharing breathtaking scenery on a hike or vacation. To start sharing, users need only double-tap the physical capture button on the glasses, even without syncing WhatsApp with the Meta View app.
The glasses also let users capture and send photos and videos by voice command, such as “Hey Meta, send a photo,” removing the need for a touchscreen or visual interface and keeping the experience streamlined and hands-free.
To access these new capabilities, owners of the Ray-Ban Meta Smart Glasses simply have to update their device through the Meta View app. According to a recent blog post by Meta, the rollout of these updates began on April 24, although they may not be immediately available to all users.
“We’re committed to building the next generation of AI-powered experiences across our apps and devices responsibly and safely. We’re routinely testing and retraining our models to help ensure that our AI features provide experiences that are safe and helpful.”
-Meta
Featured image credit: Meta