Meta has rolled out a significant update to its Ray-Ban Smart Glasses, introducing a “Hey Meta” voice prompt that gives users direct access to Meta’s AI assistant. With the prompt, users can ask the chatbot for help, identify objects and people in view, and carry out tasks hands-free.
The “Hey Meta” prompt is part of the glasses’ “live AI” capability. Users can ask Meta AI to identify people, objects, or places visible through the lenses, or request help with a problem and suggestions for alternatives. The assistant can also translate speech in real time between English, French, Italian, and Spanish, working through the glasses’ built-in microphone and open-ear speakers.
Users can also use voice commands to post on Instagram, send messages on Messenger, and play music through streaming platforms. The Ray-Ban Meta Smart Glasses have received several updates since launch, including integration with Apple Music, which can likewise be controlled by voice.
The latest update builds on last year’s introduction of Meta AI to the smart glasses, which brought a multimodal AI experience powered by the Llama model. Meta continues to expand the capabilities of its wearable, offering users a more comprehensive and integrated experience.
The features available through the “Hey Meta” voice prompt include:
- Object and scene identification: Users can ask Meta AI to identify people, objects, or places visible through the glasses.
- Real-time translation: Translates between English, French, Italian, and Spanish via the built-in microphone and open-ear speakers.
- Social media and messaging: Users can post on Instagram and send messages on Messenger using voice commands.
- Music playback: Play songs via music streaming platforms using voice commands.