Want to add a display to your shades? Meta thinks the answer is yes.
Meta plans to incorporate display technology into its Ray-Ban smart glasses by 2025, according to the Financial Times, aiming to enhance the user experience and solidify its position in the competitive wearables market. The initiative seeks to challenge the smartphone’s dominance as consumers’ primary computing device and comes as rivals, including Apple and Google, actively develop similar products.
Meta to add display tech to Ray-Ban smart glasses by 2025
The anticipated updates to the Ray-Ban glasses, developed in collaboration with EssilorLuxottica, will reportedly feature a small display used primarily for notifications and responses from Meta’s virtual assistant. Sources indicate that the upgraded versions could launch as early as the second half of 2025.
This move follows the successful release of the latest version of the Ray-Ban Meta glasses in September 2023. According to EssilorLuxottica CEO Francesco Milleri, these models have significantly outperformed earlier iterations, selling more units in their first few months than previous versions did over two years. The broader smart-glasses market also saw shipments grow a notable 73% in 2024.
In September, Meta showcased Orion, its augmented-reality glasses prototype, which pairs a compact design with advanced display technology. The Orion glasses can overlay 3D content onto the real world, a substantial advance over earlier AR attempts by companies like Google and Microsoft. Meta has accelerated Orion’s development following positive feedback from early tests, although consumer availability is still years away.
European Ray-Ban fans, meet your new AI-powered glasses
Additionally, Meta has introduced live AI features for its Ray-Bans, enabling real-time interactions and translations. Once activated, the AI provides an “always-aware” experience, answering questions and offering information about the user’s surroundings. The feature is still in early beta, however, with limitations around connectivity and accuracy. Translation currently supports only a handful of languages, and users must download specific language packs for it to work.
Despite these advances, Meta faces challenges in building fashionable AR glasses with adequate hardware performance and battery life at an acceptable price. Meta’s chief technology officer, Andrew Bosworth, recently noted that the feasibility of integrating live AI and heads-up displays depends on overcoming these obstacles. He indicated that gesture control via a wristband or similar input device could arrive alongside future display-equipped models.
Looking ahead, Meta’s continued investment in AI could yield more integrated and versatile devices that combine capabilities such as fitness tracking and VR. Gesture-recognition systems could also improve how users interact with smart glasses; the current Ray-Bans do not support pointing gestures, which limits their functionality.
Featured image credit: Meta