Meta announced on its blog new features for its Ray-Ban smart glasses, including AI-powered live assistance and real-time translation, making the device far more versatile than a simple audio and video capture tool. To support this, the company has updated its Meta View application (see below), which lets customers configure the glasses from an iPhone or an Android device.
Live AI assistance
With the new live AI feature, the Ray-Ban glasses’ built-in camera allows the device to “see” what the user sees and engage in real-time dialogue. The smart assistant can offer help with tasks like cooking, gardening, or exploring unfamiliar neighborhoods, all completely hands-free. A small revolution that brings the glasses closer to being a truly useful accessory.
Notably, it is no longer necessary to say the wake phrase “Hey Meta” to activate the assistant. The AI can follow the context of previous questions, making conversations more natural and fluid. Meta also mentioned that the assistant could soon offer useful suggestions before the user even makes a request.
Real-time translation
The company’s glasses now support real-time translation between English and French, Spanish, or Italian. When someone speaks in one of these languages, the glasses deliver the English translation through the built-in speakers or display it on a connected smartphone, and vice versa. Again, this is a major development!
Music recognition with Shazam
The last major addition is song identification through Shazam integration. Users can simply ask, “What is this song?” to get its title. This integration is particularly interesting because Shazam is owned by Apple and deeply integrated into iOS. Meta therefore needed Apple’s agreement…
Beta program
All of these features are part of Meta’s Early Access program, available to users of the smart glasses in the United States and Canada. Sign-ups are open on the official Meta website, although spots are limited.
What about Apple Glass?
Meta’s progress in smart glasses goes hand in hand with rumors about Apple’s potential entry into this market. In October, Bloomberg’s Mark Gurman reported that Apple was working on its own smart glasses with Siri support, built-in cameras, and other features similar to Meta’s Ray-Ban models, without going as far as a standalone headset like the Vision Pro. The idea would be to launch a pair of lightweight glasses powered by the iPhone, at least at first. Apple Glass is expected within 3 to 5 years.
Download the free Meta View app