At the Meta Connect 2024 event, Meta CEO Mark Zuckerberg announced major updates to the company's smart glasses, produced in collaboration with Ray-Ban.
According to Zuckerberg, these smart glasses could become one of the most widely used gadgets of the future. During the event, he announced several new AI capabilities and features coming to the Ray-Ban Meta.
The new features include AI-powered video processing and live translation. The glasses can also scan QR codes and display reminder notifications. With new integrations for iHeartRadio and Audible, Meta aims to give Ray-Ban Meta users features they already know and love from their smartphones.
Meta says its smart glasses will soon have AI-connected video capabilities, meaning you can ask the Ray-Ban Meta glasses about what's in front of you and Meta's AI will respond verbally in real time. Right now, the glasses can only take a photo and describe it to you or answer questions about it, but the video upgrade should make the experience more natural, at least in theory. These multimodal features are slated to roll out later this year.
In a demo, users asked the Ray-Ban Meta questions about food being cooked and about city scenes unfolding in front of them.
However, this is easier said than done, and we will have to see how fast and seamless the feature is in practice. Zuckerberg also announced live language translation for the Ray-Ban Meta. English-speaking users can talk to someone who speaks French, Italian, or Spanish, and their glasses should translate what the other person is saying into their language of choice. Meta says the feature is coming later this year, with support for more languages to follow.