Thursday, September 26, 2024

Meta glasses AI real-time video

Good progress, but I'm disappointed the translation isn't real-time.

The only difference between this and Google Translate is that it's hands-free and visual. However, we still have the translation lag. Hopefully, this is obvious enough that real-time translation is on the radar.

https://www.youtube.com/watch?v=l_QruJ0Kv9U

https://www.youtube.com/watch?v=AA7ZnWOHs8I


Ray-Ban Smart Glasses Updated With Real-Time AI Video, Reminders, and QR Code Scanning (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: Meta CEO Mark Zuckerberg announced updates to the company's Ray-Ban Meta smart glasses at Meta Connect 2024 on Wednesday. [...] Meta says its smart glasses will soon have real-time AI video capabilities, meaning you can ask the Ray-Ban Meta glasses questions about what you're seeing in front of you, and Meta AI will verbally answer you in real time. Currently, the Ray-Ban Meta glasses can only take a picture and describe it to you or answer questions about it, but the video upgrade should make the experience more natural, in theory at least. These multimodal features are slated to come later this year. In a demo, users could ask Ray-Ban Meta questions about a meal they were cooking, or about city scenes taking place in front of them. The real-time video capabilities mean that Meta's AI should be able to process live action and respond audibly. This is easier said than done, however, and we'll have to see how fast and seamless the feature is in practice. We've seen demonstrations of these real-time AI video capabilities from Google and OpenAI, but Meta would be the first to launch such features in a consumer product.

Zuckerberg also announced live language translation for Ray-Ban Meta. English-speaking users can talk to someone speaking French, Italian, or Spanish, and their Ray-Ban Meta glasses should be able to translate what the other person is saying into their language of choice. Meta says this feature is coming later this year and will include more languages later on. The Ray-Ban Meta glasses are also getting reminders, which will allow people to ask Meta AI to remind them about things they look at through the smart glasses. In a demo, a user asked their Ray-Ban Meta glasses to remember a jacket they were looking at, so they could share the image with a friend later on. Meta announced that integrations with Amazon Music, Audible, and iHeart are coming to its smart glasses. This should make it easier for people to listen to music on their streaming service of choice using the glasses' built-in speakers. The Ray-Ban Meta glasses will also gain the ability to scan QR codes and phone numbers. Users can ask the glasses to scan something, and the scanned content will immediately open on their phone with no further action required.
Zuckerberg also unveiled the company's prototype AR glasses codenamed Orion, which feature a 70-degree field of view, Micro LED projectors, and silicon carbide lenses that beam graphics directly into the wearer's eyes.