Mobile & Gadgets:

Meta adds ‘Live AI’ features to Ray-Ban smart glasses

Meta announced a new software update for its Ray-Ban smart glasses that adds a “Live AI” feature, which can use a live video feed to gather context for answering questions, similar to Google’s Project Astra.

The v11 software update rolling out to Ray-Ban Meta smart glasses introduces several new features.

It includes Shazam integration, which lets users ask the glasses “Hey Meta, what’s this song?” and have the answer read aloud. The feature will be available in the US and Canada.

Meta is also introducing new AI features, and they look enticing. The first is “Live AI,” which lets the Ray-Ban Meta glasses capture video that the AI then uses to offer “real-time, silent assistance” with whatever you’re actively doing.

Ultimately, Meta says, this data will be used to make suggestions before you even ask.

First, live AI adds video to Meta AI in your glasses. During a live AI session, Meta AI can continuously see what you see and converse with you more naturally than ever before. Get real-time, silent help and inspiration with everyday activities like cooking, gardening, or exploring a new neighborhood. You can ask questions without saying “Hey Meta,” refer back to what you discussed earlier in the session, and interrupt at any time to ask additional questions or change topics. Finally, live AI will make helpful suggestions at the right time, before you even ask.

“Live Translation,” meanwhile, will translate speech in real time: the other person’s speech comes through the glasses in English (and is also recorded on your phone). It works with Spanish, French, and Italian.

For now, Meta will roll out these features through a waitlist, and only in the US and Canada.

Google is working on something similar.

At Google I/O 2024 in May, the company demonstrated “Project Astra,” a new AI project that can use a live video feed to gather context and then answer questions based on what it sees. Google has teased Astra running on glasses, but hasn’t unveiled any hardware yet. Earlier this month, the Gemini 2.0 announcement saw Google detail new Astra updates, including chatting in multiple languages, storing up to 10 minutes of memory, improved latency, and more. It’s unclear how Meta’s “Live AI” will compare, but it’s certainly exciting to see this functionality arrive so soon, especially since we won’t see Google’s version fully implemented until next year.
