Samsung’s new Galaxy phones lay the groundwork for headsets and glasses to come

Samsung and Google are working on an Apple Vision Pro-like mixed reality VR headset with Android XR and Google Gemini. We already know that, and I even got a demo of it last year. But Samsung also revealed a little more at its phone-focused winter Samsung Unpacked event: specifically, a joint Google-Samsung AI ecosystem collaboration that could be the missing piece that unites everything. This AI-infused experience will arrive on a next-gen VR/AR headset this year, but expect it to be up and running on Galaxy S25 phones as well, and on the glasses that will connect to them.
In a sense, I already got a preview of what the future holds at the end of last year.
Samsung’s vision connects its products via AI, and that AI is now becoming consistent across devices.
An AI that sees, working in real time
Samsung briefly addressed the upcoming VR/AR headsets and glasses at its latest Unpacked event, but we largely know what already exists. More telling, Samsung’s demonstration of real-time AI that can see things on your phone or through cameras is exactly the trend we expect to arrive in 2025.
Project Moohan (which means “infinity” in Korean) is a VR headset with passthrough cameras that blend the virtual and the real, much like the Vision Pro or Meta’s Quest 3. The design feels a lot like Meta’s discontinued Quest Pro, but with much better specs. The headset has hand and eye tracking, runs Android apps via an Android XR OS that will be fully revealed later this year, and uses Google Gemini AI as an assistance layer throughout. Google’s Project Astra technology, which enables real-time assistance on glasses, phones and headsets, is debuting on the Samsung Galaxy S25 series of phones, but I’ve already seen it in action on my face.
My demos last year let me use Gemini to help me as I looked around a room, watched YouTube videos or did basically anything else. The live AI needed to be started in that live mode first; then it could see and hear whatever I was seeing or hearing. There were also pause modes to temporarily stop the live assistance.
Samsung showed off what look like several real-time AI features on the Galaxy S25 phones, and more have been promised. I expect it will be able to work while watching YouTube videos, as my Android XR demo did. And according to Samsung and Google executives working on Android XR, it could also be used for live help while playing games.
Gemini’s live visual recognition skills may begin to feel the same between glasses and phones.
Better battery life and processing…for glasses?
Samsung and Google have also confirmed they’re working on smart glasses, also using Gemini AI, to compete with Meta’s Ray-Bans and a wave of other emerging glasses. AR glasses are apparently in the works, too.
While Project Moohan is a standalone VR headset with its own battery pack and processors, like Apple’s Vision Pro, the smaller smart glasses Google and Samsung are working on, and any glasses after that, will rely on connections to phones for their processing assistance. That’s how smart glasses like Meta’s Ray-Bans already work.
But more functionality could mean the need for more intensive phone processing. Live AI could become an increasingly used feature, relying on phones working continuously to assist these glasses. Better processing, better graphics and, most importantly, improved battery life and cooling sounded to me like ways to make these phones better pocket computers for eventual glasses.
Pools of personal data are what Samsung and Google will tap into to drive smarter AI assistants on both glasses and phones.
The personal data these AI gadgets need
Samsung also announced a vague-sounding Personal Data Engine that Google’s and Samsung’s AI will tap into, bucketing personal data into one place where AI could potentially develop richer conclusions and connections across everything that’s part of your life.
How it works, how it’s secured or where its limits are wasn’t very clear. But it sounds like a repository of personal data that Samsung and Google’s AI can train on and work with across extended connected products, including watches, rings and glasses.
Camera-enabled AI wearables are only as good as the data that can help them, which is why so many of these devices now feel awkward and clumsy to use, including Meta’s Ray-Bans in their AI modes. These AI devices usually hit a wall when it comes to knowing things your existing apps already know better. Google and Samsung are clearly trying to solve that.
Will you want to trust Google and Samsung with this process, or someone else? How will these phones, and future glasses, make the relationship between AI and our data clearer and more manageable? It feels like we’re seeing one shoe drop here, with more to come: Google’s I/O developer conference will likely discuss Android XR and Gemini developments in much more depth.
Samsung has made Project Moohan its first headset, with glasses to follow. Expect Google to go into more detail alongside Samsung at the developer-focused Google I/O conference around May or June, and possibly a fuller picture in the summer at Samsung’s next Unpacked event. Then we may know a lot more, because this seemingly boring new wave of Galaxy S25 phones could be building an infrastructure that plays out in clearer detail at the end of the year … or even later.