Meta AI’s experimental new smart glasses can see everything you do and even tell how you feel about it


- Meta is developing its Aria Gen 2 smart glasses, which come packed with sensors and AI features
- The smart glasses can track your gaze, movement, and even heart rate to gauge what’s happening around you and your feelings about it
- The smart glasses are currently being used to help researchers train robots and build better AI systems that could be incorporated into consumer smart glasses
The Ray-Ban Meta smart glasses are still relatively new, but Meta is already ramping up work on its new Aria Gen 2 smart glasses. Unlike the Ray-Bans, these glasses are research-only for now, but they pack enough sensors, cameras, and processing power that it seems inevitable some of what Meta learns from them will find its way into future consumer wearables.
Project Aria’s research-level tools, including the new smart glasses, are used by people working in computer vision, robotics, and related fields that blend contextual AI with neuroscience. The idea is for developers to use the glasses to devise better methods for teaching machines to navigate, contextualize, and interact with the world.
The first Aria smart glasses came out in 2020. The Aria Gen 2s are far more advanced in hardware and software. They’re lighter, more accurate, pack more power, and look much more like glasses people wear in their regular lives, though you wouldn’t mistake them for a standard pair of spectacles.
The four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so the glasses can tell both how far your coffee mug is from your keyboard and where a drone’s landing gear might be heading. And that’s just the beginning of the sensory equipment: there’s also an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pad that can estimate your heart rate.
Future facewear
There’s also plenty of eye-tracking technology, able to tell where you’re looking, when you blink, how your pupils change, and what you’re focusing on. It can even track your hands, measuring joint movement in a way that could help with training robots or learning gestures. Combined, the glasses can figure out what you’re looking at, how you’re holding an object, and if what you’re seeing is getting your heart rate up because of an emotional reaction. If you’re holding an egg and see your sworn enemy, the AI might be able to figure out you want to throw the egg at them, and help you aim it accurately.
As noted, these are research tools. They’re not for sale to consumers, and Meta hasn’t said whether they ever will be. Researchers have to apply for access, and the company is expected to start taking applications later this year.
But the implications are far larger. Meta’s plans for smart glasses go well beyond checking for messages. The company wants to link human interactions with the real world to machines, teaching them to do the same. Theoretically, those robots could look, listen, and interpret the world around them the way humans do.
It’s not going to happen tomorrow, but the Aria Gen 2 smart glasses prove it’s a lot closer than you might think. And it’s probably only a matter of time before some version of the Aria Gen 2 ends up for sale to the average person. You’ll have that powerful AI brain sitting on your face, remembering where you left your keys and sending a robot to pick them up for you.