This incredible human-eye-like scientific breakthrough could lead to safer self-driving cars and better smartphone cameras

As you’re reading this, your eyes are most likely scanning slowly from left to right. But even when you’re not reading or looking at a fixed object, your eyes are constantly on the move – and this, it turns out, is the key to the quality of human vision, and to how robots, self-driving cars, and maybe even smartphones could see more clearly.
A team of University of Maryland researchers created a camera that mimics human eye movements. Called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), it uses a rotating wedge prism – a round prism with one sharply angled face – spinning in front of an event camera, in this case an Intel RealSense D435, to shift the image around.
Even though the movements are small, they’re meant to mimic the tiny fixational movements of the human eye, which come in three types: rapid, small tremors; slower drift; and microsaccades, which happen multiple times per second and are small enough to be imperceptible to us.
This last movement may be what helps us see most clearly, especially with moving objects: each tiny shift repositions the image onto the most sensitive part of the retina, replacing blur with shape and color.
Drawing on this understanding of how micro-movements aid human perception, the team equipped its camera with the rotating prism.
According to the paper’s abstract, “Inspired by microsaccades, we designed an event-based perception system capable of simultaneously maintaining low reaction time and stable texture. In this design, a rotating wedge prism was mounted in front of the aperture of an event camera to redirect light and trigger events.”
Researchers paired the hardware solution with software that could compensate for the movement and combine captured images for a stable and clear image.
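The compensation idea can be sketched in code. This is a hypothetical illustration, not the authors’ actual algorithm: if the prism rotates at a known rate and shifts the whole image along a circle of known radius, each event the camera reports can be “de-rotated” by subtracting that known shift. All names and parameters below (`PRISM_HZ`, `SHIFT_PX`, `compensate`) are assumptions for the sketch.

```python
import math

# Assumed parameters of the rotating wedge prism (hypothetical values):
PRISM_HZ = 50.0   # rotation rate of the prism, in revolutions per second
SHIFT_PX = 4.0    # radius of the circular image shift it induces, in pixels

def compensate(events, prism_hz=PRISM_HZ, shift_px=SHIFT_PX):
    """Map raw (x, y, t) events to motion-compensated coordinates.

    Because the prism's motion is known exactly, the circular shift it
    adds to the image can be subtracted back out, leaving a stable view.
    """
    out = []
    for x, y, t in events:
        phase = 2.0 * math.pi * prism_hz * t   # prism angle at time t
        dx = shift_px * math.cos(phase)        # shift along x at time t
        dy = shift_px * math.sin(phase)        # shift along y at time t
        out.append((x - dx, y - dy, t))
    return out

# A single static scene point, observed at four phases of one rotation:
raw = [(100 + SHIFT_PX * math.cos(2 * math.pi * PRISM_HZ * t),
        50 + SHIFT_PX * math.sin(2 * math.pi * PRISM_HZ * t),
        t)
       for t in (0.000, 0.005, 0.010, 0.015)]

stable = compensate(raw)
# After compensation, every event maps back to the same point (100, 50).
```

The key property the sketch shows is that the prism adds motion the software can fully predict – so the system gains the event-triggering benefits of constant movement without sacrificing a stable image.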
According to a report in Science Daily, the experiments were so successful that AMI-EV-equipped cameras detected everything from quickly moving objects to the human pulse. That’s some precise vision.
Making robotic eyes see more like human ones offers the potential not only for robots that share our vision skills but also, for instance, for self-driving cars that could finally distinguish between people and other objects. There is already evidence that self-driving cars struggle to identify some humans. A self-driving Tesla equipped with an AMI-EV camera might be able to tell the difference between a bag blowing by and a child running into the street.
Equipped with AMI-EV cameras, mixed-reality headsets, which use cameras to combine real and virtual worlds, might do a better job of blending them for a more realistic experience.
“…it has many applications that much of the general public already interacts with, like autonomous driving systems or even smartphone cameras. We believe that our novel camera system is paving the way for more advanced and capable systems to come,” Yiannis Aloimonos, a professor of computer science at UMD and the study’s co-author, told Science Daily.
These are early days, and the hardware looks more like something you’d put in an engine than the ultra-tiny and thin camera you might need for the best smartphone.
Still, the realization that something we can’t see happening is responsible for what we can see and how that small but critical vision capability can be replicated in robotic cameras is a significant step on the path to a future where robots match human visual perception.