
Revolutionary Eye-Tracking Method Enhances Precision Using 3D Imaging

Posted: Wed Apr 09, 2025 2:39 pm
by konerto
Eye tracking is a vital technology across numerous fields, including virtual and augmented reality, entertainment, medical diagnostics, behavioral sciences, automotive safety, and industrial engineering. Yet, achieving high-precision tracking of eye movements remains a major challenge.

Researchers at the University of Arizona's Wyant College of Optical Sciences have introduced a breakthrough approach that could redefine the capabilities of current eye-tracking systems. Their findings, published in Nature Communications, demonstrate that combining deflectometry—a powerful 3D imaging technique—with advanced computational methods dramatically improves tracking accuracy.

"Traditional eye-tracking techniques typically rely on data from a small number of surface points—around a dozen—to determine gaze direction," said Florian Willomitzer, associate professor of optical sciences and principal investigator of the study. "Our deflectometry-based approach can extract information from over 40,000 surface points—and potentially millions—from just a single image."

This dramatic increase in data resolution leads to much more accurate gaze estimations, a key advantage for applications like next-generation VR headsets. "We’ve shown that our method increases the number of usable data points by more than a factor of 3,000 compared to conventional approaches," added Jiazhang Wang, postdoctoral researcher and lead author of the study.

Deflectometry is traditionally used to scan the surface of highly reflective optics, such as telescope mirrors, for minute imperfections. The Arizona team has extended this technique into a new domain they call "computational deflectometry," which applies precise surface measurement tools combined with computer vision algorithms to analyze not only optics but also human eyes, artworks, and skin lesions.

"By pairing measurement precision with computation, we enable machines to 'see the unseen'—essentially providing them with superhuman vision," said Willomitzer.

To test their approach, the researchers conducted experiments using both real human subjects and a lifelike artificial eye model. They estimated gaze direction with errors of only 0.46 to 0.97 degrees in human subjects, and as low as 0.1 degrees on the artificial model.

Rather than relying on a few infrared light sources, as current systems do, this technique uses a display screen to project structured light patterns. Each pixel on the screen can serve as a virtual point light source, allowing the system to detect detailed deformations in the patterns as they reflect off the eye's surface.
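To make "structured light patterns" concrete, the sketch below (Python with NumPy) generates standard phase-shifted sinusoidal fringes for a display and recovers the fringe phase at each camera pixel from the captured reflections. This is a generic phase-shifting recipe commonly used in deflectometry, not necessarily the authors' exact pattern design; the period and step count are illustrative.

[code]
import numpy as np

def make_fringe_patterns(width, height, period_px=32, n_steps=4):
    # N phase-shifted horizontal sinusoidal fringes for the display;
    # each screen pixel acts as a point source whose column is encoded
    # in the fringe phase it emits.
    x = np.arange(width)
    patterns = []
    for k in range(n_steps):
        shift = 2.0 * np.pi * k / n_steps
        row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period_px + shift)
        patterns.append(np.tile(row, (height, 1)))
    return patterns

def decode_phase(captured):
    # Standard N-step phase-shifting: recover the wrapped fringe phase
    # at every camera pixel from the N images of the pattern reflected
    # off the eye's surface.
    n = len(captured)
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(s) for img, s in zip(captured, shifts))
    den = sum(img * np.cos(s) for img, s in zip(captured, shifts))
    return np.arctan2(-num, den)  # wrapped phase in (-pi, pi]
[/code]

The recovered phase tells the system which screen column illuminated each camera pixel; unwrapping it, and repeating with vertical fringes, yields a dense pixel-to-screen correspondence, which is where the tens of thousands of surface samples per image come from.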

By analyzing these deformations, the researchers reconstruct dense 3D surface maps of the cornea and sclera. They then use these maps in combination with known geometric properties of the eye to accurately determine the direction of gaze.
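As a rough illustration of that last step, the sketch below fits a sphere to a hypothetical corneal point cloud by linear least squares and takes the optical axis as the ray from an assumed eyeball rotation center through the fitted corneal center. This is a deliberately simplified geometric eye model, not the optimization described in the paper.

[code]
import numpy as np

def fit_sphere(points):
    # Linear least-squares sphere fit to an (N, 3) point cloud:
    # |p - c|^2 = r^2 rewritten as 2 p.c + (r^2 - |c|^2) = |p|^2.
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def gaze_direction(cornea_points, rotation_center):
    # Optical axis approximated as the ray from the eye's rotation
    # center (a calibrated, per-user quantity in practice) through the
    # center of the fitted corneal sphere.
    cornea_center, _ = fit_sphere(cornea_points)
    axis = cornea_center - rotation_center
    return axis / np.linalg.norm(axis)
[/code]

In a real system, both the eye model and the rotation center would come from a per-user calibration; the dense surface map simply gives this fit thousands of measurements instead of a dozen glint positions.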

In earlier work, the team proposed that this method could be integrated into AR and VR headsets by using built-in patterns in the headset frame or even elements of the display content itself—such as images or videos—as the light source. This integration could simplify the system design significantly.

Future iterations may also employ infrared light rather than visible light, allowing the system to operate without visually interfering with the user experience. "We use stereo-deflectometry along with advanced surface optimization algorithms," Wang said. "This means we don’t have to rely on rigid assumptions about eye shape or structure, which can vary between individuals."
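The geometric core behind such measurements is the law of reflection: for a specular surface such as the tear film, the normal at a surface point bisects the direction back to the camera and the direction toward the screen pixel that illuminated it. The sketch below encodes just that relation, assuming a candidate surface point is given; determining where along each camera ray the true surface point lies is the classic deflectometry ambiguity that the stereo setup and surface optimization are there to resolve.

[code]
import numpy as np

def reflection_normal(surface_point, camera_pos, screen_point):
    # Law of reflection for a specular surface: the surface normal is
    # the bisector of the unit direction back to the camera and the
    # unit direction toward the screen pixel that lit the point.
    to_camera = camera_pos - surface_point
    to_screen = screen_point - surface_point
    n = (to_camera / np.linalg.norm(to_camera)
         + to_screen / np.linalg.norm(to_screen))
    return n / np.linalg.norm(n)
[/code]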

An added benefit of the technique is its ability to produce detailed surface reconstructions of the eye in real time, potentially opening the door to new applications in eye health diagnostics and personalized vision correction.

With this innovation, the University of Arizona team is laying the foundation for a new generation of ultra-accurate, low-complexity eye-tracking systems that could impact both consumer technology and clinical tools.