Gaze tracking is a difficult technology to master, even though its potential applications are hugely important. The ability to track eye movements automatically could help people living with disorders that affect the motor system. Predicting and tracking a person’s gaze, however, poses some unique challenges. Many variables must be measured, including head position, head tilt, distance from the screen, lighting, background noise, eye rotation, and facial coverings such as glasses. People who need such technology are often left paying high prices for existing sensor-based hardware. Microsoft wants to create an affordable software solution that can match these devices. In the past, software approaches have been limited by interference from lighting. The new system delivers the best results reported on GazeCapture so far: an error of 1.8073 centimeters on the dataset, which includes data from over 1,450 people. This result was achieved without calibration or other fine-tuning.
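For context, accuracy on GazeCapture is typically reported as the mean Euclidean distance, in centimeters, between the predicted on-screen gaze point and the true one. A minimal sketch of that metric follows; the function name and example values are illustrative, not taken from Microsoft's code:

```python
import numpy as np

def mean_gaze_error_cm(predicted, actual):
    """Mean Euclidean distance between predicted and true gaze points.

    predicted, actual: arrays of shape (n_samples, 2) holding (x, y)
    gaze locations in centimeters, as in the GazeCapture benchmark.
    """
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    # Per-sample Euclidean distance, then the average across samples.
    return float(np.linalg.norm(predicted - actual, axis=1).mean())

# Made-up example: two predictions, each 1 cm off along one axis.
print(mean_gaze_error_cm([[1.0, 2.0], [0.0, 0.0]],
                         [[1.0, 3.0], [1.0, 0.0]]))  # -> 1.0
```

Under this metric, a score of 1.8073 means the model's gaze estimate lands, on average, about 1.8 cm from where the user is actually looking.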
Building on Eye Control
Microsoft Research has been working on gaze tracking for some time, and Windows 10 already includes Eye Control. The project was born at Microsoft's first Hackathon event, in 2014. One of the challenges posed at the event came from former NFL player Steve Gleason. Living with ALS, Gleason challenged researchers to find a way for him to use software despite his inability to move or speak. The result was the Eye Gaze Wheelchair, which gave Gleason the ability to steer a wheelchair by looking at points on a Surface device. Microsoft was so impressed by the technology that the company formed a research team around it. Eye Control, which evolved from the Eye Gaze technology, allows users to navigate Windows 10 through eye movements. The new AI model extends what Eye Control can do: it is more accurate, and it works across devices rather than requiring specialized hardware. Moving forward, the team aims to design custom neural network architectures to further improve the system's accuracy.