
AI-powered smartglasses track gaze, facial expressions for VR/AR headsets

Apr. 11, 2024
2 min. read

Google Glass Edition 3? Maybe not. :)

About the Writer

Amara Angelica


I admit: I was a Google Glass wearer

GazeTrak is the first eyeglass-based gaze-tracking system that relies on acoustic signals. (credit: Jason Koski/Cornell University)

Cornell University researchers have developed two technologies that track a person’s gaze and facial expressions, using sonar-like sensing to improve communication.

Mounted on an eyeglass frame, the mixed-reality (MR) technology is small enough to fit on commercial smartglasses or on virtual-reality and augmented-reality headsets such as the Apple Vision Pro or Meta Quest. The design consumes significantly less power than similar camera-based tools, the researchers say.

Speakers and microphones mounted on the frame bounce inaudible, sonar-like soundwaves off the wearer’s face and pick up the reflected signals produced by face and eye movements.

GazeTrak

GazeTrak is the first eye-tracking system that relies on acoustic signals, continuously and accurately inferring where the wearer is looking; its companion system, EyeEcho, detects facial expressions and recreates them in an avatar in real time. Together, the detailed gaze and expression data could improve interactions with other users.

“It’s small, it’s cheap and super low-powered, so you can wear it on smartglasses every day—it won’t kill your battery,” said Cheng Zhang, an assistant professor of information science who directs the Smart Computer Interfaces for Future Interactions (SciFi) Lab, which created the new devices.

GazeTrak has a speaker and four microphones positioned around the inside of each eye frame of the glasses. They bounce soundwaves off the eyeball and the area around the eyes, pick up the reflections, and feed the resulting signals into a customized deep-learning pipeline that continuously infers the direction of the person’s gaze.
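To make the sensing idea concrete, here is a minimal sketch, assuming a commodity 48 kHz audio chain and an inaudible 18–21 kHz sweep; the helper names and parameters are illustrative, not the researchers’ implementation. It shows only the first step: turning a reflected chirp into an “echo profile” whose peaks correspond to reflections at different distances.

```python
# Minimal sketch of acoustic echo-profile sensing (illustrative only; not the
# researchers' code). The chirp band, sample rate, and helper names are assumptions.
import numpy as np

FS = 48_000          # sample rate (Hz), typical for commodity audio hardware
CHIRP_MS = 12        # duration of one inaudible sweep

def make_chirp(f0=18_000, f1=21_000, fs=FS, ms=CHIRP_MS):
    """Linear frequency sweep above the audible range (assumed band)."""
    t = np.arange(int(fs * ms / 1000)) / fs
    freq = f0 + (f1 - f0) * t / t[-1]
    return np.sin(2 * np.pi * freq * t)

def echo_profile(mic_signal, chirp):
    """Cross-correlate the received signal with the emitted chirp.
    Peaks correspond to reflections at different distances (eyeball, eyelid, skin)."""
    corr = np.correlate(mic_signal, chirp, mode="valid")
    return np.abs(corr)

# Toy demo: a single 'reflection' delayed by 40 samples, plus noise.
chirp = make_chirp()
delay = 40
received = np.concatenate([np.zeros(delay), 0.3 * chirp, np.zeros(200)])
received += 0.01 * np.random.randn(received.size)

profile = echo_profile(received, chirp)
print("strongest echo at sample offset:", int(np.argmax(profile)))  # ~40

# In GazeTrak, profiles from the four microphones around each eye would be
# stacked over time and fed to a deep-learning model that regresses gaze
# direction; that learned model is omitted here.
```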

EyeEcho

Complementing GazeTrak, EyeEcho has an ultrasound speaker and a microphone located next to the glasses’ hinges, pointing downward to catch skin movement as facial expressions change. These reflected signals are also interpreted by AI.



EyeEcho continuous facial-expression tracking on glasses (credit: Ke Li et al.)
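As a rough illustration of the final step, the sketch below maps a window of echo profiles to avatar blendshape weights with a tiny placeholder model; the blendshape names, window size, and network shape are assumptions, not the EyeEcho authors’ architecture, and the weights here are untrained.

```python
# Illustrative sketch only: mapping echo-profile features to avatar blendshapes.
# The blendshape set, feature sizes, and model are assumptions, not EyeEcho's design.
import numpy as np

rng = np.random.default_rng(0)

BLENDSHAPES = ["jaw_open", "smile_left", "smile_right", "brow_raise"]  # assumed set
PROFILE_LEN = 241    # length of one echo profile (see the previous sketch)
WINDOW = 10          # consecutive profiles per prediction

# Placeholder, untrained weights; in a real system these would be learned
# from paired audio and ground-truth expression data.
W1 = rng.normal(scale=0.01, size=(PROFILE_LEN * WINDOW, 64))
W2 = rng.normal(scale=0.01, size=(64, len(BLENDSHAPES)))

def predict_blendshapes(profiles: np.ndarray) -> dict[str, float]:
    """profiles: (WINDOW, PROFILE_LEN) stack of recent echo profiles."""
    x = profiles.reshape(-1)
    h = np.tanh(x @ W1)                      # tiny two-layer MLP
    out = 1 / (1 + np.exp(-(h @ W2)))        # blendshape weights in [0, 1]
    return dict(zip(BLENDSHAPES, out.round(3)))

# Toy usage: random 'profiles' stand in for real microphone data.
fake_profiles = rng.random((WINDOW, PROFILE_LEN))
print(predict_blendshapes(fake_profiles))
```

In practice such a model would be trained on paired acoustic and expression data, and its outputs would drive the avatar rig, for example during a hands-free video call.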

Imaginative Uses

With this new technology, users can make hands-free video calls through an avatar, even in a noisy café or on the street. While some smartglasses can recognize faces or distinguish between a few specific expressions, none currently track expressions continuously the way EyeEcho does, the researchers say.

GazeTrak could also be used with screen readers to read out portions of text for people with low vision as they read a website or book.

GazeTrak and EyeEcho could also help diagnose or monitor neurodegenerative diseases such as Alzheimer’s and Parkinson’s, in which patients often have abnormal eye movements and less expressive faces. The technology could track the progression of these diseases at home or in a clinical setting.

Ke Li will present GazeTrak at the Annual International Conference on Mobile Computing and Networking (MobiCom), May 11–16, and EyeEcho at the Association for Computing Machinery’s CHI Conference on Human Factors in Computing Systems in May.

Citations: Ke Li et al., EyeEcho: Continuous and Low-power Facial Expression Tracking on Glasses, arXiv: https://arxiv.org/html/2402.12388v1. Ke Li et al., GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame, arXiv: https://arxiv.org/html/2402.14634v2.

