
New VR/AR interface uses AI to turn your skin into a touch-sensitive controller

Nov. 15, 2024
2 min read

A new interface method suggests that we might soon control virtual and augmented reality (VR/AR) environments simply by touching our own skin.

About the Writer

Giulio Prisco


Giulio Prisco is Senior Editor at Mindplex. He is a science and technology writer mainly interested in fundamental science and space, cybernetics and AI, IT, VR, bio/nano, and crypto technologies.

Imagine a future where the way you control virtual or augmented reality (VR/AR) isn’t with a device in your hand, but by touching your own skin. This idea is becoming a reality with a new technology called EgoTouch, developed by researchers at Carnegie Mellon University. EgoTouch uses artificial intelligence (AI) to turn your skin into a touch-sensitive controller.

AI helps the AR/VR headset recognize when and how you touch your skin. Before EgoTouch there was OmniTouch, which also let users control interfaces by touching their skin, but it required a special camera that was bulky and cumbersome.

EgoTouch, however, uses the cameras that are already built into AR/VR headsets. Here's how it works: when you touch your skin, the contact creates small shadows and subtle changes in how the skin looks. These are the cues the researchers' AI model is trained to look for when detecting touch. They collected training data with a special sensor on the finger and palm, which let the model learn different touches, such as light and hard presses, without anyone having to label each touch manually.
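
To make that idea concrete, here is a minimal sketch in Python (using PyTorch) of how a touch detector along these lines could be trained. The network shape, the 64x64 hand-crop input, and the use of a pressure threshold for labeling are illustrative assumptions, not the researchers' actual model; the point it shows is that sensor readings supply the labels automatically, so no manual annotation is needed.

# Hypothetical sketch: train a touch classifier from headset camera crops,
# using an on-finger sensor reading to label each frame automatically.
# Architecture, sizes, and thresholds are illustrative assumptions.
import torch
import torch.nn as nn

class TouchNet(nn.Module):
    """Tiny CNN that maps a 64x64 grayscale hand crop to a touch logit."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 1))

    def forward(self, x):
        return self.head(self.features(x))

def auto_label(pressure_readings, threshold=0.05):
    """Sensor-derived labels: a frame counts as 'touch' when the measured
    pressure exceeds a small threshold - no human labeling involved."""
    return (pressure_readings > threshold).float().unsqueeze(1)

model = TouchNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Stand-in data: random frames and random sensor readings.
frames = torch.randn(8, 1, 64, 64)   # camera crops around the touching finger
pressure = torch.rand(8)             # synchronized sensor values

optimizer.zero_grad()
logits = model(frames)
loss = loss_fn(logits, auto_label(pressure))
loss.backward()
optimizer.step()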

Simple hand controls for VR/AR

The system detects touch with over 96% accuracy, so it gets it right almost every time. It can tell whether you are pressing down, lifting off, or dragging your finger across your skin, and it is smart enough to distinguish a light touch from a firm one, which could serve the same role as a right-click on a computer mouse.
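
As an illustration of how such per-frame predictions might become interface events, here is a small, hypothetical state machine in Python. The event names and the force cutoff are assumptions for the sketch, not part of EgoTouch itself; the idea is simply that consecutive touch frames become down/drag/lift-off events, and a firm press maps to a "right-click"-style action.

# Hypothetical sketch: convert per-frame (touching, force, position) predictions
# into touch-down / drag / lift-off events, treating firm presses like a right-click.
FIRM_FORCE = 0.7  # assumed cutoff separating a light touch from a firm press

def touch_events(frames):
    """frames: iterable of (touching: bool, force: float, position: (x, y))."""
    was_touching = False
    for touching, force, position in frames:
        if touching and not was_touching:
            kind = "firm_down" if force >= FIRM_FORCE else "light_down"
            yield (kind, position)        # analogous to right-click vs. left-click
        elif touching and was_touching:
            yield ("drag", position)      # finger sliding across the skin
        elif not touching and was_touching:
            yield ("lift_off", position)
        was_touching = touching

# Example: a light press, a short drag, then lifting off.
stream = [(True, 0.3, (10, 10)), (True, 0.3, (12, 10)), (False, 0.0, (12, 10))]
for event in touch_events(stream):
    print(event)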

EgoTouch is simple and broadly applicable. It works well across different skin types and conditions, though it’s less effective over bony parts like knuckles because the skin there doesn’t deform much when touched. The researchers are also looking into making EgoTouch work in the dark by using night vision technology.

With EgoTouch, the future of interacting with VR/AR might just be at our fingertips, quite literally, as we might soon control virtual environments simply by touching our own skin.

