EgoTouch Technology Transforms VR Users’ Palms into Touchscreen Interfaces
Handheld controllers and floating in-air menus can be cumbersome and hard to use in virtual reality (VR). EgoTouch, a new technology developed at Carnegie Mellon University, offers an innovative solution by turning the user’s palm into an interface for streamlined, on-the-go interactions.
Currently in its prototype stage, EgoTouch was created by PhD student Vimal Mollyn and his team at Carnegie Mellon’s Human-Computer Interaction Institute. Their work builds on the concept of on-body interfaces, which have shown considerable benefits in speed, accuracy, and ergonomics when compared to the commonly used “in-air” interfaces projected within VR environments.
Existing on-body interface solutions generally rely on specialised depth-sensing cameras to track a user’s hand and finger positions. However, EgoTouch takes a more practical approach by utilising the RGB optical camera that already exists in most VR headsets. When the user’s finger touches their palm, the camera registers the resulting shadows and skin deformations, and the system interprets these visual cues to work out which part of the virtual palm interface the user selected.
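To make that idea concrete, the sketch below shows one hypothetical way a headset camera frame could be mapped to a palm touch with a small neural network: the cropped palm image goes in, and a touch/no-touch score plus a palm-relative (x, y) location come out. The architecture, layer sizes, and names here are illustrative assumptions, not the published EgoTouch model.

```python
# Hypothetical sketch: predict whether the palm is being touched, and where,
# from a single cropped RGB headset frame. Architecture and names are
# illustrative assumptions, not EgoTouch's actual implementation.
import torch
import torch.nn as nn

class PalmTouchNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Small convolutional encoder over the cropped palm region.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two heads: is the palm being touched, and if so, where on it?
        self.touch_head = nn.Linear(64, 1)
        self.location_head = nn.Linear(64, 2)

    def forward(self, palm_crop):                 # palm_crop: (B, 3, H, W)
        features = self.encoder(palm_crop).flatten(1)
        touch_logit = self.touch_head(features)              # touch vs. no touch
        location = self.location_head(features).sigmoid()    # (x, y) in [0, 1]
        return touch_logit, location

model = PalmTouchNet()
frame = torch.rand(1, 3, 128, 128)   # dummy cropped palm image
logit, xy = model(frame)
is_touching = torch.sigmoid(logit) > 0.5
```

The key design point is that everything is inferred from ordinary RGB pixels, with shadows and skin deformation serving as the only touch signal, so no extra hardware beyond the headset's existing camera is needed.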
To train the system’s algorithm, the research team instructed volunteers to press their index finger onto various points on their palms while wearing a head-mounted RGB camera. Beneath each participant’s finger, an unobtrusive touch sensor gathered additional data. This combination allowed the algorithm to match camera images with touch sensor inputs, creating an accurate mapping of touch locations, pressures, and durations. The data was recorded across diverse lighting conditions and with participants of varying skin tones and hair densities, ensuring that the algorithm could reliably function across a wide range of users.
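As a rough illustration of that data-collection step, the snippet below pairs timestamped camera frames with readings from a fingertip touch sensor to produce labelled training examples. The file layout, field names, and 20 ms matching tolerance are assumptions made for the sketch, not details from the study.

```python
# Hypothetical sketch: align headset camera frames with ground-truth readings
# from a fingertip touch sensor to build supervised (frame, label) examples.
# Field names, log format, and tolerances are illustrative assumptions.
import bisect
import json

def load_sensor_log(path):
    """Each line: {"t": seconds, "x": 0-1, "y": 0-1, "pressure": force}."""
    with open(path) as f:
        return [json.loads(line) for line in f]

def build_training_pairs(frame_timestamps, sensor_log, max_offset=0.02):
    """Match each camera frame to the nearest-in-time sensor reading."""
    sensor_times = [s["t"] for s in sensor_log]
    pairs = []
    for frame_idx, t in enumerate(frame_timestamps):
        i = bisect.bisect_left(sensor_times, t)
        candidates = [c for c in (i - 1, i) if 0 <= c < len(sensor_log)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(sensor_times[c] - t))
        if abs(sensor_times[best] - t) <= max_offset:   # within 20 ms
            s = sensor_log[best]
            label = {"touching": s["pressure"] > 0.0,
                     "location": (s["x"], s["y"]),
                     "pressure": s["pressure"]}
            pairs.append((frame_idx, label))
    return pairs
```

In this kind of setup, the sensor supplies the ground truth for where, how hard, and how long the finger pressed, while the camera frames become the model's only input at inference time.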
In testing, EgoTouch achieved a 96% accuracy rate for touch detection on the palm, with a false positive rate of only 5%. It could also distinguish between soft and hard presses 98% of the time and recognise different touch motions, such as pressing down, lifting up, and pushing.
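For readers who want to evaluate a similar system on their own data, a minimal computation of the two headline metrics might look like the following; the figures above come from the EgoTouch study itself, and this sketch only shows how such numbers are typically derived.

```python
# Hypothetical sketch: touch-detection accuracy and false-positive rate from
# binary predictions, plus accuracy for a soft/hard force classifier.
def detection_metrics(predicted, actual):
    """predicted/actual: lists of booleans (True = touch detected/present)."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    false_positives = sum(p and not a for p, a in zip(predicted, actual))
    negatives = sum(not a for a in actual)
    return {
        "accuracy": correct / len(actual),
        "false_positive_rate": false_positives / max(negatives, 1),
    }

def force_accuracy(predicted_force, actual_force):
    """Labels such as "soft" / "hard" per detected touch event."""
    correct = sum(p == a for p, a in zip(predicted_force, actual_force))
    return correct / len(actual_force)

# Example usage with toy data
print(detection_metrics([True, False, True, True], [True, False, False, True]))
```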
Mollyn highlights the accessibility of this system, stating that it only requires the standard camera included in all VR headsets. According to Mollyn, the EgoTouch models are calibration-free and functional by default, making this a significant advancement toward realising on-skin interfaces in VR.