Touch is a vital sense to consider if virtual reality is to eventually replicate our existing reality. Haptics research explores technology that can incorporate a sense of touch into immersive and virtual media. Haptics has long been a prominent challenge for AR and VR providers such as Meta.
Researchers from Carnegie Mellon University’s Future Interfaces Group have created a mouth haptics add-on that can be mounted to the bottom of a virtual reality headset. They will present the technology this week at the CHI Conference on Human Factors in Computing Systems in New Orleans. With the device attached, VR users may discover what it is like to sip water from a fountain or even feel a spider walk across their cheek.
The technology is conceptually simple. An array of small ultrasonic speakers is arranged on a circuit board attached to the bottom of a virtual reality headset. These transducers generate sound waves at frequencies the human auditory system cannot detect. According to Chris Harrison, director of the Future Interfaces Group, the speakers can be triggered in particular patterns so that their waves focus at a single point, much like light through a magnifying lens. The focused waves produce a sensation of pressure, he explained, and modulating that force over time creates vibration. When these effects are combined creatively and accompanied by visuals and sound, he said, they add up to a full-fledged immersive experience.
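To make the focusing idea concrete, here is a minimal sketch in Python of how a phased array steers ultrasound to a focal point by delaying each transducer's signal. The array geometry, carrier frequency, and function names are illustrative assumptions, not the CMU team's implementation.

```python
# A minimal sketch of phased-array focusing (not the CMU team's code):
# each transducer's signal is phase-delayed so all waves arrive at the
# focal point in phase and constructively interfere there.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
CARRIER_HZ = 40_000.0   # 40 kHz, a common frequency for airborne ultrasound

def phase_delays(positions, focal_point):
    """Return a phase offset (radians) per transducer so that emitted
    waves converge in phase at focal_point."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    # Delay each element relative to the farthest one so every
    # wavefront reaches the focus at the same instant.
    time_delays = (distances.max() - distances) / SPEED_OF_SOUND
    return 2 * np.pi * CARRIER_HZ * time_delays

# Example: a 4x4 grid of transducers spaced 1 cm apart, focusing on a
# point 3 cm in front of the array's plane.
xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
focus = np.array([0.015, 0.015, 0.03])
print(phase_delays(positions, focus))
```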
According to Vivian Shen, a Ph.D. candidate at Carnegie Mellon University and the paper’s lead author, the team chose ultrasonic haptics because its effect is highly localised and expressive. The mouth is one of the most sensitive parts of the human body, which makes it an excellent target for these sensations. The researchers can tune several parameters of each effect, such as its intensity, its location, its duration, how it varies over time, and whether it is static or modulated.
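As a rough illustration, those tunable parameters might be represented in software along the lines of the hedged sketch below; the class and field names are hypothetical, not the team's actual interface.

```python
# A hypothetical representation of the tunable effect parameters
# described above; the class and field names are assumptions, not the
# team's actual software interface.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class HapticEffect:
    intensity: float                       # 0.0-1.0 strength of focused pressure
    location: Tuple[float, float, float]   # focal point on the mouth, metres
    duration_ms: int                       # how long the effect lasts
    modulated: bool                        # False = static pressure, True = vibration
    modulation_hz: float = 0.0             # vibration rate when modulated

# Example: a brief, modulated "raindrop" effect on the lower lip.
raindrop = HapticEffect(intensity=0.8,
                        location=(0.0, -0.01, 0.03),
                        duration_ms=120,
                        modulated=True,
                        modulation_hz=150.0)
```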
The researchers experimented with numerous combinations of these parameters to generate fundamental effects, and the most robust and interesting ones were used to build the animation library. This library contains the basic haptic instructions for various types of motion. According to Shen, it includes swipes in the x, y, and z directions, because any form of motion on the mouth is a compelling effect that can easily complement many other kinds of virtual reality animation. Shen emphasised that because the ultrasound focal point is highly localised, taps and constant vibrations are straightforward to apply, and noted that the team can also modify spacing, timing, and modulation frequencies.
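A library of such primitives could look something like the following sketch, which expands each motion type into a timed sequence of focal points and modulation settings. The swipe, tap, and vibration functions are assumptions for illustration only.

```python
# A hedged sketch of haptic animation primitives: each function expands
# a motion type into a timed sequence of (focal_point, dwell_ms,
# modulation_hz) frames. Names and units are illustrative assumptions.
import numpy as np

def swipe(start, end, duration_ms, modulation_hz=200.0, steps=30):
    """Move the focal point linearly from start to end so the user
    feels motion across the mouth."""
    t = np.linspace(0.0, 1.0, steps)
    points = np.outer(1 - t, start) + np.outer(t, end)  # shape (steps, 3)
    dwell = duration_ms / steps
    return [(tuple(p), dwell, modulation_hz) for p in points]

def tap(point, duration_ms=40, modulation_hz=250.0):
    """A single short burst of focused pressure at one spot."""
    return [(tuple(point), duration_ms, modulation_hz)]

def constant_vibration(point, duration_ms, modulation_hz=200.0):
    """Hold the focus at one spot; amplitude-modulating the ultrasonic
    carrier is felt as a steady buzz."""
    return [(tuple(point), duration_ms, modulation_hz)]

# Example: a 300 ms left-to-right swipe across the lips.
frames = swipe(np.array([-0.02, 0.0, 0.03]),
               np.array([0.02, 0.0, 0.03]),
               duration_ms=300)
```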
The prototypes were built on open-source architecture with a bespoke printed circuit board (PCB). The device’s first iteration resembled a typewriter keyboard, with small lights aimed toward the participant’s lips. The team also offered an early glimpse of a newer, in-development iteration of the device with smaller panels and a more sophisticated version of the transducers.