XR Experiences Can Be Improved with a Wireless Tracking Gadget
Engineers at the University of California, San Diego, have created a new technology that could make extended reality (XR) experiences smoother and more seamless. The technology is an asset localisation system that uses wireless signals to track physical objects with centimetre-level precision in real time and creates a virtual representation of those assets. It could be applied to everything from improving worker safety to enhancing virtual gaming experiences.
The team, led by Dinesh Bharadia of the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering, presented the work at the ACM Conference on Embedded Networked Sensor Systems (SenSys 2023) in Istanbul, Turkey.
Current localisation techniques have serious drawbacks. Many XR applications localise objects using cameras, whether on AR glasses, VR devices, or phones, noted Aditya Arun, one of the paper's first authors and an electrical and computer engineering Ph.D. student in Bharadia's lab.
However, Arun said, these camera-based techniques are inaccurate in highly dynamic settings with visual obstructions, rapidly changing surroundings, or low illumination. Among wireless alternatives, ultra-wideband (UWB) technology requires complicated setup and configuration, while WiFi and Bluetooth Low Energy (BLE) often fail to provide the necessary precision.
The new asset localisation system, developed by Bharadia's group at UC San Diego with Shunsuke Saruwatari at Osaka University in Japan, removes these restrictions. It offers precise, real-time localisation of objects with centimetre-level accuracy, even in dynamic and dimly lit environments. The technology is also packaged in a compact, readily deployable module about one metre long that can be easily integrated into consumer electronics such as sound bars or TVs.
The researchers built their device around wireless signals in the sub-6 GHz band. Unlike camera-based techniques, Arun said, these signals are far less affected by visual obstructions and can still operate in non-line-of-sight scenarios.
The system uses wireless signals to locate battery-powered UWB tags affixed to objects. It consists of two primary components. The first is a UWB tag that transmits a beacon signal used for localisation. The second is a localisation module containing six time- and phase-synchronised UWB receivers that pick up the beacon. Each receiver captures the signal with a slightly different phase and timing, depending on the path it travels. A novel algorithm fuses these small differences to accurately determine the tag's position in 2D space.
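The paper's exact fusion algorithm is not spelled out here; as a rough illustration of the general idea, the sketch below estimates a tag's 2D position from time-difference-of-arrival measurements across several synchronised receivers using a nonlinear least-squares fit. The receiver layout, function names, and values are all hypothetical, not the authors' implementation.

```python
# Illustrative sketch only: 2D tag localisation from time-difference-of-arrival
# (TDoA) measurements at several time-synchronised receivers, solved by
# nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # speed of light, m/s

# Hypothetical layout: six receivers spread along a ~1 m bar (e.g. a sound bar).
receivers = np.array([[x, 0.0] for x in np.linspace(-0.5, 0.5, 6)])  # (6, 2), metres

def tdoa_residuals(pos, rx, tdoa):
    """Difference between predicted and measured TDoAs (relative to receiver 0)."""
    dists = np.linalg.norm(rx - pos, axis=1)       # tag-to-receiver distances
    predicted = (dists[1:] - dists[0]) / C         # predicted time differences
    return predicted - tdoa

def locate_tag(rx, tdoa, initial_guess=(0.0, 1.0)):
    """Solve for the 2D tag position that best explains the measured TDoAs."""
    result = least_squares(tdoa_residuals, x0=np.asarray(initial_guess),
                           args=(rx, np.asarray(tdoa)))
    return result.x

# Example: simulate a tag at (0.3, 1.2) m and recover its position.
true_pos = np.array([0.3, 1.2])
true_dists = np.linalg.norm(receivers - true_pos, axis=1)
measured_tdoa = (true_dists[1:] - true_dists[0]) / C   # ideal, noise-free TDoAs
print(locate_tag(receivers, measured_tdoa))            # approximately [0.3, 1.2]
```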
In experiments, the researchers used their technology to play a life-size game of chess with everyday objects. They turned mugs into virtual chess pieces by fitting them with off-the-shelf UWB tags. As the mugs were moved around a table, the system tracked their motion in real time with centimetre-level precision.
Arun said the team found that, in dynamic conditions, their approach outperforms current state-of-the-art localisation technologies by at least eight fold at the 90th percentile of localisation error.
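For readers unfamiliar with the metric, the 90th-percentile error is the bound that 90% of position estimates fall within. A minimal sketch of how such a figure might be computed from tracking logs is shown below; the data and function name are made up for illustration.

```python
# Illustrative only: the 90th-percentile localisation error over a set of
# position estimates, i.e. the error bound met by 90% of measurements.
import numpy as np

def percentile_error(estimates, ground_truth, q=90):
    """Return the q-th percentile of Euclidean localisation error (metres)."""
    errors = np.linalg.norm(np.asarray(estimates) - np.asarray(ground_truth), axis=1)
    return np.percentile(errors, q)

# Example with made-up numbers: errors of a few centimetres.
est = [[0.31, 1.21], [0.29, 1.18], [0.33, 1.22], [0.30, 1.25]]
truth = [[0.30, 1.20]] * 4
print(percentile_error(est, truth))  # metres
```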
The team is currently working to improve the system. The next stages include adding antennas along the vertical axis to provide complete 3D localisation, lowering the number of receivers to enhance energy efficiency, and strengthening the PCB architecture to make the system more resilient.