Jedi powers come to life with Force Push VR

Device-free hand gestures initiate responses in the VR user interface.

Star Wars franchise fans must wait at least one more year before the next big-screen instalment of the saga. Episode IX, which does not yet have a title, is the final part of the Star Wars space epic that began back in 1977, and it is expected to release no earlier than December 2019.

While waiting with great anticipation for the release, Star Wars loyalists and new fans have a reason to rejoice. A Virginia Tech research team has come up with a cutting-edge virtual reality method, aptly titled Force Push, offered as a small-screen experience.

Force Push enables its users to move objects at a considerable distance, in a fashion reminiscent of the Star Wars character Yoda. VR-based remote object manipulation is used to ensure a smooth and responsive experience.

Run Yu, a Ph.D. student in the Department of Computer Science and the Institute for Creativity, Arts, and Technology, explained the experience. According to him, users push objects in intended directions, just like the Jedi master characters from the Star Wars movies. Objects well beyond touching distance can be pulled and pushed via hand gestures. Yu is the first author of a detailed research article on the technique in Frontiers in ICT.

The simplicity is remarkable: finely controlled hand gestures can pull, push, and spin objects. Users make naturally flowing object-manipulation gestures with bare hands, which are mapped into the VR setting.

Doug Bowman, director of the Center for Human Computer Interaction and Frank J. Maher Professor of Computer Science, expressed the intention to perform the VR movements without a device, using just hands and gestures.

The overall Force Push experience is more physical and refined than that of conventional VR hand controllers. Users will find the concept easy to grasp, as the technology responds intuitively to the speed and motion of hand gestures to speed objects up or slow them down.

The capability to read and respond to fine hand movements is founded on innovative physics-based algorithms. Dynamically mapping features of the input gesture to a physics-driven simulation kept the interface manageable in most cases: a slight finger movement can produce large movements in heavy objects. Because the technique is grounded in physics, the results feel realistically credible.
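The idea of dynamically mapping gesture features to a physics simulation can be illustrated with a minimal sketch. This is not the authors' code: the object model, the 90 Hz frame rate, and the squared speed-to-force curve are all assumptions chosen to show how a faster gesture could apply a disproportionately larger push to a remote object.

```python
# Illustrative sketch, not the Force Push implementation.
# Assumption: hand speed is mapped to a push force, which then
# drives an ordinary physics integration step on the target object.

from dataclasses import dataclass

@dataclass
class VirtualObject:
    mass: float            # kg
    position: float        # metres along the push axis
    velocity: float = 0.0  # m/s

def gesture_force(hand_speed: float, gain: float = 4.0) -> float:
    """Map tracked hand speed (m/s) to a push force (N).

    A quadratic curve (an assumption) makes a slight finger flick
    nudge the object while a fast push launches it.
    """
    return gain * hand_speed ** 2

def step(obj: VirtualObject, hand_speed: float, dt: float = 1 / 90) -> None:
    """Advance the simulation one frame (90 Hz, a typical VR refresh rate)."""
    force = gesture_force(hand_speed)
    acceleration = force / obj.mass   # Newton's second law, F = ma
    obj.velocity += acceleration * dt
    obj.position += obj.velocity * dt

cube = VirtualObject(mass=2.0, position=0.0)
for _ in range(90):                   # one second of a steady push gesture
    step(cube, hand_speed=1.5)
```

Because the object obeys ordinary Newtonian dynamics, it coasts and decelerates believably once the gesture stops, which is what lends the interaction its physical credibility.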

The team used an Oculus Rift CV1 as the display for its user experiments and a Leap Motion for hand tracking. The virtual reality environment was built in the Unity game engine, whose native physics engine generated the physics-driven simulation seen in the Force Push user interface.

Bowman revealed that continual tweaks are being made to further enhance the experience.
