
Asteroid Focuses on Using AR Software and Hardware to Reinvent the Human-Computer Interface

Input technology is evolving rapidly alongside the latest innovations in smartphones, computers and immersive technology devices. Headsets like the Magic Leap One and HoloLens are quickly gaining popularity.

Asteroid, a startup, is gearing up to reinvent the entire gamut of human-machine interfaces (HMI), offering five natural input methods. The brand recently launched the 3D Human-Machine Interface Starter Kit with a crowdfunding campaign. It is a development kit for creating applications driven by hand gestures, eye movements, brain signals and emotions.

Saku Panditharatne, CEO and founder of Asteroid, explained the development in a blog post. He stated that the technology can be likened to using a 3D modelling program with an eye tracker. The eye tracker can determine which part of the model is being focussed upon, and auto-zooming is also possible.
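To give a rough sense of that idea, the Python sketch below maps a gaze point to the nearest region of a model and zooms in the longer the gaze dwells there. The region names and helper functions are illustrative assumptions, not Asteroid's actual software.

```python
# Illustrative sketch only: how a gaze point might drive auto-zoom in a 3D
# modelling tool. The region layout and helpers are hypothetical.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    cx: float  # region centre in normalised screen coordinates (0..1)
    cy: float

REGIONS = [Region("hull", 0.3, 0.4), Region("wing", 0.7, 0.5), Region("engine", 0.5, 0.8)]

def focused_region(gaze_x: float, gaze_y: float) -> Region:
    """Return the model region closest to the current gaze point."""
    return min(REGIONS, key=lambda r: (r.cx - gaze_x) ** 2 + (r.cy - gaze_y) ** 2)

def auto_zoom(zoom: float, dwell_seconds: float, rate: float = 1.5) -> float:
    """Zoom in gradually the longer the gaze dwells on one region."""
    return zoom * (rate ** dwell_seconds)

# Example: a gaze fixated near the top right of the screen for half a second.
region = focused_region(0.68, 0.52)
print(region.name, auto_zoom(1.0, 0.5))
```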

He further explained that the interface can use trained AI (artificial intelligence) to interpret human eye fixation behaviour and predict which menu the user wants to open, then let the user browse that menu for the desired action. If the computer interprets the input incorrectly, the user can cancel the action by pressing a button.
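The predict-then-cancel flow might look something like the sketch below, where predict_menu() stands in for whatever trained model is actually used; the menu names and the cancel handling are assumptions made for the example.

```python
# Hypothetical sketch of the predict-then-cancel flow described above.
from collections import Counter
from typing import List, Tuple

def predict_menu(fixations: List[Tuple[str, float]]) -> str:
    """Guess the menu a user wants from (menu_region, dwell_time) fixations,
    weighting each candidate menu by total dwell time."""
    dwell = Counter()
    for region, seconds in fixations:
        dwell[region] += seconds
    return dwell.most_common(1)[0][0]

def open_menu_with_cancel(fixations, cancel_pressed: bool) -> str:
    guess = predict_menu(fixations)
    if cancel_pressed:          # a hardware button overrides a wrong guess
        return "no menu opened (cancelled by user)"
    return f"opened '{guess}' menu"

print(open_menu_with_cancel([("file", 0.4), ("edit", 0.1), ("file", 0.3)], cancel_pressed=False))
```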

With a $450 pledge, backers will receive the Focus eye tracker, Glyph gesture sensor, Orbit hand controller, Continuum linear scrubber and the Axon interface. Each component is powered by a 9V battery pack and connects to tablets, PCs and smartphones over Bluetooth.


Focus consists of a pair of high-speed HD USB cameras, plastic frames, a battery, a Bluetooth module and a Raspberry Pi-based board. The eye tracker is available through a $200 pledge, and can determine user attention and intent from eye movements.
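As a very rough illustration of what such a camera-based tracker does, the sketch below locates the dark pupil blob in a single frame using OpenCV. It assumes an OpenCV-readable USB camera and is not Asteroid's tracking pipeline, only the general idea.

```python
# Minimal pupil-finding sketch, assuming an OpenCV-readable USB eye camera.
import cv2

cap = cv2.VideoCapture(0)            # first USB camera
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is usually the darkest region of the eye image.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(c)
        print(f"approximate pupil centre: ({x:.0f}, {y:.0f}), radius {radius:.0f}px")
cap.release()
```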


Panditharatne remarked that, compared with a mouse, which registers roughly one click every few seconds, the eye-tracking system can infer small pieces of information about human intent and attention many times per second.

Axon contains six electrodes that can be mounted on the head, all connected to an Arduino board. The device processes brain signals to interpret user intent and emotions.
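A hedged sketch of reading six electrode channels from the Arduino over a serial link is shown below; the port name, baud rate and line format (six comma-separated integers per sample) are assumptions, not a documented protocol.

```python
# Hedged sketch of reading Axon-style electrode channels over serial (pyserial).
import serial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    line = port.readline().decode("ascii", errors="ignore").strip()
    if line:
        channels = [int(v) for v in line.split(",")][:6]
        # A real system would filter and classify these signals; here we only
        # flag unusually strong activity on any channel as a crude "intent" cue.
        active = [i for i, v in enumerate(channels) if abs(v - 512) > 200]
        print("channels:", channels, "active electrodes:", active)
```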


Glyph, Orbit and Continuum, the other three components, are all connected to an Arduino board. Glyph uses an electric field sensor to detect gestures, Orbit is an acrylic wand controller offering several degrees of freedom, and Continuum uses a resistive controller for fine touch-based scrubbing.
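One plausible way to handle three such different inputs is to normalise their readings into a single event stream, as in the sketch below; the message shapes are assumptions made for the example, not Asteroid's wire format.

```python
# Illustrative sketch of normalising readings from the three Arduino-based inputs.
def normalise(device: str, reading: dict) -> dict:
    if device == "glyph":       # electric-field gesture sensor
        return {"type": "gesture", "name": reading["gesture"]}
    if device == "orbit":       # acrylic wand with several degrees of freedom
        return {"type": "pose", "position": reading["xyz"], "rotation": reading["rpy"]}
    if device == "continuum":   # resistive strip for fine scrubbing
        return {"type": "scrub", "value": reading["position"] / 1023.0}
    raise ValueError(f"unknown device: {device}")

print(normalise("continuum", {"position": 512}))
print(normalise("glyph", {"gesture": "swipe_left"}))
```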

The accompanying Mac OS software enables developers to create nodes that associate application inputs with actions. A user's eye movements can be used to move the cursor within three-dimensional space. The created nodes can also be accessed from a mobile application, which lets developers generate ARKit applications.
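A toy version of the node idea might look like the sketch below, which assumes a node is simply a named mapping from an input event type to an action. Asteroid's actual node editor and ARKit export are not described in that detail, so this is only conceptual.

```python
# Conceptual sketch: nodes map input event types to actions.
from typing import Callable, Dict

class NodeGraph:
    def __init__(self) -> None:
        self.nodes: Dict[str, Callable[[dict], None]] = {}

    def add_node(self, input_type: str, action: Callable[[dict], None]) -> None:
        self.nodes[input_type] = action

    def dispatch(self, event: dict) -> None:
        handler = self.nodes.get(event["type"])
        if handler:
            handler(event)

graph = NodeGraph()
# Eye movements reposition a cursor in 3D space; a gesture triggers a tool.
graph.add_node("gaze", lambda e: print("move 3D cursor to", e["point"]))
graph.add_node("gesture", lambda e: print("run tool for", e["name"]))

graph.dispatch({"type": "gaze", "point": (0.2, 0.4, 1.5)})
graph.dispatch({"type": "gesture", "name": "swipe_left"})
```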

Panditharatne explained that inputting thoughts into a computer with this technology is hundreds of times faster than with traditional input devices, and that 3D models will take a fraction of the usual time to create. Further developments are expected in the near future.


