

Demonstration

A Hands-free Natural User Interface (NUI) for AR/VR Head-Mounted Displays Exploiting Wearer’s Facial Gestures

Jaekwang Cha · Shiho Kim · Jinhyuk Kim

Room 510 ABCD #D7

Abstract:

This demonstration presents interactions between a user wearing a head-mounted display (HMD) and an augmented reality (AR) environment using our state-of-the-art hands-free user interface (UI) device, which captures the user's facial gestures as input signals. AR systems used in complex settings, such as surgery or work in hazardous environments, require a hands-free UI because users must keep using their hands during operation. Moreover, a hands-free UI improves the user experience (UX) not only in such demanding settings but also in everyday use of AR and virtual reality (VR). Despite the demand for input devices suited to the HMD environment, no interface has yet become as well established as the keyboard and mouse for the PC or the touch interface for the smartphone. The objective of our demo is to give attendees a hands-free AR UI experience and to introduce the benefits of a hands-free interface when using an AR HMD. In the demo, attendees deliver commands to the system through wink gestures instead of today's common HMD input devices, such as hand-held remote controllers or HMD buttons, which interfere with immersion. A wink acts like a mouse click in the presented AR world. The user's facial gestures are automatically mapped to commands through deep neural networks. The proposed UI system is well suited to developing various natural user interfaces (NUIs) for AR/VR HMD environments because its sensing mechanism does not disturb the user and keeps the user's hands free.
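As a rough illustration of the pipeline described in the abstract (not the authors' implementation), the sketch below classifies a window of facial-gesture sensor samples with a small neural network and maps the predicted gesture to a UI command, with a wink acting as a click. The sensor dimensions, gesture labels, network size, and command names are all illustrative assumptions.

```python
from typing import Optional

import torch
import torch.nn as nn

# Assumed gesture label set and gesture-to-command mapping (illustrative only).
GESTURES = ["neutral", "wink_left", "wink_right"]
COMMANDS = {"wink_left": "CLICK", "wink_right": "BACK"}


class GestureClassifier(nn.Module):
    """Small MLP over a flattened window of facial-gesture sensor readings."""

    def __init__(self, window: int = 32, channels: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(window * channels, 64),
            nn.ReLU(),
            nn.Linear(64, len(GESTURES)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def gesture_to_command(model: nn.Module, window: torch.Tensor) -> Optional[str]:
    """Classify one sensor window and return the mapped UI command, if any."""
    with torch.no_grad():
        label = GESTURES[model(window.unsqueeze(0)).argmax(dim=1).item()]
    # Gestures without a bound command (e.g. "neutral") produce no UI event.
    return COMMANDS.get(label)


if __name__ == "__main__":
    model = GestureClassifier()
    fake_window = torch.randn(32, 4)  # stand-in for real sensor data
    print(gesture_to_command(model, fake_window))
```

In a deployed system the classifier would be trained on labeled gesture recordings and the emitted command would be dispatched to the AR application's event loop; this sketch only shows the gesture-to-command mapping step.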
