||A VR/AR Interface Design based on Unaligned Hand Position and Gaze Direction
||Joon Young Ahn;Sang Yoon Han;Junho Jo;Sang Hwa Lee;Nam Ik Cho
||Natural user interface (NUI); Eye gaze tracking; Hand gesture recognition
||In conventional natural user interface (NUI) schemes that use hand gestures without a handheld sensor device, users usually need to hold up their hands to align them with the gaze direction where the (virtual) object is displayed. Users must also keep their hands within this designated space for extended periods, which may cause fatigue and degrade gesture-recognition performance.
Hence, we propose a new interface scheme that alleviates this problem by decoupling the hand position from the gaze direction, so that users can even rest their hands on a desk. This requires more robust hand detection and an effective gesture recognition scheme, because the hands can be placed anywhere in the space and the hand motion is not calibrated to the virtual space. For robust hand detection, we propose an algorithm based on a new hand pose model, implemented using a wide-angle camera. For efficient gaze calibration and tracking, we use a three-dimensional eyeball model from our previous work. Because the gaze direction and the hand positions/motions are not aligned, we design a finite state machine for robust gesture flow. Based on this scheme, we define six interactions for the target object: click, double click, drag and drop, zoom in, zoom out, and return home. More complicated interactions can be built from these six basic ones. The proposed interface scheme is implemented on a handheld mobile device and is shown to work robustly over a wide range of hand positions.
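To make the gesture-flow idea concrete, the following is a minimal sketch of a finite state machine that maps recognized hand-gesture events to the six interactions named in the abstract. The state names, event names (`pinch_down`, `spread`, etc.), and transition table are illustrative assumptions, not the paper's actual design.

```python
from enum import Enum, auto

class S(Enum):
    IDLE = auto()     # no gesture in progress
    PINCH = auto()    # pinch held; will resolve to a click or a drag
    DRAG = auto()     # pinch held while the hand moves
    CLICKED = auto()  # one click emitted; a second pinch means double click
    PINCH2 = auto()   # second pinch held

# (state, event) -> (next state, emitted action or None)
# All event names are hypothetical outputs of a gesture recognizer.
T = {
    (S.IDLE,    "pinch_down"): (S.PINCH,   None),
    (S.PINCH,   "pinch_up"):   (S.CLICKED, "click"),
    (S.PINCH,   "move"):       (S.DRAG,    None),
    (S.DRAG,    "move"):       (S.DRAG,    None),
    (S.DRAG,    "pinch_up"):   (S.IDLE,    "drag_and_drop"),
    (S.CLICKED, "pinch_down"): (S.PINCH2,  None),
    (S.CLICKED, "timeout"):    (S.IDLE,    None),
    (S.PINCH2,  "pinch_up"):   (S.IDLE,    "double_click"),
    (S.IDLE,    "spread"):     (S.IDLE,    "zoom_in"),
    (S.IDLE,    "squeeze"):    (S.IDLE,    "zoom_out"),
    (S.IDLE,    "open_palm"):  (S.IDLE,    "return_home"),
}

def run(events, state=S.IDLE):
    """Feed a stream of gesture events; collect the triggered actions."""
    actions = []
    for ev in events:
        state, action = T.get((state, ev), (state, None))  # ignore unknown events
        if action:
            actions.append(action)
    return actions
```

Because the FSM only consumes discrete gesture events, the same gesture stream yields the same interaction regardless of where the hands are relative to the gaze direction, which is the point of decoupling the two.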