Pinpointing: Precise head- and eye-based target selection for augmented reality

M Kytö, B Ens, T Piumsomboon, GA Lee… - Proceedings of the …, 2018 - dl.acm.org
Head and eye movement can be leveraged to improve the user's interaction repertoire for
wearable displays. Head movements are deliberate and accurate, and provide the current …
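The two-stage idea in this snippet (coarse, deliberate head pointing refined by eye gaze) can be sketched as follows; the cone threshold, vector conventions, and function name are illustrative assumptions, not the paper's implementation:

import numpy as np

def select_target(head_dir, gaze_dir, targets, cone_deg=5.0):
    # Stage 1: keep candidates inside a cone around the head-pointing ray.
    # Stage 2: pick the candidate best aligned with the eye-gaze ray.
    head_dir = np.asarray(head_dir, float)
    head_dir = head_dir / np.linalg.norm(head_dir)
    gaze_dir = np.asarray(gaze_dir, float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_align = None, -2.0  # below any possible cosine similarity
    for target in targets:
        d = np.asarray(target, float)
        d = d / np.linalg.norm(d)
        angle = np.degrees(np.arccos(np.clip(head_dir @ d, -1.0, 1.0)))
        align = gaze_dir @ d
        if angle <= cone_deg and align > best_align:
            best, best_align = target, align
    return best  # None if nothing falls inside the head cone

Filtering by the head cone first reflects the snippet's premise that head movements are deliberate and accurate, while eye gaze then resolves among nearby candidates.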

Eye&Head: Synergetic eye and head movement for gaze pointing and selection

L Sidenmark, H Gellersen - Proceedings of the 32nd annual ACM …, 2019 - dl.acm.org
Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but
existing approaches to gaze pointing are based on eye-tracking in abstraction from head …
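Treating eye and head synergistically starts with composing the head pose with the eye-in-head direction to obtain a world-space gaze ray, rather than tracking the eye in isolation. A minimal sketch, with names and frame conventions assumed for illustration:

import numpy as np

def world_gaze_ray(head_rot, head_pos, eye_in_head):
    # head_rot: 3x3 head orientation matrix; head_pos: eye origin in world
    # coordinates; eye_in_head: gaze direction expressed in the head frame.
    direction = np.asarray(head_rot, float) @ np.asarray(eye_in_head, float)
    return np.asarray(head_pos, float), direction / np.linalg.norm(direction)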

Hands-free human–robot interaction using multimodal gestures and deep learning in wearable mixed reality

KB Park, SH Choi, JY Lee, Y Ghasemi… - IEEE …, 2021 - ieeexplore.ieee.org
This study proposes a novel hands-free interaction method using multimodal gestures, such as eye gazing and head gestures, together with deep learning for human-robot interaction (HRI) in …

HeadGesture: Hands-free input approach leveraging head movements for HMD devices

Y Yan, C Yu, X Yi, Y Shi - Proceedings of the ACM on Interactive, Mobile …, 2018 - dl.acm.org
We propose HeadGesture, a hands-free input approach to interact with Head Mounted
Display (HMD) devices. Using HeadGesture, users do not need to raise their arms to …
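As a rough illustration of this kind of input, a nod can be detected from a head-pitch trace as a dip that returns to baseline. The thresholds and window below are illustrative assumptions, not HeadGesture's recognizer:

import numpy as np

def detect_nod(pitch_deg, rate_hz=60, amp_deg=8.0, max_dur_s=0.8):
    # Scan for a down-then-back excursion: the head dips at least amp_deg
    # below the window's starting pitch and ends back near that baseline.
    pitch = np.asarray(pitch_deg, float)
    window = int(max_dur_s * rate_hz)
    for i in range(max(len(pitch) - window, 0)):
        seg = pitch[i:i + window]
        if seg[0] - seg.min() >= amp_deg and abs(seg[-1] - seg[0]) <= amp_deg / 2:
            return True
    return False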

Understanding gesture input articulation with upper-body wearables for users with upper-body motor impairments

RD Vatavu, OC Ungurean - Proceedings of the 2022 CHI Conference on …, 2022 - dl.acm.org
We examine touchscreen stroke-gestures and mid-air motion-gestures articulated by users
with upper-body motor impairments with devices worn on the wrist, finger, and head. We …

HeadCross: Exploring head-based crossing selection on head-mounted displays

Y Yan, Y Shi, C Yu, Y Shi - Proceedings of the ACM on interactive …, 2020 - dl.acm.org
We propose HeadCross, a head-based interaction method to select targets on VR and AR
head-mounted displays (HMDs). Using HeadCross, users control the pointer with head …
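A minimal sketch of boundary-crossing selection of this kind, assuming a circular target and a regularly sampled 2D pointer trace; the in-then-out criterion and frame gap are illustrative, not HeadCross's exact design:

def crossed_in_then_out(samples, center, radius, max_gap=15):
    # samples: iterable of (x, y) pointer positions at a fixed frame rate.
    # Reports a selection once the pointer has entered the circular target
    # and left it again within max_gap frames, i.e. one continuous motion.
    last_inside = None
    for i, (x, y) in enumerate(samples):
        inside = (x - center[0]) ** 2 + (y - center[1]) ** 2 <= radius ** 2
        if inside:
            last_inside = i
        elif last_inside is not None and i - last_inside <= max_gap:
            return True
    return False

The quick in-and-out requirement is one simple way to separate deliberate crossings from the pointer drifting over targets during natural head movement.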

GazeDock: Gaze-only menu selection in virtual reality using auto-triggering peripheral menu

X Yi, Y Lu, Z Cai, Z Wu, Y Wang… - 2022 IEEE Conference …, 2022 - ieeexplore.ieee.org
Gaze-only input techniques in VR face the challenge of avoiding false triggering due to
continuous eye tracking while maintaining interaction performance. In this paper, we …
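A common way to suppress such false triggering is hysteresis on gaze eccentricity: the menu appears only when gaze moves far enough toward the periphery and hides once it returns well inside. A sketch under assumed angle values, not GazeDock's tuned parameters:

def update_menu_visibility(gaze_ecc_deg, visible, show_at=25.0, hide_at=18.0):
    # Show the peripheral menu when gaze eccentricity exceeds show_at;
    # hide it only after gaze returns inside hide_at. The gap between the
    # two thresholds suppresses flicker from small gaze fluctuations.
    if not visible and gaze_ecc_deg >= show_at:
        return True
    if visible and gaze_ecc_deg <= hide_at:
        return False
    return visible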

“I Don't Want People to Look At Me Differently”: Designing User-Defined Above-the-Neck Gestures for People with Upper Body Motor Impairments

X Zhao, M Fan, T Han - Proceedings of the 2022 CHI Conference on …, 2022 - dl.acm.org
Recent research has proposed eyelid gestures for people with upper-body motor impairments
(UMI) to interact with smartphones without finger touch. However, such eyelid gestures were …

Resolving target ambiguity in 3D gaze interaction through VOR depth estimation

D Mardanbegi, T Langlotz, H Gellersen - … of the 2019 CHI Conference on …, 2019 - dl.acm.org
Target disambiguation is a common problem in gaze interfaces, as eye tracking has
accuracy and precision limitations. In 3D environments this is compounded by objects …
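The geometric intuition behind VOR-based depth estimation: while fixating a target during head rotation, the eye counter-rotates with a gain of roughly 1 + r/d, where r is the offset from the head's rotation axis to the eye and d is the target depth, so d can be recovered as r / (gain - 1). A simplified sketch with an assumed offset, not the paper's calibrated model:

def vor_depth_m(eye_vel_deg_s, head_vel_deg_s, eye_offset_m=0.10):
    # VOR gain: eye-in-head angular velocity over head angular velocity.
    # Small-angle model: gain ~ 1 + r/d, hence d ~ r / (gain - 1).
    gain = eye_vel_deg_s / head_vel_deg_s
    if gain <= 1.0:
        return float("inf")  # gain near 1 means the target is very far
    return eye_offset_m / (gain - 1.0)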

Head and eye egocentric gesture recognition for human-robot interaction using eyewear cameras

J Marina-Miranda, VJ Traver - IEEE Robotics and Automation …, 2022 - ieeexplore.ieee.org
Non-verbal communication plays a particularly important role in a wide range of scenarios in
Human-Robot Interaction (HRI). Accordingly, this work addresses the problem of human …