Eye-tracked virtual reality: a comprehensive survey on methods and privacy challenges

E Bozkir, S Özdel, M Wang, B David-John… - arXiv preprint arXiv …, 2023 - arxiv.org
Latest developments in computer hardware, sensor technologies, and artificial intelligence
can make virtual reality (VR) and virtual spaces an important part of human everyday life …

Investigating eyes-away mid-air typing in virtual reality using squeeze haptics-based postural reinforcement

A Gupta, N Sendhilnathan, J Hartcher-O'Brien… - Proceedings of the …, 2023 - dl.acm.org
In this paper, we investigate postural reinforcement haptics for mid-air typing using squeeze
actuation on the wrist. We propose and validate eye-tracking based objective metrics that …

RIDS: Implicit detection of a selection gesture using hand motion dynamics during freehand pointing in virtual reality

T Zhang, Z Hu, A Gupta, CH Wu, H Benko… - Proceedings of the 35th …, 2022 - dl.acm.org
Freehand interactions with augmented and virtual reality are growing in popularity, but they
lack reliability and robustness. Implicit behavior from users, such as hand or gaze …

Exploring Visualizations for Precisely Guiding Bare Hand Gestures in Virtual Reality

X Wang, B Lafreniere, J Zhao - Proceedings of the CHI Conference on …, 2024 - dl.acm.org
Bare hand interaction in augmented or virtual reality (AR/VR) systems, while intuitive, often
results in errors and frustration. However, existing methods, such as a static icon or a …

Neural network implementation of gaze-target prediction for human-robot interaction

V Somashekarappa, A Sayeed… - 2023 32nd IEEE …, 2023 - ieeexplore.ieee.org
Gaze cues, which initiate an action or behaviour, are necessary for a responsive and
intuitive interaction. Using gaze to signal intentions or request an action during conversation …

FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation

C Zhang, T Chen, E Shaffer, E Soltanaghai - Proceedings of the CHI …, 2024 - dl.acm.org
Gaze interaction presents a promising avenue in Virtual Reality (VR) due to its intuitive and
efficient user experience. Yet, the depth control inherent in our visual system remains …

Understanding How Blind Users Handle Object Recognition Errors: Strategies and Challenges

J Hong, H Kacorri - Proceedings of the 26th International ACM …, 2024 - dl.acm.org
Object recognition technologies hold the potential to support blind and low-vision people in
navigating the world around them. However, the gap between benchmark performances and …

GEARS: Generalizable Multi-Purpose Embeddings for Gaze and Hand Data in VR Interactions

P Hallgarten, N Sendhilnathan, T Zhang… - Proceedings of the …, 2024 - dl.acm.org
Machine learning models using users' gaze and hand data to encode user interaction
behavior in VR are often tailored to a single task and sensor set, limiting their applicability in …

XR Input Error Mediation for Hand-Based Input: Task and Context Influences a User's Preference

T Lin, B Lafreniere, Y Xu, T Grossman… - … on Mixed and …, 2023 - ieeexplore.ieee.org
Many XR devices use bare-hand gestures to reduce the need for handheld controllers. Such
gestures, however, lead to false positive and false negative recognition errors, which detract …

Explainable Interfaces for Rapid Gaze-Based Interactions in Mixed Reality

M Yu, D Harris, I Jones, T Zhang, Y Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Gaze-based interactions offer a potential way for users to naturally engage with mixed reality
(XR) interfaces. Black-box machine learning models enabled higher accuracy for gaze …