Text input for non-stationary XR workspaces: investigating tap and word-gesture keyboards in virtual and augmented reality
F Kern, F Niebling, ME Latoschik - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
This article compares two state-of-the-art text input techniques between non-stationary
virtual reality (VR) and video see-through augmented reality (VST AR) use-cases as XR …
DRG-Keyboard: Enabling subtle gesture typing on the fingertip with dual IMU rings
C Liang, C Hsia, C Yu, Y Yan, Y Wang… - Proceedings of the ACM on …, 2023 - dl.acm.org
We present DRG-Keyboard, a gesture keyboard enabled by dual IMU rings, allowing the
user to swipe the thumb on the index fingertip to perform word gesture typing as if typing on …
GlanceWriter: Writing text by glancing over letters with gaze
W Cui, R Liu, Z Li, Y Wang, A Wang, X Zhao… - Proceedings of the …, 2023 - dl.acm.org
Writing text with eye gaze only is an appealing hands-free text entry method. However,
existing gaze-based text entry methods introduce eye fatigue and are slow in typing speed …
Gaze speedup: eye gaze assisted gesture typing in virtual reality
M Zhao, AM Pierce, R Tan, T Zhang, T Wang… - Proceedings of the 28th …, 2023 - dl.acm.org
Mid-air text input in augmented or virtual reality (AR/VR) is an open problem. One proposed
solution is gesture typing where the user performs a gesture trace over the keyboard …
EyeSayCorrect: Eye gaze and voice based hands-free text correction for mobile devices
M Zhao, H Huang, Z Li, R Liu, W Cui… - Proceedings of the 27th …, 2022 - dl.acm.org
Text correction on mobile devices usually requires precise and repetitive manual control. In
this paper, we present EyeSayCorrect, an eye gaze and voice based hands-free text …
Design and evaluation of a silent speech-based selection method for eye-gaze pointing
L Pandey, AS Arif - Proceedings of the ACM on Human-Computer …, 2022 - dl.acm.org
We investigate silent speech as a hands-free selection method in eye-gaze pointing. We first
propose a stripped-down image-based model that can recognize a small number of silent …
Hummer: Text entry by gaze and hum
R Hedeshy, C Kumar, R Menges, S Staab - Proceedings of the 2021 CHI …, 2021 - dl.acm.org
Text entry by gaze is a useful means of hands-free interaction that is applicable in settings
where dictation suffers from poor voice recognition or where spoken words and sentences …
HMM-based gesture recognition for eye-swipe typing
M Mifsud, TA Camilleri, KP Camilleri - Biomedical Signal Processing and …, 2023 - Elsevier
Eye-swipe typing requires users to simply look in the vicinity of the keys forming the desired
word, similar to swiping their finger on a touch screen device. This work presents a novel …
A one-point calibration design for hybrid eye typing interface
Z Zeng, ES Neuer, M Roetting… - International Journal of …, 2023 - Taylor & Francis
We present an eye typing interface with one-point calibration, which is a two-stage design.
The characters are clustered in groups of four characters. Users select a cluster by gazing at …
BayesGaze: A Bayesian approach to eye-gaze based target selection
Z Li, M Zhao, Y Wang, S Rashidian, F Baig… - Proceedings …, 2021 - ncbi.nlm.nih.gov
Selecting targets accurately and quickly with eye-gaze input remains an open research
question. In this paper, we introduce BayesGaze, a Bayesian approach of determining the …