Interaction methods for smart glasses: A survey

LH Lee, P Hui - IEEE Access, 2018 - ieeexplore.ieee.org
Since the launch of Google Glass in 2014, smart glasses have mainly been designed to
support micro-interactions. The ultimate goal for them to become an augmented reality …

STREAM: Exploring the combination of spatially-aware tablets with augmented reality head-mounted displays for immersive analytics

S Hubenschmid, J Zagermann, S Butscher… - Proceedings of the …, 2021 - dl.acm.org
Recent research in the area of immersive analytics has demonstrated the utility of head-mounted
augmented reality devices for visual data analysis. However, it can be challenging to use the …

Designing an effective vibration-based notification interface for mobile phones

B Saket, C Prasojo, Y Huang, S Zhao - Proceedings of the 2013 …, 2013 - dl.acm.org
We conducted an experiment to understand how mobile phone users perceive the urgency
of ten simple vibration alerts that were created from four basic signals: short on, short off …
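
As a minimal, hypothetical sketch of how alerts built from basic on/off vibration segments can be expressed in code (the durations, pattern names, and use of Android's Vibrator/VibrationEffect API below are illustrative assumptions, not the study's implementation):

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Hypothetical primitive durations in milliseconds (illustrative values,
// not those evaluated in the paper).
const val SHORT_ON = 200L
const val SHORT_OFF = 200L
const val LONG_ON = 600L

// VibrationEffect.createWaveform() expects alternating off/on durations:
// [initial delay, vibrate, pause, vibrate, ...].
val gentleAlert = longArrayOf(0, SHORT_ON, SHORT_OFF, SHORT_ON)
val urgentAlert = longArrayOf(0, LONG_ON, SHORT_OFF, LONG_ON, SHORT_OFF, LONG_ON)

// Plays one alert pattern once (requires API level 26+ for VibrationEffect).
fun playAlert(context: Context, pattern: LongArray) {
    val vibrator = context.getSystemService(Vibrator::class.java) ?: return
    vibrator.vibrate(VibrationEffect.createWaveform(pattern, /* repeat = */ -1))
}
```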

BlindType: Eyes-free text entry on handheld touchpad by leveraging thumb's muscle memory

Y Lu, C Yu, X Yi, Y Shi, S Zhao - Proceedings of the ACM on Interactive …, 2017 - dl.acm.org
Eyes-free input is desirable for ubiquitous computing, since interacting with mobile and
wearable devices often competes for visual attention with other devices and tasks. In this …

FaceSight: Enabling hand-to-face gesture interaction on AR glasses with a downward-facing camera vision

Y Weng, C Yu, Y Shi, Y Zhao, Y Yan, Y Shi - Proceedings of the 2021 …, 2021 - dl.acm.org
We present FaceSight, a computer vision-based hand-to-face gesture sensing technique for
AR glasses. FaceSight fixes an infrared camera onto the bridge of AR glasses to provide …

Expanding exertion gaming

J Marshall, S Benford, S Pijnappel - International Journal of Human …, 2016 - Elsevier
While exertion games (digital games where the outcome is determined by physical exertion)
are of growing interest in HCI, we believe the current health and fitness focus in the research …

PalmGesture: Using palms as gesture interfaces for eyes-free input

CY Wang, MC Hsiu, PT Chiu, CH Chang… - Proceedings of the 17th …, 2015 - dl.acm.org
In this paper, we explored eyes-free gesture interactions on palms, which enable users to
interact with devices by drawing stroke gestures on palms without looking at them. We …

Understanding and adapting bezel-to-bezel interactions for circular smartwatches in mobile and encumbered scenarios

B Rey, K Zhu, ST Perrault, S Bardot, A Neshati… - Proceedings of the …, 2022 - dl.acm.org
Supporting eyes-free interaction, mobility and encumbrance, while providing a broad set of
commands on a smartwatch display is a difficult, yet important, task. Bezel-to-bezel (B2B) …

TIMMi: Finger-worn textile input device with multimodal sensing in mobile interaction

SH Yoon, K Huo, VP Nguyen, K Ramani - Proceedings of the Ninth …, 2015 - dl.acm.org
We introduce TIMMi, a textile input device for mobile interactions. TIMMi is worn on the index
finger to provide a multimodal sensing input metaphor. The prototype is fabricated on a …

Vibration-augmented buttons: Information transmission capacity and application to interaction design

C Park, J Kim, DG Kim, S Oh, S Choi - … of the 2022 CHI Conference on …, 2022 - dl.acm.org
One can embed a vibration actuator into a physical button and augment the physical button's
original kinesthetic response with a programmable vibration generated by the actuator. Such …
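
The last entry describes layering a programmable vibration on top of a button's own kinesthetic click. A rough, hypothetical sketch of that idea follows, using an Android on-screen Button and the Vibrator API as stand-ins for the paper's physical button with an embedded actuator; all timings and amplitudes are assumed for illustration:

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator
import android.widget.Button

// Hypothetical augmentation: superimpose a brief programmable vibration on
// each press, on top of the feedback the button itself provides.
// Timing and amplitude values are assumptions, not taken from the paper.
fun augmentButton(button: Button) {
    val vibrator = button.context.getSystemService(Vibrator::class.java) ?: return
    val clickEffect = VibrationEffect.createWaveform(
        longArrayOf(0, 15, 30, 15),   // off/on segment durations in ms
        intArrayOf(0, 255, 0, 128),   // per-segment amplitudes (0-255)
        -1                            // play once, do not repeat
    )
    button.setOnClickListener { vibrator.vibrate(clickEffect) }
}
```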