Deep learning in human activity recognition with wearable sensors: A review on advances

S Zhang, Y Li, S Zhang, F Shahabi, S Xia, Y Deng… - Sensors, 2022 - mdpi.com
Mobile and wearable devices have enabled numerous applications, including activity
tracking, wellness monitoring, and human–computer interaction, that measure and improve …

Toward storytelling from visual lifelogging: An overview

M Bolanos, M Dimiccoli… - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
Visual lifelogging consists of acquiring images that capture the daily experiences of the user,
who wears a camera over a long period of time. The pictures taken offer considerable …

Socratic models: Composing zero-shot multimodal reasoning with language

A Zeng, M Attarian, B Ichter, K Choromanski… - arXiv preprint arXiv …, 2022 - arxiv.org
Large pretrained (e.g., "foundation") models exhibit distinct capabilities depending on the
domain of data they are trained on. While these domains are generic, they may only barely …

Ego4D: Around the world in 3,000 hours of egocentric video

K Grauman, A Westbury, E Byrne… - Proceedings of the …, 2022 - openaccess.thecvf.com
We introduce Ego4D, a massive-scale egocentric video dataset and benchmark suite. It
offers 3,670 hours of daily-life activity video spanning hundreds of scenarios (household …

H2O: Two hands manipulating objects for first person interaction recognition

T Kwon, B Tekin, J Stühmer, F Bogo… - Proceedings of the …, 2021 - openaccess.thecvf.com
We present a comprehensive framework for egocentric interaction recognition using
markerless 3D annotations of two hands manipulating objects. To this end, we propose a …

The EPIC-KITCHENS dataset: Collection, challenges and baselines

D Damen, H Doughty, GM Farinella… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org
Since its introduction in 2018, EPIC-KITCHENS has attracted attention as the largest
egocentric video benchmark, offering a unique viewpoint on people's interaction with …

H+O: Unified egocentric recognition of 3D hand-object poses and interactions

B Tekin, F Bogo, M Pollefeys - Proceedings of the IEEE/CVF …, 2019 - openaccess.thecvf.com
We present a unified framework for understanding 3D hand and object interactions in raw
image sequences from egocentric RGB cameras. Given a single RGB image, our model …

A survey on activity detection and classification using wearable sensors

M Cornacchia, K Ozcan, Y Zheng… - IEEE Sensors …, 2016 - ieeexplore.ieee.org
Activity detection and classification are very important for autonomous monitoring of humans
for applications, including assistive living, rehabilitation, and surveillance. Wearable sensors …

Lending a hand: Detecting hands and recognizing activities in complex egocentric interactions

S Bambach, S Lee, DJ Crandall… - Proceedings of the IEEE …, 2015 - openaccess.thecvf.com
Hands appear very often in egocentric video, and their appearance and pose give important
cues about what people are doing and what they are paying attention to. But existing work in …

Multi-class open set recognition using probability of inclusion

LP Jain, WJ Scheirer, TE Boult - … September 6-12, 2014, Proceedings, Part …, 2014 - Springer
The perceived success of recent visual recognition approaches has largely been derived
from their performance on classification tasks, where all possible classes are known at …