A survey on deep learning for human activity recognition

F Gu, MH Chung, M Chignell, S Valaee… - ACM Computing …, 2021 - dl.acm.org
Human activity recognition is key to many applications such as healthcare and smart
homes. In this study, we provide a comprehensive survey of recent advances and challenges …

First-person hand action benchmark with RGB-D videos and 3D hand pose annotations

G Garcia-Hernando, S Yuan… - Proceedings of the …, 2018 - openaccess.thecvf.com
In this work, we study the use of 3D hand poses to recognize first-person dynamic hand
actions that involve interaction with 3D objects. Towards this goal, we collected RGB-D video sequences …

A survey on activity detection and classification using wearable sensors

M Cornacchia, K Ozcan, Y Zheng… - IEEE Sensors …, 2016 - ieeexplore.ieee.org
Activity detection and classification are very important for autonomous monitoring of humans
in applications including assisted living, rehabilitation, and surveillance. Wearable sensors …

Trear: Transformer-based RGB-D egocentric action recognition

X Li, Y Hou, P Wang, Z Gao, M Xu… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
In this article, we propose a transformer-based RGB-D egocentric action recognition
framework, called Trear. It consists of two modules: 1) an interframe attention encoder and 2) …

Sensors in assisted living: A survey of signal and image processing methods

F Erden, S Velipasalar, AZ Alkar… - IEEE Signal Processing …, 2016 - ieeexplore.ieee.org
Our society will face a notable demographic shift in the near future. According to a United
Nations report, the ratio of the elderly population (aged 60 years or older) to the overall …

MECCANO: A multimodal egocentric dataset for humans behavior understanding in the industrial-like domain

F Ragusa, A Furnari, GM Farinella - Computer Vision and Image …, 2023 - Elsevier
Wearable cameras make it possible to acquire images and videos from the user's perspective. These
data can be processed to understand human behavior. Despite human behavior analysis …

Multi-stream deep neural networks for RGB-D egocentric action recognition

Y Tang, Z Wang, J Lu, J Feng… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
In this paper, we investigate the problem of RGB-D egocentric action recognition. Unlike
conventional human action videos that are passively recorded by static cameras, egocentric …

Adaptive feature processing for robust human activity recognition on a novel multi-modal dataset

M Moencks, V De Silva, J Roche, A Kondoz - arXiv preprint arXiv …, 2019 - arxiv.org
Human Activity Recognition (HAR) is a key building block of many emerging applications
such as intelligent mobility, sports analytics, ambient-assisted living and human-robot …

Action recognition in RGB-D egocentric videos

Y Tang, Y Tian, J Lu, J Feng… - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
In this paper, we investigate the problem of action recognition in RGB-D egocentric videos.
These self-generated and embodied videos provide richer semantic cues than the …

Egocentric scene understanding via multimodal spatial rectifier

T Do, K Vuong, HS Park - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
In this paper, we study the problem of egocentric scene understanding, i.e., predicting depth
and surface normals from an egocentric image. Egocentric scene understanding poses …