Co-Located Human–Human Interaction Analysis Using Nonverbal Cues: A Survey

C Beyan, A Vinciarelli, A Del Bue - ACM Computing Surveys, 2023 - dl.acm.org
Automated co-located human–human interaction analysis has been addressed by the use of
nonverbal communication as measurable evidence of social and psychological phenomena …

Temporal analysis of multimodal data to predict collaborative learning outcomes

JK Olsen, K Sharma, N Rummel… - British Journal of …, 2020 - Wiley Online Library
The analysis of multiple data streams is a long‐standing practice within educational
research. Both multimodal data analysis and temporal analysis have been applied …

Classroom digital twins with instrumentation-free gaze tracking

K Ahuja, D Shah, S Pareddy, F Xhakaj, A Ogan… - Proceedings of the …, 2021 - dl.acm.org
Classroom sensing is an important and active area of research with great potential to
improve instruction. Complementing professional observers – the current best practice …

MultiMediate: Multi-modal group behaviour analysis for artificial mediation

P Müller, M Dietz, D Schiller, D Thomas… - Proceedings of the 29th …, 2021 - dl.acm.org
Artificial mediators show promise for supporting human group conversations, but at present their
abilities are limited by insufficient progress in group behaviour analysis. The MultiMediate …

Automated detection of joint attention and mutual gaze in free play parent-child interactions

P Li, H Lu, RW Poppe, AA Salah - Companion Publication of the 25th …, 2023 - dl.acm.org
Observing a child's interaction with their parents can provide us with important information
about the child's cognitive development. Nonverbal cues such as joint attention and mutual …

Predicting gaze from egocentric social interaction videos and IMU data

SK Thakur, C Beyan, P Morerio, A Del Bue - Proceedings of the 2021 …, 2021 - dl.acm.org
Gaze prediction in egocentric videos is a fairly new research topic, which might have several
applications for assistive technology (e.g., supporting blind people in their daily interactions) …

Multiparty visual co-occurrences for estimating personality traits in group meetings

L Zhang, I Bhattacharya, M Morgan… - Proceedings of the …, 2020 - openaccess.thecvf.com
Participants' body language during interactions with others in a group meeting can reveal
important information about their individual personalities, as well as their contribution to a …

A multi-stream recurrent neural network for social role detection in multiparty interactions

L Zhang, RJ Radke - IEEE Journal of Selected Topics in Signal …, 2020 - ieeexplore.ieee.org
Understanding multiparty human interaction dynamics is a challenging problem involving
multiple data modalities and complex ordered interactions between multiple people. We …

Classifying the emotional speech content of participants in group meetings using convolutional long short-term memory network

MM Morgan, I Bhattacharya, RJ Radke… - The Journal of the …, 2021 - pubs.aip.org
Emotion is a central component of verbal communication between humans. Due to
advances in machine learning and the development of affective computing, automatic …

Multi-scale Conformer Fusion Network for Multi-participant Behavior Analysis

Q Song, R Dian, B Sun, J Xie, S Li - Proceedings of the 31st ACM …, 2023 - dl.acm.org
Understanding and elucidating human behavior across diverse scenarios represents a
pivotal research challenge in pursuing seamless human-computer interaction. However …