Co-Located Human–Human Interaction Analysis Using Nonverbal Cues: A Survey
Automated co-located human–human interaction analysis has been addressed by the use of
nonverbal communication as measurable evidence of social and psychological phenomena …
Temporal analysis of multimodal data to predict collaborative learning outcomes
The analysis of multiple data streams is a long‐standing practice within educational
research. Both multimodal data analysis and temporal analysis have been applied …
Classroom digital twins with instrumentation-free gaze tracking
Classroom sensing is an important and active area of research with great potential to
improve instruction. Complementing professional observers, the current best practice …
Multimediate: Multi-modal group behaviour analysis for artificial mediation
Artificial mediators are promising to support human group conversations but at present their
abilities are limited by insufficient progress in group behaviour analysis. The MultiMediate …
Automated detection of joint attention and mutual gaze in free play parent-child interactions
Observing a child's interaction with their parents can provide us with important information
about the child's cognitive development. Nonverbal cues such as joint attention and mutual …
Predicting gaze from egocentric social interaction videos and IMU data
Gaze prediction in egocentric videos is a fairly new research topic, which might have several
applications for assistive technology (e.g., supporting blind people in their daily interactions) …
Multiparty visual co-occurrences for estimating personality traits in group meetings
L Zhang, I Bhattacharya, M Morgan… - Proceedings of the …, 2020 - openaccess.thecvf.com
Participants' body language during interactions with others in a group meeting can reveal
important information about their individual personalities, as well as their contribution to a …
A multi-stream recurrent neural network for social role detection in multiparty interactions
Understanding multiparty human interaction dynamics is a challenging problem involving
multiple data modalities and complex ordered interactions between multiple people. We …
Classifying the emotional speech content of participants in group meetings using convolutional long short-term memory network
MM Morgan, I Bhattacharya, RJ Radke… - The Journal of the …, 2021 - pubs.aip.org
Emotion is a central component of verbal communication between humans. Due to
advances in machine learning and the development of affective computing, automatic …
Multi-scale Conformer Fusion Network for Multi-participant Behavior Analysis
Understanding and elucidating human behavior across diverse scenarios represents a
pivotal research challenge in pursuing seamless human-computer interaction. However …