A systematic review on affective computing: Emotion models, databases, and recent advances

Y Wang, W Song, W Tao, A Liotta, D Yang, X Li, S Gao… - Information …, 2022 - Elsevier
Affective computing conjoins the research topics of emotion recognition and sentiment
analysis, and can be realized with unimodal or multimodal data, consisting primarily of …

A Comprehensive Review of Data‐Driven Co‐Speech Gesture Generation

S Nyatsanga, T Kucherenko, C Ahuja… - Computer Graphics …, 2023 - Wiley Online Library
Gestures that accompany speech are an essential part of natural and efficient embodied
human communication. The automatic generation of such co‐speech gestures is a long …

Cross-subject multimodal emotion recognition based on hybrid fusion

Y Cimtay, E Ekmekcioglu, S Caglar-Ozhan - IEEE Access, 2020 - ieeexplore.ieee.org
Multimodal emotion recognition has gained traction in the affective computing research
community to overcome the limitations posed by processing a single form of data and to …

MSP-IMPROV: An acted corpus of dyadic interactions to study emotion perception

C Busso, S Parthasarathy, A Burmania… - IEEE Transactions …, 2016 - ieeexplore.ieee.org
We present the MSP-IMPROV corpus, a multimodal emotional database, where the goal is
to have control over lexical content and emotion while also promoting naturalness in the …

A multimodal emotional human–robot interaction architecture for social robots engaged in bidirectional communication

A Hong, N Lunscher, T Hu, Y Tsuboi… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
For social robots to effectively engage in human–robot interaction (HRI), they need to be
able to interpret human affective cues and to respond appropriately via display of their own …

A survey on databases for multimodal emotion recognition and an introduction to the VIRI (visible and InfraRed image) database

MFH Siddiqui, P Dhakal, X Yang, AY Javaid - Multimodal Technologies …, 2022 - mdpi.com
Multimodal human–computer interaction (HCI) systems promise a more human-like
interaction between machines and humans. Their prowess in emanating an unambiguous …

Context-aware personality inference in dyadic scenarios: Introducing the udiva dataset

C Palmero, J Selva, S Smeureanu… - Proceedings of the …, 2021 - openaccess.thecvf.com
This paper introduces UDIVA, a new non-acted dataset of face-to-face dyadic interactions,
where interlocutors perform competitive and collaborative tasks with different behavior …

A multimodal facial emotion recognition framework through the fusion of speech with visible and infrared images

MFH Siddiqui, AY Javaid - Multimodal Technologies and Interaction, 2020 - mdpi.com
The exigency of emotion recognition is pushing the envelope for meticulous strategies of
discerning actual emotions through the use of superior multimodal techniques. This work …

An engineering view on emotions and speech: From analysis and predictive models to responsible human-centered applications

CC Lee, T Chaspari, EM Provost… - Proceedings of the …, 2023 - ieeexplore.ieee.org
The substantial growth of Internet-of-Things technology and the ubiquity of smartphone
devices have increased the public and industry focus on speech emotion recognition (SER) …

The MSP-conversation corpus

L Martinez-Lucas, M Abdelwahab, C Busso - Interspeech 2020, 2020 - par.nsf.gov
Human-computer interactions can be very effective, especially if computers can
automatically recognize the emotional state of the user. A key barrier for effective speech …