Sensor-based Data Fusion for Multimodal Affect Detection in Game-based Learning Environments

NL Henderson, JP Rowe, BW Mott, JC Lester - EDM (workshops), 2019 - ceur-ws.org
Abstract
Affect detection is central to educational data mining because of its potential contribution to predicting learning processes and outcomes. Using multiple modalities has been shown to increase the performance of affect detection. With the rise of sensor-based modalities due to their relatively low cost and high level of flexibility, there has been a marked increase in research efforts pertaining to sensor-based, multimodal systems for affective computing problems. In this paper, we demonstrate the impact that multimodal systems can have when using Microsoft Kinect-based posture data and electrodermal activity data for the analysis of affective states displayed by students engaged with a game-based learning environment. We compare the effectiveness of both support vector machines and deep neural networks as affect classifiers. Additionally, we evaluate different types of data fusion to determine which method for combining the separate modalities yields the highest classification rate. Results indicate that multimodal approaches outperform unimodal baseline classifiers, and feature-level concatenation offers the highest performance among the data fusion techniques.
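
Feature-level fusion, the best-performing technique reported in the abstract, amounts to concatenating each instance's per-modality feature vectors before classification. The sketch below illustrates the idea with scikit-learn and an SVM classifier, one of the two classifier families the paper compares. The feature dimensions, synthetic data, and label construction are assumptions made for illustration; they are not taken from the paper.

```python
# Minimal sketch of feature-level (early) fusion, assuming each labeled
# instance yields one posture feature vector and one EDA feature vector.
# All data, dimensions, and labels below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
posture = rng.normal(size=(n, 12))   # stand-in for Kinect posture features
eda = rng.normal(size=(n, 6))        # stand-in for electrodermal features
# Synthetic binary affect label that depends on both modalities, so that
# fusing them can outperform either modality alone.
labels = (posture[:, 0] + eda[:, 0] > 0).astype(int)

def evaluate(X, name):
    """Train an RBF-kernel SVM and report held-out accuracy."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.25, random_state=0)
    scaler = StandardScaler().fit(X_tr)
    clf = SVC(kernel="rbf").fit(scaler.transform(X_tr), y_tr)
    acc = accuracy_score(y_te, clf.predict(scaler.transform(X_te)))
    print(f"{name}: held-out accuracy = {acc:.3f}")

evaluate(posture, "posture only")                      # unimodal baseline
evaluate(eda, "EDA only")                              # unimodal baseline
evaluate(np.hstack([posture, eda]), "feature fusion")  # early fusion
```

In this sketch the fused classifier sees both modalities' signal and should beat either unimodal baseline, mirroring the abstract's finding that feature-level concatenation outperformed the unimodal classifiers.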