

DREAMER: Emotion recognition dataset
In this work, we present DREAMER, a multi-modal database consisting of electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation by means of audio-visual stimuli. Signals from 23 participants were recorded along with the participants' self-assessment of their affective state after each stimulus, in terms of valence, arousal, and dominance. All signals were captured using portable, wearable, wireless, low-cost, off-the-shelf equipment that has the potential to allow the use of affective computing methods in everyday applications. A baseline for participant-wise affect recognition using EEG- and ECG-based features, as well as their fusion, was established through supervised classification experiments using Support Vector Machines (SVMs). The participants' self-assessments were evaluated by comparison with the self-assessments from another study that used the same audio-visual stimuli. Classification results for valence, arousal, and dominance on the proposed database are comparable to those achieved on other databases that used non-portable, expensive, medical-grade devices. These results indicate the potential of low-cost devices for affect recognition applications. The proposed database will be made publicly available to allow researchers a more thorough evaluation of the suitability of these capturing devices for affect recognition applications.
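The sketch below illustrates the kind of participant-wise SVM baseline described above, using scikit-learn. It is not the authors' implementation: the feature extraction step and the DREAMER file layout are omitted, and the arrays `eeg_features`, `ecg_features`, and `valence_labels` are hypothetical placeholders standing in for one participant's pre-computed features and binarised self-assessment ratings. Fusion is shown here as simple feature concatenation, which is one common choice rather than the method prescribed by the paper.

```python
# Minimal sketch of a per-participant SVM affect-recognition baseline.
# All data below is randomly generated for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data: one participant, 18 film clips x feature dimensions.
eeg_features = rng.normal(size=(18, 42))       # e.g. spectral power per EEG channel/band
ecg_features = rng.normal(size=(18, 12))       # e.g. heart-rate / HRV features
valence_labels = rng.integers(0, 2, size=18)   # ratings thresholded into low/high valence

def svm_baseline(features, labels):
    """Cross-validated SVM accuracy for a single participant's clips."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    return cross_val_score(clf, features, labels, cv=3).mean()

# Single-modality baselines and a simple feature-level fusion (concatenation).
print("EEG only:", svm_baseline(eeg_features, valence_labels))
print("ECG only:", svm_baseline(ecg_features, valence_labels))
print("Fusion  :", svm_baseline(np.hstack([eeg_features, ecg_features]), valence_labels))
```

The same procedure would be repeated for arousal and dominance labels and averaged across participants to obtain database-level baseline figures.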
Download DREAMER
Relevant projects
Relevant publications
- S. Katsigiannis and N. Ramzan, "DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-Cost Off-the-Shelf Devices," IEEE Journal of Biomedical and Health Informatics, vol. 22, no. 2, pp. 98-107, 2018. doi: 10.1109/JBHI.2017.2688239