StressID: a multimodal dataset for stress identification

Chaptoukaev, Hava; Strizhkova, Valeriya; Panariello, Michele; D’Alpaos, Bianca; Reka, Aglind; Manera, Valeria; Thümmler, Susanne; Ismailova, Esma; Evans, Nicholas; Bremond, François; Todisco, Massimiliano; Zuluaga, Maria A.; Ferrari, Laura M.
NeurIPS 2023, 37th Conference on Neural Information Processing Systems, 11-16 December 2023, New Orleans, USA

NeurIPS 2023 Scholar Award

StressID is a new dataset specifically designed for stress identification from unimodal and multimodal data. It contains videos of facial expressions, audio recordings, and physiological signals. The video and audio recordings are acquired using an RGB camera with an integrated microphone. The physiological data is composed of electrocardiography (ECG), electrodermal activity (EDA), and respiration signals that are recorded and monitored using a wearable device. This experimental setup ensures synchronized, high-quality multimodal data collection. Different stress-inducing stimuli, such as emotional video clips, cognitive tasks including mathematical or comprehension exercises, and public speaking scenarios, are designed to trigger a diverse range of emotional responses. The final dataset consists of recordings from 65 participants who performed 11 tasks, as well as their ratings of perceived relaxation, stress, arousal, and valence levels.
StressID is one of the largest datasets for stress identification that features three different sources of data and varied classes of stimuli, representing more than 39 hours of annotated data in total. StressID offers baseline models for stress classification, including cleaning, feature extraction, and classification phases for each modality. Additionally, we provide multimodal predictive models combining video, audio, and physiological inputs. The data and the code for the baselines are available at https://project.inria.fr/stressid/.
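A common way to combine per-modality classifiers like those described above is late fusion, where each modality produces a stress probability and the scores are averaged before thresholding. The sketch below is purely illustrative (the function and variable names are hypothetical, not taken from the StressID baseline code):

```python
import numpy as np

def fuse_predictions(prob_video, prob_audio, prob_physio, threshold=0.5):
    """Hypothetical late-fusion rule: average per-modality stress
    probabilities, then apply a decision threshold (1 = stressed)."""
    probs = np.stack([prob_video, prob_audio, prob_physio], axis=0)
    fused = probs.mean(axis=0)  # element-wise mean across modalities
    return (fused >= threshold).astype(int)

# Toy scores for four samples from three modalities.
video = np.array([0.9, 0.2, 0.6, 0.4])
audio = np.array([0.8, 0.1, 0.7, 0.3])
physio = np.array([0.7, 0.3, 0.5, 0.2])
print(fuse_predictions(video, audio, physio))  # [1 0 1 0]
```

Averaging is only one of many fusion strategies; learned fusion layers or weighted combinations are equally plausible under this setup.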


HAL
Type:
Conference
City:
New Orleans
Date:
2023-12-11
Department:
Data Science
Eurecom Ref:
7455
Copyright:
© NIST. Personal use of this material is permitted. The definitive version of this paper was published in NeurIPS 2023, 37th Conference on Neural Information Processing Systems, 11-16 December 2023, New Orleans, USA and is available at:

PERMALINK : https://www.eurecom.fr/publication/7455