
REFED: A Subject Real-time Dynamic Labeled EEG-fNIRS Synchronized Recorded Emotion Dataset

About REFED

REFED is an affective brain-computer interface (aBCI) dataset that integrates multimodal brain signals with real-time dynamic emotion annotation, filling a critical gap in the study of the neural mechanisms underlying the dynamic evolution of emotion and in the development of highly reliable aBCI models. By synchronously acquiring EEG and fNIRS signals, REFED enables joint observation of neuroelectrical activity and hemodynamic responses during emotion elicitation, providing unique data support for exploring emotion-related neurovascular coupling mechanisms. In addition, the dynamic valence and arousal annotations, based on subjects' subjective reports, temporally align the brain signals with changes in emotional state, which significantly improves the temporal modeling capability of emotion recognition models. Experimental validation shows that the dataset meets accepted standards for emotion elicitation validity and labeling reliability, and that the multimodal signal features correlate significantly with the dynamic labels. The open sharing of REFED will promote cross-modal analysis of the neural representations underlying the dynamic encoding of emotion, and lays an important foundation for moving affective computing and brain-computer interfaces toward dynamic interaction paradigms with higher ecological validity.

Data overview

REFED/
├── README.md
├── Metadata.csv
├── SAM.csv
├── PANAS.csv
├── fNIRS_reservations.csv
├── fNIRS_coordinate.csv
├── Video_info.csv
├── data/
│   ├── 1/
│   │   ├── EEG_baselines.mat
│   │   ├── EEG_videos.mat
│   │   ├── fNIRS_baselines.mat
│   │   └── fNIRS_videos.mat
│   ├── 2/
│   │   └── ...
│   └── 32/
│       └── ...
└── annotations/
    ├── 1_label.mat
    ├── 2_label.mat
    └── ... (up to 32_label.mat)

📄 README.md

Documentation for using the dataset, including background, structure, data collection procedure, license agreement, etc.

📄 Metadata.csv

Basic subject information (ID, gender, age, health status, etc.).

📄 SAM.csv

Post-trial subjective emotion ratings (valence and arousal) based on the Self-Assessment Manikin (SAM) scale.

📄 PANAS.csv

Positive and negative affect self-assessment results before and after the experiment, based on the Positive and Negative Affect Schedule (PANAS).

📄 fNIRS_reservations.csv

fNIRS data acquisition logs, including bad channel markers.

📄 fNIRS_coordinate.csv

3D coordinates of the fNIRS channels for alignment and spatial analysis.

📄 Video_info.csv

Information on the 15 emotional video stimuli.
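For a quick orientation, the top-level CSV files can be inspected with pandas. The sketch below only assumes the file names listed above; it prints each file's shape and column names rather than assuming any particular schema.

import pandas as pd

csv_files = ["Metadata.csv", "SAM.csv", "PANAS.csv",
             "fNIRS_reservations.csv", "fNIRS_coordinate.csv", "Video_info.csv"]
for name in csv_files:
    df = pd.read_csv(f"REFED/{name}")  # adjust the path to where the dataset is stored
    print(name, df.shape, list(df.columns))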

πŸ“ data/

Folders of data for subjects numbered 1 through 32, each containing the following raw brain-signal files:

Subdirectory structure:

data/{subject_id}/
├── EEG_baselines.mat       # Resting-state EEG signal (baseline phase)
├── EEG_videos.mat          # EEG signal from the emotion-elicitation phase (video stimulation)
├── fNIRS_baselines.mat     # Resting-state fNIRS signal (baseline phase)
└── fNIRS_videos.mat        # fNIRS signal from the emotion-elicitation phase (video stimulation)
  • Each subject has a numbered folder, for a total of 32 subjects.
  • EEG sampling rate: 1000 Hz; fNIRS sampling rate: 47.62 Hz.
  • fNIRS provides six signal types: HbO, HbR, HbT, Abs 780 nm, Abs 805 nm, Abs 830 nm.
  • Each .mat file stores multi-trial time-series data, one variable per video (e.g., video_1), with 15 videos in total.
  • EEG arrays are shaped channels × time; fNIRS arrays are shaped signal type × channels × time. (A loading sketch follows this list.)
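Below is a minimal sketch of how one subject's raw signals might be loaded in Python with SciPy. The per-trial variable names (video_1 … video_15) follow the description above but are assumptions; verify them against the actual .mat keys before use.

from scipy.io import loadmat

subject = 1
eeg = loadmat(f"REFED/data/{subject}/EEG_videos.mat")
fnirs = loadmat(f"REFED/data/{subject}/fNIRS_videos.mat")

# List the trial variables (MATLAB header fields start with "__").
print([k for k in eeg.keys() if not k.startswith("__")])

eeg_trial = eeg["video_1"]      # expected shape: (channels, time), sampled at 1000 Hz
fnirs_trial = fnirs["video_1"]  # expected shape: (signal_type, channels, time), sampled at 47.62 Hz
print(eeg_trial.shape, fnirs_trial.shape)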

πŸ“ annotations/

Stores each subject's real-time dynamic emotion labels recorded while viewing the emotional videos:

annotations/
├── 1_label.mat
├── 2_label.mat
└── ... (up to 32_label.mat)
  • Each *_label.mat file contains the subject's joystick-based subjective emotion labels, covering both the valence and arousal dimensions.
  • The annotation sequences, which record changes in valence and arousal over time, are precisely aligned to the brain signals (shaped 2 × time); see the loading sketch after this list.
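A minimal sketch of loading one subject's dynamic labels and summarizing them over fixed windows is shown below. The variable name inside the *_label.mat file is not documented here, so the sketch takes the first non-header array; the label sampling rate is likewise an assumption and should be checked against the accompanying documentation.

import numpy as np
from scipy.io import loadmat

subject = 1
mat = loadmat(f"REFED/annotations/{subject}_label.mat")
key = [k for k in mat if not k.startswith("__")][0]  # hypothetical: first data variable in the file
labels = np.asarray(mat[key])                        # described as shape (2, time): valence, arousal
valence, arousal = labels[0], labels[1]

# Example: average the labels over non-overlapping 1-second windows,
# assuming the label stream is sampled at `fs` Hz (adjust to the actual rate).
fs = 1000  # assumption
n_win = labels.shape[1] // fs
win_labels = labels[:, :n_win * fs].reshape(2, n_win, fs).mean(axis=2)
print(win_labels.shape)  # (2, number of 1-second windows)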

Collection

To enable real-time annotation of subjects' emotional states and automated control of the entire experimental procedure, we also developed a real-time annotation and control system, shown in the figure below.

[Figure 2: real-time annotation and control system]

The channel layout of the joint EEG-fNIRS acquisition is shown in the figure below.

[Figure 1: channel layout of the joint EEG-fNIRS acquisition]

Citation

@inproceedings{ning2025refed,
  title={{REFED}: A Subject Real-time Dynamic Labeled {EEG}-f{NIRS} Synchronized Recorded Emotion Dataset},
  author={Xiaojun Ning and Jing Wang and Zhiyang Feng and Tianzuo Xin and Shuo Zhang and Shaoqi Zhang and Zheng Lian and Yi Ding and Youfang Lin and Ziyu Jia},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2025},
  url={https://openreview.net/forum?id=C4IqLzavel}
}

License

Publicly available under the CC BY-NC-SA license; users are granted direct access after confirming that the dataset will be used for non-commercial research purposes.

CC-BY-NC-SA
