The Harvard EEG Database will encompass data gathered from four hospitals affiliated with Harvard University: Massachusetts General Hospital (MGH), Brigham and Women's Hospital (BWH), Beth Israel Deaconess Medical Center (BIDMC), and Boston Children's Hospital (BCH).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
'Univ. of Bonn' and 'CHB-MIT Scalp EEG Database' are publicly available datasets and are among the most sought after by researchers. The Bonn dataset is very small compared to CHB-MIT, but researchers still prefer Bonn because it comes in a simple '.txt' format. The dataset published here is a preprocessed form of CHB-MIT, available in '.csv' format.
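Since the preprocessed files are plain '.csv', a spreadsheet-style loader is all that is needed to get started. A minimal sketch with pandas, assuming a hypothetical file name (check the dataset listing for the actual names and column layout):

```python
# Minimal sketch: load one preprocessed CHB-MIT CSV file with pandas.
# "chb01_preprocessed.csv" is a hypothetical file name.
import pandas as pd

df = pd.read_csv("chb01_preprocessed.csv")
print(df.shape)   # (n_samples, n_columns) -- see the dataset docs for the layout
print(df.head())
```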
THIS RESOURCE IS NO LONGER IN SERVICE. Documented on November 22, 2022. Data set collected at the Children's Hospital Boston, of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention. All signals were sampled at 256 samples per second with 16-bit resolution. Most files contain 23 EEG signals (24 or 26 in a few cases).
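For readers who hold a local copy, such a recording can be inspected with MNE-Python; the file name below is hypothetical and follows the database's chbXX_YY naming only as an assumption:

```python
# Sketch: inspect one CHB-MIT .edf recording with MNE-Python.
import mne

raw = mne.io.read_raw_edf("chb01_01.edf", preload=False)  # hypothetical path
print(raw.info["sfreq"])   # expected: 256.0 (samples per second)
print(len(raw.ch_names))   # most files: 23 EEG signals
```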
THIS RESOURCE IS NO LONGER IN SERVICE. Documented on April 29, 2025. Electroencephalogram (EEG) data recorded from invasive and scalp electrodes. The EEG database contains invasive EEG recordings of 21 patients suffering from medically intractable focal epilepsy. The data were recorded during invasive pre-surgical epilepsy monitoring at the Epilepsy Center of the University Hospital of Freiburg, Germany. In eleven patients, the epileptic focus was located in neocortical brain structures, in eight patients in the hippocampus, and in two patients in both. In order to obtain a high signal-to-noise ratio and fewer artifacts, and to record directly from focal areas, intracranial grid, strip, and depth electrodes were utilized. The EEG data were acquired using a Neurofile NT digital video EEG system with 128 channels, a 256 Hz sampling rate, and a 16-bit analogue-to-digital converter. Notch or band-pass filters have not been applied. For each of the patients, there are datasets called ictal and interictal: the former contains files with epileptic seizures and at least 50 min of pre-ictal data; the latter contains approximately 24 hours of EEG recordings without seizure activity. At least 24 h of continuous interictal recordings are available for 13 patients. For the remaining patients, interictal invasive EEG recordings of less than 24 h were joined together to end up with at least 24 h per patient. An interdisciplinary project between: * Epilepsy Center, University Hospital Freiburg * Bernstein Center for Computational Neuroscience (BCCN), Freiburg * Freiburg Center for Data Analysis and Modeling (FDM).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Artificial intelligence (AI) based automated epilepsy diagnosis aims to ease the burden of manual detection, prediction, and management of seizure and epilepsy-specific EEG signals for medical specialists. With the growing number of large, open-source raw EEG databases, there is a need for standardized, patient- and seizure-sensitive data for AI analysis with reduced redundant information. This work releases balanced, annotated, fixed-time-and-length metadata for the Siena Scalp EEG Database v1.0.0.
The work releases patient inter-specific and patient non-specific EEG data extracted using the specific time stamps of ictal, pre-ictal, post-ictal and peri-ictal EEG provided in the original database's annotations. Further details of this metadata can be found in the provided csv file (Siena DB timestamp.csv). The released EEG data is available in csv format, and class labels are provided in the last row of the csv files. PN00-3 has not been included in this database.
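Given the stated layout (class labels in the last row of each csv), a minimal sketch for separating signals from labels might look as follows; the file name is hypothetical:

```python
# Sketch: split one released CSV into signal rows and the trailing label row.
import numpy as np
import pandas as pd

arr = pd.read_csv("PN01_segment.csv", header=None).to_numpy()  # hypothetical name
signals = arr[:-1, :]   # all rows except the last are EEG samples
labels = arr[-1, :]     # last row holds the class labels
print(signals.shape, np.unique(labels))
```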
Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
License information was derived automatically
Dataset
Synthetic EEG data generated by the ‘bai’ model based on real data.
Features/Columns:
No: "Number" Sex: "Gender" Age: "Age of participants" EEG Date: "The date of the EEG" Education: "Education level" IQ: "IQ level of participants" Main Disorder: "General class definition of the disorder" Specific Disorder: "Specific class definition of the disorder"
Total Features/Columns: 1140
Content:
Obsessive Compulsive Disorder, Bipolar Disorder, Schizophrenia… See the full description on the dataset page: https://huggingface.co/datasets/Neurazum/General-Disorders-EEG-Dataset-v1.
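The repository id in the URL above can be passed straight to the `datasets` library; the split name here is an assumption:

```python
# Sketch: load the dataset from the Hugging Face Hub.
from datasets import load_dataset

ds = load_dataset("Neurazum/General-Disorders-EEG-Dataset-v1", split="train")  # split assumed
print(ds.features)
```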
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Source, raw and preprocessed EEG data, resting state EEG data, image set, DNN feature maps and code of the paper: "A large and rich EEG dataset for modeling human visual object recognition".
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
EEG signals with 256 channels captured from 11 subjects executing an SSVEP-based experimental protocol. Five different frequencies (6.66, 7.50, 8.57, 10.00 and 12.00 Hz) were used for the visual stimulation, and the EGI 300 Geodesic EEG System (GES 300), with a 256-channel HydroCel Geodesic Sensor Net (HCGSN) and a sampling rate of 250 Hz, was used for capturing the signals. Check https://www.youtube.com/watch?v=8lGBVvCX5d8&feature=youtu.be for a video demonstrating one trial, https://github.com/MAMEM/ssvep-eeg-processing-toolbox for the processing toolbox, and http://arxiv.org/abs/1602.00904 for the technical report.
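As an illustration of how such recordings are typically scored, here is a naive frequency detector that ranks the five stimulation frequencies by average spectral power; the `eeg` array is a synthetic placeholder, and this is a simplification of what the linked processing toolbox does:

```python
# Sketch: pick the SSVEP stimulation frequency with the highest FFT power.
import numpy as np

FS = 250.0                                     # sampling rate from the description
STIM_FREQS = [6.66, 7.50, 8.57, 10.00, 12.00]  # stimulation frequencies (Hz)

def detect_ssvep(eeg: np.ndarray) -> float:
    """eeg: (n_channels, n_samples) array; returns the best-scoring frequency."""
    spectrum = np.abs(np.fft.rfft(eeg, axis=1)).mean(axis=0)  # channel-averaged
    freqs = np.fft.rfftfreq(eeg.shape[1], d=1.0 / FS)
    scores = [spectrum[np.argmin(np.abs(freqs - f))] for f in STIM_FREQS]
    return STIM_FREQS[int(np.argmax(scores))]

print(detect_ssvep(np.random.randn(256, 1250)))  # 5 s of fake 256-channel data
```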
Data set from a large study to examine EEG correlates of genetic predisposition to alcoholism. It contains measurements from 64 electrodes placed on the scalp sampled at 256 Hz (3.9-msec epoch) for 1 second. There were two groups of subjects: alcoholic and control. Each subject was exposed to either a single stimulus (S1) or to two stimuli (S1 and S2), which were pictures of objects chosen from the 1980 Snodgrass and Vanderwart picture set. When two stimuli were shown, they were presented in either a matched condition, where S1 was identical to S2, or in a non-matched condition, where S1 differed from S2. There were 122 subjects, and each subject completed 120 trials in which different stimuli were shown. The electrode positions were located at standard sites (Standard Electrode Position Nomenclature, American Electroencephalographic Association 1990). Zhang et al. (1995) describe the data collection process in detail. There are three versions of the EEG data set.
* The Small Data Set (smni97_eeg_data.tar.gz) contains data for 2 subjects, alcoholic a_co2a0000364 and control c_co2c0000337. For each of the 3 matching paradigms, c_1 (one presentation only), c_m (match to previous presentation) and c_n (no-match to previous presentation), 10 runs are shown.
* The Large Data Set (SMNI_CMI_TRAIN.tar.gz and SMNI_CMI_TEST.tar.gz) contains data for 10 alcoholic and 10 control subjects, with 10 runs per subject per paradigm. The test data used the same 10 alcoholic and 10 control subjects as the training data, but with 10 out-of-sample runs per subject per paradigm.
* The Full Data Set contains all 120 trials for all 122 subjects. The entire set of data is about 700 MBytes.
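The per-trial geometry implied above (64 channels at 256 Hz for 1 second) is a (64, 256) array, and a subject's 120 trials stack into (120, 64, 256). A small sketch with random placeholder data:

```python
# Sketch: the trial array shape implied by the description, plus a simple
# per-channel event-related average. Data here are random placeholders.
import numpy as np

trials = np.random.randn(120, 64, 256)  # (n_trials, n_channels, n_samples)
erp = trials.mean(axis=0)               # average across trials
print(erp.shape)                        # (64, 256)
```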
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset is a BIDS-compatible version of the CHB-MIT Scalp EEG Database: it reorganizes the file structure to comply with the BIDS specification.
The dataset is released under the Open Data Commons Attribution License v1.0.
The original Physionet CHB-MIT Scalp EEG Database was published by Ali Shoeb. This BIDS-compatible version of the dataset was published by Jonathan Dan.
The original Physionet CHB-MIT Scalp EEG Database is available on the Physionet website.
CHB-MIT Scalp EEG Database
2010
This database, collected at the Children's Hospital Boston, consists of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention.
Each folder (sub-01, sub-02, etc.) contains between 9 and 42 continuous .edf files from a single subject. Hardware limitations resulted in gaps between consecutively numbered .edf files, during which the signals were not recorded; in most cases, the gaps are 10 seconds or less, but occasionally there are much longer gaps. In order to protect the privacy of the subjects, all protected health information (PHI) in the original .edf files has been replaced with surrogate information in the files provided here. Dates in the original .edf files have been replaced by surrogate dates, but the time relationships between the individual files belonging to each case have been preserved. In most cases, the .edf files contain exactly one hour of digitized EEG signals, although those belonging to case sub-10 are two hours long, and those belonging to cases sub-04, sub-06, sub-07, sub-09, and sub-23 are four hours long; occasionally, files in which seizures are recorded are shorter.
The EEG is recorded at 256 Hz with a 16-bit resolution. The recordings are referenced in a double banana bipolar montage with 18 channels from the 10-20 electrode system.
The dataset also contains seizure annotations as start and stop times.
The dataset contains 664 `.edf` recordings, 129 of which contain one or more seizures. In all, these records include 198 seizures.
23 pediatric subjects with intractable seizures. (5 males, ages 3–22; and 17 females, ages 1.5–19; 1 n/a)
Recordings were performed at the Children's Hospital Boston using the International 10-20 system of EEG electrode positions. Signals were sampled at 256 samples per second with 16-bit resolution.
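A BIDS layout like this can be opened with `mne-bids`; the entity values below (root, task, run) are assumptions to be checked against the actual directory tree:

```python
# Sketch: read one recording from the BIDS-compatible CHB-MIT tree.
from mne_bids import BIDSPath, read_raw_bids

bids_path = BIDSPath(root="chb-mit-bids",                       # hypothetical root
                     subject="01", task="szMonitoring", run="00",  # assumed entities
                     datatype="eeg")
raw = read_raw_bids(bids_path)
print(raw.info["sfreq"])  # 256.0 Hz per the description above
```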
This zip file contains the raw EEG files for all 27 subjects and three experimental sessions. Pre-processing of the raw EEG data is fully automatic and can be reproduced with the analysis scripts.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
The dataset comprised 14 patients with paranoid schizophrenia and 14 healthy controls. Data were acquired at a sampling frequency of 250 Hz using the standard 10-20 EEG montage with 19 EEG channels: Fp1, Fp2, F7, F3, Fz, F4, F8, T3, C3, Cz, C4, T4, T5, P3, Pz, P4, T6, O1, O2. The reference electrode was placed between electrodes Fz and Cz.
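The channel list and sampling rate above translate directly into an MNE-Python measurement-info object; a minimal sketch (T3/T4/T5/T6 are older aliases of T7/T8/P7/P8, which `match_alias` resolves):

```python
# Sketch: build an Info object for the 19-channel 10-20 montage at 250 Hz.
import mne

CHANNELS = ["Fp1", "Fp2", "F7", "F3", "Fz", "F4", "F8", "T3", "C3", "Cz",
            "C4", "T4", "T5", "P3", "Pz", "P4", "T6", "O1", "O2"]
info = mne.create_info(CHANNELS, sfreq=250.0, ch_types="eeg")
info.set_montage("standard_1020", match_alias=True)  # map old T3/T4/T5/T6 names
print(info)
```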
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
epilepsy
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This is the raw EEG data for the study. Data is in BioSemi Data Format (BDF). Files with only "II" in the file name were recorded during the reported 1-Exemplar categorization task; "RB-II" files were recorded during the reported 2-Exemplar categorization task. "Resting" files were recorded during wakeful resting state.
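Given those naming conventions, the files can be grouped by task with a few lines of path matching; the directory name is hypothetical, and "RB-II" must be tested before "II" because the latter is a substring of the former:

```python
# Sketch: bucket the raw .bdf files by task using the file-name conventions.
from pathlib import Path

files = sorted(Path("raw_eeg").glob("*.bdf"))   # hypothetical directory
tasks = {"RB-II": [], "II": [], "Resting": []}  # check "RB-II" before "II"
for f in files:
    for key in tasks:
        if key in f.name:
            tasks[key].append(f)
            break
print({k: len(v) for k, v in tasks.items()})
```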
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This data set consists of electroencephalography (EEG) data from 50 participants (Subject1 – Subject50) with acute ischemic stroke, aged between 30 and 77 years. The participants included 39 males and 11 females. The time after stroke ranged from 1 day to 30 days. 22 participants had right-hemisphere hemiplegia and 28 participants had left-hemisphere hemiplegia. All participants were originally right-handed. Each participant sat in front of a computer screen with an arm resting on a pillow on their lap or on a table, and carried out the instructions given on the screen. At the start of each trial, a picture with a text description indicating the left or right hand was presented for 2 s. Participants were asked to focus on the instructed hand motor imagery; at the same time, a video of the ipsilateral hand movement was displayed on the screen for 4 s. This was followed by a 2 s break.
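The trial timeline (2 s cue, 4 s motor imagery, 2 s break) maps onto a simple epoching rule; the sampling rate and arrays below are assumptions for illustration only:

```python
# Sketch: cut the 4 s motor-imagery window that follows each 2 s cue.
import numpy as np

FS = 500            # assumed sampling rate; check the dataset documentation
CUE_S, MI_S = 2, 4  # cue and motor-imagery durations from the description

def extract_mi_epochs(eeg: np.ndarray, onsets: list[int]) -> np.ndarray:
    """eeg: (n_channels, n_samples); onsets: trial-start sample indices."""
    start = CUE_S * FS
    return np.stack([eeg[:, o + start : o + start + MI_S * FS] for o in onsets])

eeg = np.random.randn(30, FS * 60)                  # 60 s of fake 30-channel data
print(extract_mi_epochs(eeg, [0, FS * 10]).shape)   # (2, 30, 2000)
```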
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
PCA
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Title: Brain-Computer Music Interface for Monitoring and Inducing Affective States (BCMI-MIdAS)
Dates: 2012-2017
Funding organisation: Engineering and Physical Sciences Research Council (EPSRC)
Grant no.: EP/J003077/1 and EP/J002135/1.
Title: EEG data investigating neural correlates of music-induced emotion.
Description: This dataset accompanies the publication by Daly et al. (2018) and has been analysed in Daly et al. (2014; 2015a; 2015b) (please see Section 5 for full references). The purpose of the research activity in which the data were collected was to investigate the EEG neural correlates of music-induced emotion. For this purpose, 31 healthy adult participants listened to 40 music clips of 12 s duration each, targeting a range of emotional states. The music clips comprised excerpts from film scores spanning a range of styles and rated on induced emotion. The dataset contains unprocessed EEG data from all 31 participants (age range 18-66, 18 female) recorded while listening to the music clips, together with the reported induced emotional responses. The paradigm involved 6 runs of EEG recordings. The first and last runs were resting-state runs, during which participants were instructed to sit still and rest for 300 s. The other 4 runs each contained 10 music listening trials.
Publication Year: 2018
Creator: Nicoletta Nicolaou, Ian Daly.
Contributors: Isil Poyraz Bilgin, James Weaver, Asad Malik.
Principal Investigator: Slawomir Nasuto (EP/J003077/1).
Co-Investigator: Eduardo Miranda (EP/J002135/1).
Organisation: University of Reading
Rights-holders: University of Reading
Source: The musical stimuli were taken from Eerola & Vuoskoski, "A comparison of the discrete and dimensional models of emotion in music", Psychol. Music, 39:18-49, 2010 (doi: 10.1177/0305735610362821).
Copyright University of Reading, 2018. This dataset is licensed by the rights-holder(s) under a Creative Commons Attribution 4.0 International Licence: https://creativecommons.org/licenses/by/4.0/.
BIDS File listing: The dataset comprises data from 31 participants, named using the convention sub_s_number, where s_number is a random participant number from 1 to 31. For example, 'sub-08' contains data obtained from participant 8.
The data is in BIDS format and contains EEG and associated metadata. The sampling rate is 1 kHz and the EEG corresponding to a music clip is 20 s long (the duration of the clips).
Each data folder contains the following data (please note that the number of runs varies between participants):
* EEG data in .tsv format
* Event codes (JSON) and timings (.tsv)
* EEG channel information
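A minimal sketch for reading one participant's files, assuming BIDS-style names (the exact file names should be taken from the participant folders):

```python
# Sketch: load EEG samples, event timings (.tsv) and event codes (JSON).
import json
import pandas as pd

eeg = pd.read_csv("sub-08_task-music_eeg.tsv", sep="\t")        # hypothetical names
events = pd.read_csv("sub-08_task-music_events.tsv", sep="\t")
with open("sub-08_task-music_events.json") as fh:
    event_codes = json.load(fh)
print(eeg.shape, len(events), event_codes)
```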
This information is available in the following publications:
[1] Daly, I., Nicolaou, N., Williams, D., Hwang, F., Kirke, A., Miranda, E., Nasuto, S.J., "Neural and physiological data from participants listening to affective music", Scientific Data, 2018.
[2] Daly, I., Malik, A., Hwang, F., Roesch, E., Weaver, J., Kirke, A., Williams, D., Miranda, E. R., Nasuto, S. J., "Neural correlates of emotional responses to music: an EEG study", Neuroscience Letters, 573: 52-7, 2014; doi: 10.1016/j.neulet.2014.05.003.
[3] Daly, I., Hallowell, J., Hwang, F., Kirke, A., Malik, A., Roesch, E., Weaver, J., Williams, D., Miranda, E., Nasuto, S.J., "Changes in music tempo entrain movement related brain activity", Proc. IEEE EMBC 2014, pp. 4595-8; doi: 10.1109/EMBC.2014.6944647.
[4] Daly, I., Williams, D., Hallowell, J., Hwang, F., Kirke, A., Malik, A., Weaver, J., Miranda, E., Nasuto, S.J., "Music-induced emotions can be predicted from a combination of brain activity and acoustic features", Brain and Cognition, 101:1-11, 2015b; doi: 10.1016/j.bandc.2015.08.003.
Please cite these references if you use this dataset in your study.
Thank you for your interest in our work.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Tecnologico de Monterrey School of Engineering and Sciences NeuroTechs Research Group
Context: This dataset includes electroencephalographic (EEG) recordings from 34 healthy, young adults in Mexico, collected to study the somatosensory system's responses to a range of tactile stimuli. The study employs innovative NeuroSense tactile stimulators to explore how the brain processes touch sensations when subjected to stimuli such as air, vibration, and caress at four distinct intensity levels.
Objective: The objective of this database is to understand the cortical processing of tactile stimuli, including air, vibration and caress, using EEG.
Main Outcome Measure: The main outcome measure is the EEG recordings, which include the evoked responses of the somatosensory system to each type and intensity of stimulus. These measurements allow for an in-depth analysis of the cortical dynamics involved in processing touch.
Limitations: One limitation of the database is its focus on a relatively small and specific population, which could affect the generalizability of the findings. Additionally, the data is dependent on the accuracy and consistency of the stimulus delivery and EEG recording during the experimental sessions.
Generalizability: While the findings provide significant insights into the neural processing of tactile stimuli within the central nervous system, their generalizability might be limited due to the specialized nature of the stimuli and the controlled experimental conditions. However, the dataset serves as a valuable resource for developing diagnostic and therapeutic strategies for somatosensory impairments and advancing research in neuroscience and somatosensory rehabilitation.
No description was included in this dataset collected from the OSF.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Context: We share a large database containing electroencephalographic signals from 87 human participants, with more than 20,800 trials in total representing about 70 hours of recording. It was collected during brain-computer interface (BCI) experiments and organized into 3 datasets (A, B, and C) that were all recorded following the same protocol: right- and left-hand motor imagery (MI) tasks during a single one-day session. It includes the performance of the associated BCI users, detailed information about the users' demographic, personality and cognitive profiles, and the experimental instructions and codes (executed in the open-source platform OpenViBE). Such a database could prove useful for various studies, including but not limited to: 1) studying the relationships between BCI users' profiles and their BCI performance, 2) studying how EEG signal properties vary across users' profiles and MI tasks, 3) using the large number of participants to design cross-user BCI machine learning algorithms, or 4) incorporating users' profile information into the design of EEG signal classification algorithms.
Sixty participants (Dataset A) performed the first experiment, designed to investigate the impact of experimenters' and users' gender on MI-BCI user training outcomes, i.e., users' performance and experience (Pillette et al.). Twenty-one participants (Dataset B) performed the second one, designed to examine the relationship between users' online performance (i.e., classification accuracy) and the characteristics of the chosen user-specific Most Discriminant Frequency Band (MDFB) (Benaroch et al.). The only difference between the two experiments lies in the algorithm used to select the MDFB. Dataset C contains 6 additional participants who completed one of the two experiments described above. Physiological signals were measured using a g.USBAmp (g.tec, Austria), sampled at 512 Hz, and processed online using OpenViBE 2.1.0 (Dataset A) and OpenViBE 2.2.0 (Dataset B). For Dataset C, participants C83 and C85 were collected with OpenViBE 2.1.0 and the remaining 4 participants with OpenViBE 2.2.0. Experiments were recorded at Inria Bordeaux Sud-Ouest, France.
Duration: Each participant's folder comprises approximately 48 minutes of EEG recording: six 7-minute runs and a 6-minute baseline.
Documents:
* Instructions: checklist read by experimenters during the experiments.
* Questionnaires: the Mental Rotation test used, and the translation of 4 questionnaires, notably the Demographic and Social information, the Pre- and Post-session questionnaires, and the Index of Learning Styles (English and French versions).
* Performance: the online OpenViBE BCI classification performances obtained by each participant are provided for each run, as well as answers to all questionnaires.
* Scenarios/scripts: set of OpenViBE scenarios used to perform each of the steps of the MI-BCI protocol, e.g., acquire training data, calibrate the classifier or run the online MI-BCI.
Database: raw signals
* Dataset A: N=60 participants
* Dataset B: N=21 participants
* Dataset C: N=6 participants
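A quick consistency check of the numbers quoted above: the three datasets sum to the 87 participants mentioned in the context, and six 7-minute runs plus a 6-minute baseline give the stated ~48 minutes per participant:

```python
# Sketch: sanity-check the participant counts and per-participant duration.
participants = {"A": 60, "B": 21, "C": 6}
assert sum(participants.values()) == 87   # matches the stated total
assert 6 * 7 + 6 == 48                    # six 7-min runs + 6-min baseline
print(sum(participants.values()), "participants,", 6 * 7 + 6, "minutes each")
```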