100+ datasets found
  1. EEG Signal Dataset

    • ieee-dataport.org
    Updated Jun 11, 2020
    Cite
    Rahul Kher (2020). EEG Signal Dataset [Dataset]. https://ieee-dataport.org/documents/eeg-signal-dataset
    Dataset updated
    Jun 11, 2020
    Authors
    Rahul Kher
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    PCA

  2. EEG Dataset for ADHD

    • kaggle.com
    Updated Jan 20, 2025
    Cite
    Danizo (2025). EEG Dataset for ADHD [Dataset]. https://www.kaggle.com/datasets/danizo/eeg-dataset-for-adhd
    Available formats: Croissant (a machine-learning dataset format; see mlcommons.org/croissant)
    Dataset updated
    Jan 20, 2025
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    Danizo
    Description

    This is the dataset collected by Shahed University and released on IEEE DataPort.

    The columns are: Fz, Cz, Pz, C3, T3, C4, T4, Fp1, Fp2, F3, F4, F7, F8, P3, P4, T5, T6, O1, O2, Class, ID

    The first 19 columns are EEG channel names.

    Class: ADHD/Control

    ID: Patient ID
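    Given the column layout above, loading and splitting the file is straightforward. A minimal pandas sketch (the CSV filename is hypothetical and a small synthetic frame stands in for the actual download):

```python
import numpy as np
import pandas as pd

# Channel order as listed on the dataset page; "Class" and "ID" follow.
CHANNELS = ["Fz", "Cz", "Pz", "C3", "T3", "C4", "T4", "Fp1", "Fp2",
            "F3", "F4", "F7", "F8", "P3", "P4", "T5", "T6", "O1", "O2"]

# Synthetic stand-in for the real file; with the download in hand this
# would be something like: df = pd.read_csv("adhd_eeg.csv")  # hypothetical name
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(256, 19)), columns=CHANNELS)
df["Class"] = rng.choice(["ADHD", "Control"], size=256)
df["ID"] = rng.integers(1, 122, size=256)

X = df[CHANNELS].to_numpy()    # (n_samples, 19) channel amplitudes
y = df["Class"].to_numpy()     # ADHD / Control labels
groups = df["ID"].to_numpy()   # patient IDs, e.g. for subject-wise CV splits
```

    Keeping the ID column around as a grouping variable matters here: samples from the same child should not be split across train and test sets.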

    Participants were 61 children with ADHD and 60 healthy controls (boys and girls, ages 7-12). The children with ADHD were diagnosed by an experienced psychiatrist according to DSM-IV criteria and had taken Ritalin for up to 6 months. None of the children in the control group had a history of psychiatric disorders, epilepsy, or any report of high-risk behaviors.

    EEG recording was performed based on 10-20 standard by 19 channels (Fz, Cz, Pz, C3, T3, C4, T4, Fp1, Fp2, F3, F4, F7, F8, P3, P4, T5, T6, O1, O2) at 128 Hz sampling frequency. The A1 and A2 electrodes were the references located on earlobes.

    Since one of the deficits in ADHD children is visual attention, the EEG recording protocol was based on visual attention tasks. In the task, a set of pictures of cartoon characters was shown to the children and they were asked to count the characters. The number of characters in each image was randomly selected between 5 and 16, and the size of the pictures was large enough to be easily visible and countable by children. To have a continuous stimulus during the signal recording, each image was displayed immediately and uninterrupted after the child’s response. Thus, the duration of EEG recording throughout this cognitive visual task was dependent on the child’s performance (i.e. response speed).

    Citation Author(s): Ali Motie Nasrabadi; Armin Allahverdy; Mehdi Samavati; Mohammad Reza Mohammadi

    DOI: 10.21227/rzfh-zn36

    License: Creative Commons Attribution

  3. EEG Datasets for Naturalistic Listening to "Alice in Wonderland" (Version 1)...

    • deepblue.lib.umich.edu
    Updated Nov 20, 2018
    Cite
    Brennan, Jonathan R. (2018). EEG Datasets for Naturalistic Listening to "Alice in Wonderland" (Version 1) [Dataset]. http://doi.org/10.7302/Z29C6VNH
    Dataset updated
    Nov 20, 2018
    Dataset provided by
    Deep Blue Data
    Authors
    Brennan, Jonathan R.
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    These files contain the raw data and processing parameters to go with the paper "Hierarchical structure guides rapid linguistic predictions during naturalistic listening" by Jonathan R. Brennan and John T. Hale. They include the stimuli (wav files), raw data (MATLAB format for the FieldTrip toolbox), data processing parameters (MATLAB), and the variables used to align the stimuli with the EEG data and to run the statistical analyses reported in the paper.

  4. EEG datasets of stroke patients

    • figshare.com
    json
    Updated Sep 14, 2023
    Cite
    Haijie Liu; Xiaodong Lv (2023). EEG datasets of stroke patients [Dataset]. http://doi.org/10.6084/m9.figshare.21679035.v5
    Available download formats: json
    Dataset updated
    Sep 14, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Haijie Liu; Xiaodong Lv
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    This data set consists of electroencephalography (EEG) data from 50 participants (Subject1 – Subject50) with acute ischemic stroke, aged between 30 and 77 years. The participants included 39 males and 11 females. The time after stroke ranged from 1 day to 30 days. 22 participants had right-hemisphere hemiplegia and 28 had left-hemisphere hemiplegia. All participants were originally right-handed. Each participant sat in front of a computer screen with an arm resting on a pillow on their lap or on a table, and carried out the instructions given on the screen. At the start of each trial, a picture with a text description indicating the left or right hand was presented for 2 s. Participants were asked to focus on the instructed hand motor imagery while a video of the ipsilateral hand movement was displayed on the screen for 4 s, followed by a 2 s break.

  5. General-Disorders-EEG-Dataset-v1

    • huggingface.co
    Updated Aug 21, 2025
    Cite
    Neurazum (2025). General-Disorders-EEG-Dataset-v1 [Dataset]. http://doi.org/10.57967/hf/3321
    Available formats: Croissant
    Dataset updated
    Aug 21, 2025
    Dataset authored and provided by
    Neurazum
    License

    Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0

    Description

    Dataset

    Synthetic EEG data generated by the ‘bai’ model based on real data.

      Features/Columns:

    • No: "Number"
    • Sex: "Gender"
    • Age: "Age of participants"
    • EEG Date: "The date of the EEG"
    • Education: "Education level"
    • IQ: "IQ level of participants"
    • Main Disorder: "General class definition of the disorder"
    • Specific Disorder: "Specific class definition of the disorder"

    Total Features/Columns: 1140

      Content:
    

    Obsessive Compulsive Disorder, Bipolar Disorder, Schizophrenia… See the full description on the dataset page: https://huggingface.co/datasets/Neurazum/General-Disorders-EEG-Dataset-v1.

  6. EEG of Alzheimer's and Frontotemporal dementia

    • kaggle.com
    zip
    Updated Jan 28, 2024
    Cite
    yosf tag (2024). EEG of Alzheimer's and Frontotemporal dementia [Dataset]. https://www.kaggle.com/datasets/yosftag/open-nuro-dataset
    Available download formats: zip (4479288286 bytes)
    Dataset updated
    Jan 28, 2024
    Authors
    yosf tag
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/

    Description

    This dataset contains the EEG resting state-closed eyes recordings from 88 subjects in total. Participants: 36 of them were diagnosed with Alzheimer's disease (AD group), 23 were diagnosed with Frontotemporal Dementia (FTD group) and 29 were healthy subjects (CN group). Cognitive and neuropsychological state was evaluated by the international Mini-Mental State Examination (MMSE). MMSE score ranges from 0 to 30, with lower MMSE indicating more severe cognitive decline. The duration of the disease was measured in months and the median value was 25 with IQR range (Q1-Q3) being 24 - 28.5 months. Concerning the AD groups, no dementia-related comorbidities have been reported. The average MMSE for the AD group was 17.75 (sd=4.5), for the FTD group was 22.17 (sd=8.22) and for the CN group was 30. The mean age of the AD group was 66.4 (sd=7.9), for the FTD group was 63.6 (sd=8.2), and for the CN group was 67.9 (sd=5.4).

    Recordings: Recordings were acquired from the 2nd Department of Neurology of AHEPA General Hospital of Thessaloniki by an experienced team of neurologists. A Nihon Kohden EEG 2100 clinical device was used, with 19 scalp electrodes (Fp1, Fp2, F7, F3, Fz, F4, F8, T3, C3, Cz, C4, T4, T5, P3, Pz, P4, T6, O1, and O2) placed according to the 10-20 international system, and 2 reference electrodes (A1 and A2) placed on the mastoids for impedance checking, according to the manual of the device. Each recording was performed according to the clinical protocol, with participants in a sitting position with their eyes closed. Before the initialization of each recording, the skin impedance was ensured to be below 5 kΩ. The sampling rate was 500 Hz with 10 uV/mm resolution. The recording montages were anterior-posterior bipolar and a referential montage using Cz as the common reference; the referential montage is included in this dataset. The recordings were made with the following amplifier parameters: sensitivity 10 uV/mm, time constant 0.3 s, and high-frequency filter at 70 Hz. Each recording lasted approximately 13.5 minutes for the AD group (min=5.1, max=21.3), 12 minutes for the FTD group (min=7.9, max=16.9) and 13.8 minutes for the CN group (min=12.5, max=16.5). In total, 485.5 minutes of AD, 276.5 minutes of FTD and 402 minutes of CN recordings were collected and are included in the dataset.

    Preprocessing: The EEG recordings were exported in .eeg format and are transformed to BIDS accepted .set format for the inclusion in the dataset. Automatic annotations of the Nihon Kohden EEG device marking artifacts (muscle activity, blinking, swallowing) have not been included for language compatibility purposes (If this is an issue, please use the preprocessed dataset in Folder: derivatives). The unprocessed EEG recordings are included in folders named: sub-0XX. Folders named sub-0XX in the subfolder derivatives contain the preprocessed and denoised EEG recordings. The preprocessing pipeline of the EEG signals is as follows. First, a Butterworth band-pass filter 0.5-45 Hz was applied and the signals were re-referenced to A1-A2. Then, the Artifact Subspace Reconstruction routine (ASR) which is an EEG artifact correction method included in the EEGLab Matlab software was applied to the signals, removing bad data periods which exceeded the max acceptable 0.5 second window standard deviation of 17, which is considered a conservative window. Next, the Independent Component Analysis (ICA) method (RunICA algorithm) was performed, transforming the 19 EEG signals to 19 ICA components. ICA components that were classified as “eye artifacts” or “jaw artifacts” by the automatic classification routine “ICLabel” in the EEGLAB platform were automatically rejected. It should be noted that, even though the recording was performed in a resting state, eyes-closed condition, eye artifacts of eye movement were still found at some EEG recordings.
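    The ASR and ICLabel stages above are EEGLAB/MATLAB routines, but the first step of the pipeline is easy to mirror elsewhere. A hedged Python sketch of the 0.5-45 Hz Butterworth band-pass on synthetic 19-channel data (the filter order is an illustrative choice, and a common-average re-reference stands in for the dataset's A1-A2 re-reference, since a synthetic array has no mastoid channels):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 500  # Hz, sampling rate reported for this dataset

def bandpass(eeg, low=0.5, high=45.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass (order is an illustrative choice)."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

rng = np.random.default_rng(1)
raw = rng.normal(size=(19, FS * 10))      # 10 s of fake 19-channel EEG
filtered = bandpass(raw)
# Common-average re-reference as a stand-in for the A1-A2 re-reference.
rereferenced = filtered - filtered.mean(axis=0, keepdims=True)
```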

    A complete analysis of this dataset can be found in the published Data Descriptor paper "A Dataset of Scalp EEG Recordings of Alzheimer’s Disease, Frontotemporal Dementia and Healthy Subjects from Routine EEG", https://doi.org/10.3390/data8060095. ***** I am not the original creator of this dataset; it was published on https://openneuro.org/datasets/ds004504/versions/1.0.6 and I just ported it here for ease of use. *****

  7. Ultra high-density EEG recording of interictal migraine and controls:...

    • kilthub.cmu.edu
    txt
    Updated Jul 21, 2020
    Cite
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann (2020). Ultra high-density EEG recording of interictal migraine and controls: sensory and rest [Dataset]. http://doi.org/10.1184/R1/12636731
    Available download formats: txt
    Dataset updated
    Jul 21, 2020
    Dataset provided by
    Carnegie Mellon University
    Authors
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    We used a high-density electroencephalography (HD-EEG) system, with 128 customized electrode locations, to record from 17 individuals with migraine (12 female) in the interictal period, and 18 age- and gender-matched healthy control subjects, during visual (vertical grating pattern) and auditory (modulated tone) stimulation which varied in temporal frequency (4 and 6Hz), and during rest. This dataset includes the EEG raw data related to the paper entitled Chamanzar, Haigh, Grover, and Behrmann (2020), Abnormalities in cortical pattern of coherence in migraine detected using ultra high-density EEG. The link to our paper will be made available as soon as it is published online.

  8. Preprocessed CHB-MIT Scalp EEG Database

    • ieee-dataport.org
    Updated Dec 24, 2024
    Cite
    Mrs Deepa .B (2024). Preprocessed CHB-MIT Scalp EEG Database [Dataset]. https://ieee-dataport.org/open-access/preprocessed-chb-mit-scalp-eeg-database
    Dataset updated
    Dec 24, 2024
    Authors
    Mrs Deepa .B
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    The ‘Univ. of Bonn’ and ‘CHB-MIT Scalp EEG Database’ datasets are publicly available and among the most sought after by researchers. The Bonn dataset is very small compared to CHB-MIT, yet researchers often prefer Bonn because it comes in a simple '.txt' format. The dataset published here is a preprocessed form of CHB-MIT, provided in '.csv' format.

  9. EEG Alzheimer's Dataset

    • kaggle.com
    Updated Sep 9, 2025
    Cite
    UCI Machine Learning (2025). EEG Alzheimer's Dataset [Dataset]. https://www.kaggle.com/datasets/ucimachinelearning/eeg-alzheimers-dataset
    Available formats: Croissant
    Dataset updated
    Sep 9, 2025
    Dataset provided by
    Kaggle
    Authors
    UCI Machine Learning
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/

    Description

    This dataset contains 848,640 records with 17 columns, representing EEG (electroencephalogram) signals recorded from multiple electrode positions on the scalp, along with a status label. The dataset relates to the study of Alzheimer’s Disease (AD).

    Features (16 continuous variables, float64): Each feature corresponds to the electrical activity recorded from standard EEG electrode placements based on the international 10-20 system:

    Fp1, Fp2, F7, F3, Fz, F4, F8

    T3, C3, Cz, C4, T4

    T5, P3, Pz, P4

    These channels measure brain activity in different cortical regions (frontal, temporal, central, and parietal lobes).

    Target variable (1 categorical variable, int64):

    status: Represents the condition or classification of the subject at the time of recording (e.g., patient vs. control, or stage of Alzheimer’s disease).

    Size & Integrity:

    Rows: 848,640 samples

    Columns: 17 (16 EEG features + 1 status label)

    Data types: 16 float features, 1 integer label

    Missing values: None (clean dataset)

    This dataset is suitable for machine learning and deep learning applications such as:

    EEG signal classification (AD vs. healthy subjects)

    Brain activity pattern recognition

    Feature extraction and dimensionality reduction (e.g., PCA, wavelet transforms)

    Time-series analysis of EEG recordings
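    As a sketch of the dimensionality-reduction use case listed above, PCA with scikit-learn on standardized channel columns (synthetic data matching the dataset's 16-feature shape, not the actual recordings):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in matching the layout: 16 EEG feature columns.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 16))

X_scaled = StandardScaler().fit_transform(X)   # zero mean, unit variance
pca = PCA(n_components=0.95)                   # keep 95% of the variance
X_reduced = pca.fit_transform(X_scaled)
```

    On real EEG, neighbouring channels are strongly correlated, so far fewer than 16 components would typically survive the 95% variance cutoff.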

  10. Data from: A Resting-state EEG Dataset for Sleep Deprivation

    • openneuro.org
    Updated Apr 27, 2025
    Cite
    Chuqin Xiang; Xinrui Fan; Duo Bai; Ke Lv; Xu Lei (2025). A Resting-state EEG Dataset for Sleep Deprivation [Dataset]. http://doi.org/10.18112/openneuro.ds004902.v1.0.8
    Dataset updated
    Apr 27, 2025
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Chuqin Xiang; Xinrui Fan; Duo Bai; Ke Lv; Xu Lei
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/

    Description

    General information

    The dataset provides resting-state EEG data (eyes open, partially eyes closed) from 71 participants who underwent two experiments involving normal sleep (NS, session 1) and sleep deprivation (SD, session 2). The dataset also provides information on participants' sleepiness and mood states. (Please note that Session 1 (NS) and Session 2 (SD) do not reflect the temporal order; the order was counterbalanced across participants and is listed in the metadata.)

    Dataset

    Presentation

    The data collection was initiated in March 2019 and terminated in December 2020. A detailed description of the dataset is currently being prepared by Chuqin Xiang, Xinrui Fan, Duo Bai, Ke Lv and Xu Lei, and will be submitted to Scientific Data for publication.

    EEG acquisition

    • EEG system (Brain Products GmbH, Steingrabenstr, Germany; 61 electrodes)
    • Sampling frequency: 500 Hz
    • Impedances were kept below 5 kΩ

    Contact

    If you have any questions or comments, please contact:
    • Xu Lei: xlei@swu.edu.cn

    Article

    Xiang, C., Fan, X., Bai, D. et al. A resting-state EEG dataset for sleep deprivation. Sci Data 11, 427 (2024). https://doi.org/10.1038/s41597-024-03268-2

  11. RAW EEG STRESS DATASET

    • kaggle.com
    zip
    Updated Dec 11, 2023
    Cite
    Ayush Tibrewal (2023). RAW EEG STRESS DATASET [Dataset]. https://www.kaggle.com/datasets/ayushtibrewal/raw-eeg-stress-dataset-sam40
    Available download formats: zip (366418728 bytes)
    Dataset updated
    Dec 11, 2023
    Authors
    Ayush Tibrewal
    Description

    SAM 40: Dataset of 40 subject EEG recordings to monitor the induced-stress while performing Stroop color-word test, arithmetic task, and mirror image recognition task

    This dataset presents a collection of electroencephalogram (EEG) data recorded from 40 subjects (female: 14, male: 26, mean age: 21.5 years). The data were recorded while the subjects performed various tasks: the Stroop color-word test, solving arithmetic questions, identification of symmetric mirror images, and a state of relaxation. The experiment was primarily conducted to monitor the short-term stress elicited in an individual while performing these cognitive tasks. Each task was carried out for 25 s and repeated over three trials. The EEG was recorded using a 32-channel Emotiv Epoc Flex gel kit. The EEG data were then segmented into non-overlapping epochs of 25 s according to the tasks performed by the subjects. The EEG data were further processed to remove baseline drifts by subtracting the average trend obtained using a Savitzky-Golay filter. Furthermore, artifacts were removed from the EEG data by applying wavelet thresholding. The dataset can aid and support research activities in the field of brain-computer interfaces and can also be used to identify patterns in EEG data elicited by stress.
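    The Savitzky-Golay detrending step described above can be sketched in a few lines of SciPy. The window length, polynomial order, and 128 Hz sampling rate below are illustrative assumptions, not values taken from the dataset page:

```python
import numpy as np
from scipy.signal import savgol_filter

def remove_baseline(sig, fs=128, window_s=2.0, polyorder=3):
    """Subtract the slow trend estimated by a Savitzky-Golay filter.
    fs, window_s and polyorder are illustrative assumptions."""
    window = int(window_s * fs) | 1   # savgol window length must be odd
    trend = savgol_filter(sig, window, polyorder)
    return sig - trend

rng = np.random.default_rng(3)
t = np.arange(25 * 128) / 128                 # one 25 s epoch
drift = 50 * np.sin(2 * np.pi * 0.05 * t)     # slow baseline wander
eeg = drift + rng.normal(size=t.size)         # wander plus "signal"
cleaned = remove_baseline(eeg)                # drift largely removed
```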

  12. EEG and audio dataset for auditory attention decoding

    • zenodo.org
    bin, zip
    Updated Jan 31, 2020
    Cite
    Søren A. Fuglsang; Søren A. Fuglsang; Daniel D.E. Wong; Daniel D.E. Wong; Jens Hjortkjær; Jens Hjortkjær (2020). EEG and audio dataset for auditory attention decoding [Dataset]. http://doi.org/10.5281/zenodo.1199011
    Available download formats: zip, bin
    Dataset updated
    Jan 31, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Søren A. Fuglsang; Søren A. Fuglsang; Daniel D.E. Wong; Daniel D.E. Wong; Jens Hjortkjær; Jens Hjortkjær
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/

    Description

    This dataset contains EEG recordings from 18 subjects listening to one of two competing speech audio streams. Continuous speech in trials of ~50 sec. was presented to normal hearing listeners in simulated rooms with different degrees of reverberation. Subjects were asked to attend one of two spatially separated speakers (one male, one female) and ignore the other. Repeated trials with presentation of a single talker were also recorded. The data were recorded in a double-walled soundproof booth at the Technical University of Denmark (DTU) using a 64-channel Biosemi system and digitized at a sampling rate of 512 Hz. Full details can be found in:

    • Søren A. Fuglsang, Torsten Dau & Jens Hjortkjær (2017): Noise-robust cortical tracking of attended speech in real-life environments. NeuroImage, 156, 435-444

    and

    • Daniel D.E. Wong, Søren A. Fuglsang, Jens Hjortkjær, Enea Ceolini, Malcolm Slaney & Alain de Cheveigné: A Comparison of Temporal Response Function Estimation Methods for Auditory Attention Decoding. Frontiers in Neuroscience, https://doi.org/10.3389/fnins.2018.00531

    The data is organized in the format of the publicly available COCOHA Matlab Toolbox. The preproc_script.m demonstrates how to import and align the EEG and audio data. The script also demonstrates some EEG preprocessing steps as used in the Wong et al. paper above. AUDIO.zip contains wav files with the speech audio used in the experiment. EEG.zip contains MAT files with the EEG/EOG data for each subject. The EEG/EOG data are found in data.eeg with the following channels:

    • channels 1-64: scalp EEG electrodes
    • channel 65: right mastoid electrode
    • channel 66: left mastoid electrode
    • channel 67: vertical EOG below right eye
    • channel 68: horizontal EOG right eye
    • channel 69: vertical EOG above right eye
    • channel 70: vertical EOG below left eye
    • channel 71: horizontal EOG left eye
    • channel 72: vertical EOG above left eye
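    Under the channel map above, splitting a loaded data.eeg matrix into scalp EEG, mastoid, and EOG rows is simple indexing. A NumPy sketch on a synthetic array (the linked-mastoid re-reference is a common choice for this montage, not necessarily the one used in the papers):

```python
import numpy as np

# 0-based slices for the 1-based channel map documented above.
SCALP = slice(0, 64)      # channels 1-64: scalp EEG electrodes
MASTOIDS = [64, 65]       # channels 65-66: right/left mastoid
EOG = slice(66, 72)       # channels 67-72: EOG electrodes

rng = np.random.default_rng(4)
data = rng.normal(size=(72, 512 * 50))  # one ~50 s trial at 512 Hz (synthetic)

eeg = data[SCALP]
eeg_ref = eeg - data[MASTOIDS].mean(axis=0)   # linked-mastoid re-reference
eog = data[EOG]                               # handy for artifact regression
```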

    The expinfo table contains information about the experimental conditions, including which speaker the listener was attending to in different trials. It contains the following fields:

    • attend_mf: attended speaker (1=male, 2=female)
    • attend_lr: spatial position of the attended speaker (1=left, 2=right)
    • acoustic_condition: type of acoustic room (1= anechoic, 2= mild reverberation, 3= high reverberation, see Fuglsang et al. for details)
    • n_speakers: number of speakers presented (1 or 2)
    • wavfile_male: name of presented audio wav-file for the male speaker
    • wavfile_female: name of presented audio wav-file for the female speaker (if any)
    • trigger: trigger event value for each trial also found in data.event.eeg.value

    DATA_preproc.zip contains the preprocessed EEG and audio data as output from preproc_script.m.

    The dataset was created within the COCOHA Project: Cognitive Control of a Hearing Aid

  13. EEG Data for "Electrophysiological signatures of brain aging in autism...

    • orda.shef.ac.uk
    bin
    Updated May 30, 2023
    Cite
    Elizabeth Milne (2023). EEG Data for "Electrophysiological signatures of brain aging in autism spectrum disorder" [Dataset]. http://doi.org/10.15131/shef.data.16840351.v1
    Available download formats: bin
    Dataset updated
    May 30, 2023
    Dataset provided by
    The University of Sheffield
    Authors
    Elizabeth Milne
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    This data is linked to the publication "Electrophysiological signatures of brain aging in autism spectrum disorder" by Dickinson, Jeste and Milne, in which it is referenced as Dataset 1. EEG data were acquired via a BioSemi ActiveTwo EEG system. The original recordings have been converted to .set and .fdt files via EEGLAB as uploaded here. There is a .fdt and a .set file for each recording; the .fdt file contains the data, and the .set file contains information about the parameters of the recording (see https://eeglab.org/tutorials/ for further information). The files can be opened within EEGLAB software. The data were acquired from 28 individuals with a diagnosis of an autism spectrum condition and 28 neurotypical controls aged between 18 and 68 years. The paradigm that generated the data was a 2.5 minute (150 second) period of eyes-closed resting. Ethical approval for data collection and data sharing was given by the Health Research Authority [IRAS ID = 212171]. Only data from participants who provided signed consent for data sharing were included in this work and uploaded here.

  14. MAMEM EEG SSVEP Dataset I (256 channels, 11 subjects, 5 frequencies...

    • figshare.com
    • zenodo.org
    application/x-rar
    Updated May 30, 2023
    Cite
    Spiros Nikolopoulos (2023). MAMEM EEG SSVEP Dataset I (256 channels, 11 subjects, 5 frequencies presented in isolation) [Dataset]. http://doi.org/10.6084/m9.figshare.2068677.v6
    Available download formats: application/x-rar
    Dataset updated
    May 30, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Spiros Nikolopoulos
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    EEG signals with 256 channels captured from 11 subjects executing an SSVEP-based experimental protocol. Five different frequencies (6.66, 7.50, 8.57, 10.00 and 12.00 Hz) were used for the visual stimulation, and the EGI 300 Geodesic EEG System (GES 300), with a 256-channel HydroCel Geodesic Sensor Net (HCGSN) and a sampling rate of 250 Hz, was used for capturing the signals. See https://www.youtube.com/watch?v=8lGBVvCX5d8&feature=youtu.be for a video demonstrating one trial. See https://github.com/MAMEM/ssvep-eeg-processing-toolbox for the processing toolbox. See http://arxiv.org/abs/1602.00904 for the technical report.
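    A classic first pass at SSVEP data like this is to score each stimulation frequency by its power in a Welch periodogram. A hedged single-channel sketch on synthetic data (the MAMEM toolbox implements its own, more elaborate methods):

```python
import numpy as np
from scipy.signal import welch

FS = 250                                       # Hz, as stated above
STIM_FREQS = [6.66, 7.50, 8.57, 10.00, 12.00]  # candidate stimulation rates

# Synthetic 5 s trial with a 10 Hz SSVEP response buried in noise.
rng = np.random.default_rng(5)
t = np.arange(5 * FS) / FS
sig = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.normal(size=t.size)

f, psd = welch(sig, fs=FS, nperseg=2 * FS)     # 0.5 Hz frequency resolution
scores = [psd[np.argmin(np.abs(f - sf))] for sf in STIM_FREQS]
detected = STIM_FREQS[int(np.argmax(scores))]  # frequency with the most power
```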

  15. EEG dataset of individuals with intellectual and developmental disorder and...

    • data.mendeley.com
    Updated Apr 11, 2020
    Cite
    Ekansh Sareen (2020). EEG dataset of individuals with intellectual and developmental disorder and healthy controls while observing rest and music stimuli [Dataset]. http://doi.org/10.17632/fshy54ypyh.2
    Dataset updated
    Apr 11, 2020
    Authors
    Ekansh Sareen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

    Description

    This data presents a collection of EEG recordings of seven participants with Intellectual and Developmental Disorder (IDD) and seven Typically Developing Controls (TDC). The data is recorded while the participants observe a resting state and a soothing music stimuli. The data was collected using a high-resolution multi-channel dry-electrode system from EMOTIV called EPOC+. This is a 14-channel device with two reference channels and a sampling frequency of 128 Hz. The data was collected in a noise-isolated room. The participants were informed of the experimental procedure, related risks and were asked to keep their eyes closed throughout the experiment. The data is provided in two formats, (1) Raw EEG data and (2) Pre-processed and clean EEG data for both the group of participants. This data can be used to explore the functional brain connectivity of the IDD group. In addition, behavioral information like IQ, SQ, music apprehension and facial expressions (emotion) for IDD participants is provided in file “QualitativeData.xlsx".

    Data Usage: The data is arranged as follows:
    1. Raw data:
    • Data/RawData/RawData_TDC/Music and Rest
    • Data/RawData/RawData_IDD/Music and Rest
    2. Clean data:
    • Data/CleanData/CleanData_TDC/Music and Rest
    • Data/CleanData/CleanData_IDD/Music and Rest

    The dataset comes with a fully automated EEG pre-processing pipeline, which can be used to batch-process raw EEG files into clean, pre-processed EEG files. Key features of this pipeline are: (1) band-pass filtering, (2) line-noise removal, (3) channel selection, (4) Independent Component Analysis (ICA), and (5) automatic artifact rejection. All the required files are present in the Pipeline folder.

    If you use this dataset and/or the fully automated pre-processing pipeline for your research work, kindly cite these two articles linked to this dataset.

    (1) Sareen, E., Singh, L., Varkey, B., Achary, K., Gupta, A. (2020). EEG dataset of individuals with intellectual and developmental disorder and healthy controls under rest and music stimuli. Data in Brief, 105488, ISSN 2352-3409, DOI:https://doi.org/10.1016/j.dib.2020.105488. (2) Sareen, E., Gupta, A., Verma, R., Achary, G. K., Varkey, B (2019). Studying functional brain networks from dry electrode EEG set during music and resting states in neurodevelopment disorder, bioRxiv 759738 [Preprint]. Available from: https://www.biorxiv.org/content/10.1101/759738v1

  16. Features-EEG dataset

    • researchdata.edu.au
    • openneuro.org
    Updated Jun 29, 2023
    Cite
    Grootswagers Tijl; Tijl Grootswagers (2023). Features-EEG dataset [Dataset]. http://doi.org/10.18112/OPENNEURO.DS004357.V1.0.0
    Dataset updated
    Jun 29, 2023
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Western Sydney University
    Authors
    Grootswagers Tijl; Tijl Grootswagers
    License

    ODC Public Domain Dedication and Licence (PDDL) v1.0: http://www.opendatacommons.org/licenses/pddl/1.0/

    Description

    Experiment Details: Electroencephalography recordings from 16 subjects viewing fast streams of gabor-like stimuli. Images were presented in rapid serial visual presentation streams at 6.67 Hz and 20 Hz rates. Participants performed an orthogonal fixation colour change detection task.

    Experiment length: 1 hour. Raw and preprocessed data are available online through OpenNeuro: https://openneuro.org/datasets/ds004357. Supplementary material and analysis scripts are available on GitHub: https://github.com/Tijl/features-eeg

  17. Healthy Brain Network (HBN) EEG - Release 3

    • openneuro.org
    Updated Mar 11, 2025
    Cite
    Seyed Yahya Shirazi; Alexandre Franco; Maurício Scopel Hoffmann; Nathalia B. Esper; Dung Truong; Arnaud Delorme; Michael Milham; Scott Makeig (2025). Healthy Brain Network (HBN) EEG - Release 3 [Dataset]. http://doi.org/10.18112/openneuro.ds005507.v1.0.1
    Explore at:
    Dataset updated
    Mar 11, 2025
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Seyed Yahya Shirazi; Alexandre Franco; Maurício Scopel Hoffmann; Nathalia B. Esper; Dung Truong; Arnaud Delorme; Michael Milham; Scott Makeig
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    The HBN-EEG Dataset

    This is Release 3 of HBN-EEG, the EEG and (soon-to-be-released) eye-tracking section of the Child Mind Institute's Healthy Brain Network (HBN) project, curated into the Brain Imaging Data Structure (BIDS) format. This dataset is part of a larger initiative to advance the understanding of child and adolescent mental health through collecting and analyzing neuroimaging, behavioral, and genetic data (Alexander et al., Sci Data 2017).

    Data Description

    This dataset comprises electroencephalogram (EEG) data and behavioral responses collected during EEG experiments from more than 3,000 participants (ages 5-21) involved in the HBN project. The data has been released in 11 separate releases, each containing data from a different set of participants.

    Tasks

    The HBN-EEG dataset includes EEG recordings from participants performing six distinct tasks, which are categorized into passive and active tasks based on the presence of user input and interaction in the experiment.

    Passive Tasks

    1. Resting State: Participants rested with their heads on a chin rest, following instructions to open or close their eyes and fixate on a central cross.

    2. Surround Suppression: Participants viewed flashing peripheral disks with contrasting backgrounds, while event markers and conditions were recorded.

    3. Movie Watching: Participants watched four short movies with different themes, with event markers recording the start and stop times of presentations.

    Active Tasks

    1. Contrast Change Detection: Participants identified flickering disks with dominant contrast changes and received feedback based on their responses.

    2. Sequence Learning: Participants memorized and repeated sequences of flashed circles on the screen, designed for different age groups.

    3. Symbol Search: Participants performed a computerized symbol search task, identifying target symbols from rows of search symbols.

    Contents

    • EEG Data: High-resolution EEG recordings capture a wide range of neural activity during various tasks.
    • Behavioral Responses: Participant responses during EEG tasks, including reaction times and accuracy. This data was originally recorded within the behavior directory of the HBN data. The data is now included with the EEG data within the events.tsv files.
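Because the behavioral responses are merged into BIDS events.tsv files, they can be inspected with any tabular reader. A minimal stdlib sketch (the column names beyond the BIDS-required onset and duration are illustrative; check the actual files and their HED sidecars):

```python
import csv
import io

def read_events_tsv(f):
    """Parse a BIDS events.tsv stream into a list of row dicts."""
    return list(csv.DictReader(f, delimiter="\t"))

# Illustrative rows only; real HBN-EEG files carry task-specific columns.
sample = io.StringIO(
    "onset\tduration\ttrial_type\n"
    "12.5\t0.5\ttarget\n"
    "14.0\t0.5\tnontarget\n"
)
events = read_events_tsv(sample)
```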

    Special Features

    • Hierarchical Event Descriptors (HED): Events, including the original EEG events and the included behavioral events, have clear explanations, including proper HED annotation suitable for systematic meta- and mega-analysis of the data.
    • P-Factor, Attention, Internalization and Externalization: Derived from the CBCL questionnaire, these factors provide valuable insights into the psychopathology of the participants, adding a rich layer of interpretation to the EEG and behavioral data.
    • Data quality and availability: We performed minimal quality control to ensure that the data was not corrupted, that each task had its necessary events, and that the data was ready for preprocessing. The results of this quality control are available in the participants.tsv file.
    • Future Releases: We are committed to enhancing this dataset with additional, valuable features in its next stages, including:
      • Personalized EEG Electrode Locations: To offer more detailed insights into individual neural activity patterns.
      • Personalized Lead Field Matrix: Enabling better understanding and interpretation of EEG data.
      • Eye-Tracking Data: Providing a window into the visual attention and processing mechanisms during EEG experiments.

    Other HBN-EEG Datasets

    To access all releases of the HBN-EEG dataset, follow this link on NEMAR.org. The links to the individual releases are below:

    Release 1 | DS005505

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R1
    • Total subjects: 136

    Release 2 | DS005506

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R2
    • Total subjects: 152

    Release 3 | DS005507

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R3
    • Total subjects: 183

    Release 4 | DS005508

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R4
    • Total subjects: 324

    Release 5 | DS005509

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R5
    • Total subjects: 330

    Release 6 | DS005510

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R6
    • Total subjects: 134

    Release 7 | DS005511

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R7
    • Total subjects: 381

    Release 8 | DS005512

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R8
    • Total subjects: 257

    Release 9 | DS005514

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R9
    • Total subjects: 295

    Release 10 | DS005515

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R10
    • Total subjects: 533

    Release 11 | DS005516

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_R11
    • Total subjects: 430

    Release NC | --NOT FOR COMMERCIAL USE-- This dataset is intended for research purposes only under the CC-BY-NC-SA-4.0 License and is not currently hosted on OpenNeuro/NEMAR. Any commercial use is prohibited.

    • S3 URI: s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_NC
    • Total subjects: 458
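The release listing above can be tabulated programmatically. The subject counts below are copied from the listing; summing them confirms the "more than 3,000 participants" figure across the 11 OpenNeuro releases (the S3 prefix follows the URIs listed above):

```python
# OpenNeuro accession and subject count per release, from the listing above.
RELEASES = {
    "R1": ("DS005505", 136), "R2": ("DS005506", 152), "R3": ("DS005507", 183),
    "R4": ("DS005508", 324), "R5": ("DS005509", 330), "R6": ("DS005510", 134),
    "R7": ("DS005511", 381), "R8": ("DS005512", 257), "R9": ("DS005514", 295),
    "R10": ("DS005515", 533), "R11": ("DS005516", 430),
}
S3_PREFIX = "s3://fcp-indi/data/Projects/HBN/BIDS_EEG/cmi_bids_"

def s3_uri(release: str) -> str:
    """Build the S3 URI for a named release (e.g. "R3")."""
    return S3_PREFIX + release

total_subjects = sum(n for _, n in RELEASES.values())
print(total_subjects)  # 3155; the NC release adds another 458 subjects
```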

    Copyright and License

    The HBN-EEG dataset is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0), except for the Not-for-Commercial-Use dataset. Please cite the dataset paper (https://doi.org/10.1101/2024.10.03.615261) as well as the original HBN publication (https://dx.doi.org/10.1038/sdata.2017.181).

    Acknowledgments

    We would like to express our gratitude to all participants and their families, whose contributions have made this project possible. We also thank our dedicated team of researchers and clinicians for their efforts in collecting, processing, and curating this data.

  18. The raw EEG data, 4 files (EEG_A to D), in European data format (.edf)

    • zenodo.org
    • data.niaid.nih.gov
    bin
    Updated Jan 24, 2020
    Cite
    Laurence A. Brown; Sibah Hasan; Russell G. Foster; Stuart N. Peirson (2020). The raw EEG data, 4 files (EEG_A to D), in European data format (.edf) [Dataset]. http://doi.org/10.5281/zenodo.160118
    Explore at:
    Available download formats: bin
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Laurence A. Brown; Sibah Hasan; Russell G. Foster; Stuart N. Peirson
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    EEG data for comparison to PIR-estimated sleep in the Wellcome Open Research article:

    'COMPASS: Continuous Open Mouse Phenotyping of Activity and Sleep Status'

  19. Data from: EmoKey Moments Muse EEG Dataset (EKM-ED): A Comprehensive...

    • data.niaid.nih.gov
    • produccioncientifica.ugr.es
    • +1more
    Updated Nov 10, 2023
    Cite
    Francisco M. Garcia-Moreno; Marta Badenes-Sastre (2023). EmoKey Moments Muse EEG Dataset (EKM-ED): A Comprehensive Collection of Muse S EEG Data and Key Emotional Moments [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_8431450
    Explore at:
    Dataset updated
    Nov 10, 2023
    Dataset provided by
    Universidad de Granada
    Authors
    Francisco M. Garcia-Moreno; Marta Badenes-Sastre
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    EmoKey Moments Muse EEG Dataset (EKM-ED): A Comprehensive Collection of Muse S EEG Data and Key Emotional Moments

    Dataset Description:

    The EmoKey Moments EEG Dataset (EKM-ED) is a curated dataset collected from 47 participants, detailing EEG responses as they engaged with emotion-eliciting video clips. Covering a spectrum of emotions, this dataset is valuable for research on human cognitive responses, psychological processes, and emotion-based analyses.

    Dataset Highlights:

    Precise Timestamps: Capturing the exact millisecond of EEG data acquisition, ensuring unparalleled granularity.

    Brainwave Metrics: Illuminating the variety of cognitive states through the prism of Delta, Theta, Alpha, Beta, and Gamma waves.

    Motion Data: Encompassing the device's movement in three dimensions for enhanced contextuality.

    Auxiliary Indicators: Key elements like the device's positioning, battery metrics, and user-specific actions are meticulously logged.

    Consent and Ethics: The dataset respects and upholds privacy and ethical standards. Every participant provided informed consent. This endeavor has received the green light from the Ethics Committee at the University of Granada, documented under the reference: 2100/CEIH/2021.
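The brainwave metrics listed above correspond to the conventional EEG frequency bands. A minimal NumPy sketch of extracting per-band power from one raw channel (the band edges below are the conventional textbook values, not taken from this dataset's documentation):

```python
import numpy as np

# Conventional EEG band edges in Hz (illustrative, not dataset-specific).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_powers(x, fs):
    """Mean spectral power of one EEG channel in each conventional band."""
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    f = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return {name: psd[(f >= lo) & (f < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Demo: a pure 10 Hz oscillation shows up as alpha-band power.
demo = band_powers(np.sin(2 * np.pi * 10 * np.arange(1024) / 256.0), 256)
```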

    A pivotal component of this dataset is its focus on "key moments" within the selected video clips, honing in on periods anticipated to evoke heightened emotional responses.

    Curated Video Clips within Dataset:

        Film                 Emotion    Duration (seconds)
        The Lover            Baseline   43
        American History X   Anger      106
        Cry Freedom          Sadness    166
        Alive                Happiness  310
        Scream               Fear       395
    The cornerstone of EKM-ED is its innovative emphasis on these key moments, bringing to light the correlation between distinct cinematic events and specific EEG responses.

    Key Emotional Moments in Dataset:

        Film                 Emotion    Key moment timestamps (seconds)
        American History X   Anger      36, 57, 68
        Cry Freedom          Sadness    112, 132, 154
        Alive                Happiness  227, 270, 289
        Scream               Fear       23, 42, 79, 226, 279, 299, 334

    Citation: Gilman, T. L., et al. (2017). A film set for the elicitation of emotion in research. Behavior Research Methods, 49(6).

    With its unparalleled depth and focus, the EmoKey Moments EEG Dataset aims to advance research in fields such as neuroscience, psychology, and affective computing, providing a comprehensive platform for understanding and analyzing human emotions through EEG data.

    ——————————————————————————————————— FOLDER STRUCTURE DESCRIPTION ———————————————————————————————————

    • questionnaires: all the response questionnaires (Spanish), raw and preprocessed, including SAM
        preprocessed/
            Ficha_Evaluacion_Participante_SAM_Refactored.csv: the SAM responses for every film clip

    • key_moments: the key moment timestamps for every emotion’s clip

    • muse_wearable_data: XXXX
        raw/
            1/  (subject ID = 1)
                muse/  EEG data of the Muse device
                    ANGER_XXX.csv: EEG data of the anger elicitation
                    FEAR_XXX.csv: EEG data of the fear elicitation
                    HAPPINESS_XXX.csv: EEG data of the happiness elicitation
                    SADNESS_XXX.csv: EEG data of the sadness elicitation
                order: film elicitation play order, for example: HAPPINESS,SADNESS,ANGER,FEAR …
        preprocessed/
            unclean-signals/  (EEG artifacts, noise, etc. not removed)
                muse/  EEG data of the Muse device
                    0.0078125/  data downsampled to 128 Hz from the recorded 256 Hz
            clean-signals/  (EEG artifacts, noise, etc. removed)
                muse/  EEG data of the Muse device
                    0.0078125/  data downsampled to 128 Hz from the recorded 256 Hz
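The folder name 0.0078125 is simply the sample period of the downsampled data (1/128 s). A naive sketch of halving a 256 Hz recording to 128 Hz; note that proper decimation should low-pass filter first to avoid aliasing, so the dataset's own preprocessed files should be preferred over this:

```python
import numpy as np

FS_RAW, FS_DOWN = 256, 128  # recorded rate -> released rate

def naive_downsample(x, factor=FS_RAW // FS_DOWN):
    """Keep every `factor`-th sample. Sketch only: no anti-alias filter."""
    return np.asarray(x)[::factor]

sample_period = 1.0 / FS_DOWN  # 0.0078125 s, matching the folder name
```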

    The ethical consent for this dataset was provided by La Comisión de Ética en Investigación de la Universidad de Granada, as documented in the approval titled: 'DETECCIÓN AUTOMÁTICA DE LAS EMOCIONES BÁSICAS Y SU INFLUENCIA EN LA TOMA DE DECISIONES MEDIANTE WEARABLES Y MACHINE LEARNING' registered under 2100/CEIH/2021.

  20. EEG-BCI Dataset for Motor Imagery and Overt Spatial Attention EEG-BCI...

    • kilthub.cmu.edu
    • datasetcatalog.nlm.nih.gov
    zip
    Updated Aug 4, 2023
    Cite
    Dylan Forenzo; Bin He (2023). EEG-BCI Dataset for Motor Imagery and Overt Spatial Attention EEG-BCI Control [Dataset]. http://doi.org/10.1184/R1/23677098.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Aug 4, 2023
    Dataset provided by
    Carnegie Mellon University
    Authors
    Dylan Forenzo; Bin He
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset consists of EEG recordings and Brain-Computer Interface (BCI) data from 25 different human subjects performing BCI experiments. More information can be found in the corresponding manuscript:

    Dylan Forenzo, Yixuan Liu, Jeehyun Kim, Yidan Ding, Taehyung Yoon, Bin He: “Integrating Simultaneous Motor Imagery and Spatial Attention for EEG-BCI Control”, IEEE Transactions on Biomedical Engineering (10.1109/TBME.2023.3298957).

    Please cite this paper if you use any data included in this dataset.

    The dataset was collected under the support of NIH grants AT009263, EB021027, EB029354, NS096761, NS124564 to Dr. Bin He at Carnegie Mellon University.

    Each file is a MATLAB object (.mat file) which contains data from a single run of BCI control. The MATLAB files are grouped into folders based on the Subject, one for each of the 25 subjects studied. Each subject completed 5 sessions of BCI experiments and each session consisted of either 18 (sessions 1 and 2) or 15 (sessions 3-5) runs, for a total of 81 runs per subject or 2025 total BCI runs.

    Each of the MATLAB files contains a single structure with the following fields:

    data: An array containing the EEG recordings with the size (channels x time points)

    times: A vector containing the timestamps in milliseconds with the size (1 x time points)

    fs: sampling frequency (1000 Hz)

    labels: A cell array containing the label for each channel

    targets: A list of target codes. For LR: 1 is right, 2 is left. For UD: 1 is up, 2 is down. For 2D: 1 is right, 2 is left, 3 is up, and 4 is down

    event: A structure of events from BCI2000. Each index corresponds to the start of a trial and includes the time (latency) of when the trial starts, and how long each trial lasted (duration).

    results: A vector of which target was hit for each trial (0 if the trial was aborted before a target was hit)

    outcome: A vector indicating the outcome of each trial (1: hit, 0: abort, -1: miss)

    subject: The coded subject number

    session: The session number. Please note that the session numbers are for specific tasks, so even though 2D sessions began on the third day of experiments, the 2D runs are listed as session 1, 2, and 3 as they are the first, second, and third 2D sessions.

    axis: The axis of control. Either LR (horizontal only, Left-Right), UD (vertical only, Up-Down), or 2D (both horizontal and vertical control).

    task: The control paradigm used. Options are MI (motor imagery), OSA (overt spatial attention), MIOSA (MI and OSA together), MIOSA1 (MI controls horizontal axis, OSA controls vertical. Referred to as MI/OSA in the paper), or MIOSA2 (MI controls vertical axis, OSA controls horizontal. Referred to as OSA/MI in the paper).

    run: The run number
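Given the field layout above, a run can be loaded with SciPy and its trials tallied. The sketch below assumes SciPy's loadmat conventions; the top-level variable name inside each .mat file is not given in this listing, so it is looked up generically in the comments:

```python
import numpy as np

# Loading a run (path and variable lookup are illustrative):
#   from scipy.io import loadmat
#   mat = loadmat("Subject01/run01.mat", squeeze_me=True, struct_as_record=False)
#   run = next(v for k, v in mat.items() if not k.startswith("__"))
#   summary = outcome_summary(run.outcome)

def outcome_summary(outcome):
    """Tally the per-trial outcome vector (1: hit, 0: abort, -1: miss)."""
    outcome = np.asarray(outcome)
    return {"hit": int((outcome == 1).sum()),
            "abort": int((outcome == 0).sum()),
            "miss": int((outcome == -1).sum())}
```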
