100+ datasets found
  1. EEG Signal Dataset

    • ieee-dataport.org
    Updated Jun 11, 2020
    Cite
    Rahul Kher (2020). EEG Signal Dataset [Dataset]. https://ieee-dataport.org/documents/eeg-signal-dataset
    Authors
    Rahul Kher
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    PCA

  2. A large and rich EEG dataset for modeling human visual object recognition

    • osf.io
    Updated Jul 25, 2025
    Cite
    Alessandro Gifford (2025). A large and rich EEG dataset for modeling human visual object recognition [Dataset]. http://doi.org/10.17605/OSF.IO/3JK45
    Dataset provided by
    Center For Open Science
    Authors
    Alessandro Gifford
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Source, raw and preprocessed EEG data, resting state EEG data, image set, DNN feature maps and code of the paper: "A large and rich EEG dataset for modeling human visual object recognition".

  3. General-Disorders-EEG-Dataset-v1

    • huggingface.co
    Updated Aug 21, 2025
    Cite
    Neurazum (2025). General-Disorders-EEG-Dataset-v1 [Dataset]. http://doi.org/10.57967/hf/3321
    Dataset authored and provided by
    Neurazum
    License

    Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
    License information was derived automatically

    Description

    Dataset

    Synthetic EEG data generated by the ‘bai’ model based on real data.

      Features/Columns:
    

    No: "Number" Sex: "Gender" Age: "Age of participants" EEG Date: "The date of the EEG" Education: "Education level" IQ: "IQ level of participants" Main Disorder: "General class definition of the disorder" Specific Disorder: "Specific class definition of the disorder"

    Total Features/Columns: 1140

      Content:
    

    Obsessive Compulsive Disorder, Bipolar Disorder, Schizophrenia… See the full description on the dataset page: https://huggingface.co/datasets/Neurazum/General-Disorders-EEG-Dataset-v1.
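
    A minimal sketch for loading this Hugging Face release with the `datasets` library; the split name ("train") and column access below are assumptions based on the description above, not verified against the dataset card.

      # Hedged example: load the dataset and inspect the disorder labels.
      from datasets import load_dataset

      ds = load_dataset("Neurazum/General-Disorders-EEG-Dataset-v1")
      print(ds)  # shows the available splits and feature names

      # Assumes a "train" split and the column names listed above.
      df = ds["train"].to_pandas()
      print(df["Main Disorder"].value_counts())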

  4. EEG Motor Imagery BCICIV_2a

    • kaggle.com
    Updated May 7, 2023
    Cite
    aymanmostafa11 (2023). EEG Motor Imagery BCICIV_2a [Dataset]. https://www.kaggle.com/datasets/aymanmostafa11/eeg-motor-imagery-bciciv-2a
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    aymanmostafa11
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    The dataset is a modified CSV version of the BCI Competition IV 2a dataset, prepared for ease of use by beginners.

    Description

    The data can be accessed in two ways:
    1. Each patient separately: a CSV file for each patient is provided for subject-dependent tasks.
    2. All patients: the file with "all_patients" in its name contains all patients' data, with a column specifying the patient number.

    The events considered in the data are only the 4 target classes (left, right, foot, tongue); other events mentioned in the paper have been discarded for simplicity.
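
    A hedged pandas sketch of the two access patterns above; the file names and the patient-number column name are illustrative assumptions, not taken from the dataset files themselves.

      # Hedged example of working with the per-patient and combined CSVs.
      import pandas as pd

      # Subject-dependent: one CSV per patient (file name is a placeholder).
      subject_df = pd.read_csv("patient_01.csv")

      # Subject-independent: the combined "all_patients" file (name and column assumed).
      all_df = pd.read_csv("all_patients.csv")
      per_patient = {pid: grp for pid, grp in all_df.groupby("patient")}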

    Acknowledgements

    BCI Competition IV
    This introductory YouTube video

  5. EEG Alpha Waves dataset

    • zenodo.org
    • search.datacite.org
    Updated Jan 24, 2020
    Cite
    Grégoire Cattan; Pedro L. C. Rodrigues; Marco Congedo (2020). EEG Alpha Waves dataset [Dataset]. http://doi.org/10.5281/zenodo.2348892
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Grégoire Cattan; Pedro L. C. Rodrigues; Marco Congedo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Summary:

    This dataset contains electroencephalographic recordings of subjects in a simple resting-state eyes open/closed experimental protocol. Data were recorded during a pilot experiment taking place in the GIPSA-lab, Grenoble, France, in 2017 [1]. Python code is available at https://github.com/plcrodrigues/Alpha-Waves-Dataset for manipulating the data.

    Principal Investigators: Eng. Grégoire CATTAN, Eng. Pedro L. C. RODRIGUES
    Scientific Supervisor: Dr. Marco Congedo

    Introduction:

    The occipital dominant rhythm (commonly referred to as occipital ‘Alpha’) is prominent in occipital and parietal regions when a subject receives no visual stimulation, as when keeping the eyes closed (2). In normal subjects its peak frequency is in the range 8-12 Hz. The detection of alpha waves in the ongoing electroencephalography (EEG) is a useful indicator of the subject’s level of stress, concentration, relaxation or mental load (3,4), and an easy marker to detect in the recorded signals because of its high signal-to-noise ratio. This experiment was conducted to provide a simple yet reliable set of EEG signals carrying very distinct signatures in each experimental condition. It can be useful for researchers and students looking for an EEG dataset on which to test signal processing and machine learning algorithms. An example of an application of this dataset can be seen in (5).

    I. Participants

    A total of 20 volunteers participated in the experiment (7 females), with mean (sd) age 25.8 (5.27) and median 25.5. 18 subjects were between 19 and 28 years old. Two participants aged 33 and 44 were outside this range.

    II. Procedures

    EEG signals were acquired using a standard research grade amplifier (g.USBamp, g.tec, Schiedlberg, Austria) and the EC20 cap equipped with 16 wet electrodes (EasyCap, Herrsching am Ammersee, Germany), placed according to the 10-20 international system. The locations of the electrodes were FP1, FP2, FC5, FC6, FZ, T7, CZ, T8, P7, P3, PZ, P4, P8, O1, Oz, and O2. The reference was placed on the right earlobe and the ground at the AFZ scalp location. The amplifier was linked by USB connection to the PC where the data were acquired by means of the software OpenVibe (6,7). We acquired the data with no digital filter, at a sampling frequency of 512 samples per second. For ensuing analyses, the experimenter was able to tag the EEG signal using an in-house application based on a C/C++ library (8). The tags were sent by the application to the amplifier through the USB port of the PC and recorded along with the EEG signal as a supplementary channel.

    For each recording we provide the age, gender and fatigue of each participant. Fatigue was evaluated by the subjects on a scale ranging from 0 to 10, where 10 represents exhaustion. Each participant underwent one session consisting of ten blocks of ten seconds of EEG data recording. Five blocks were recorded while the subject was keeping his eyes closed (condition 1) and the others while his eyes were open (condition 2). The two conditions were alternated. Before the onset of each block, the subject was asked to close or open his eyes according to the experimental condition. The experimenter then tagged the EEG signal using the in-house application and started a 10-second countdown for the block.

    III. Organization of the dataset

    For each subject we provide a single .mat file containing the complete recording of the session. The file is a 2D matrix whose rows contain the observations at each time sample. Columns 2 to 17 contain the recordings from each of the 16 EEG electrodes. The first column of the matrix represents the timestamp of each observation, and columns 18 and 19 contain the triggers for experimental conditions 1 and 2. The rows in column 18 (resp. 19) are filled with zeros, except at the timestamp corresponding to the beginning of the block for condition 1 (resp. 2), when the row gets a value of one.

    We supply an online and open-source example working with Python (9).
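
    As a complement to that example, here is a minimal Python sketch of reading one subject's file as laid out above; the .mat file name and the variable key inside it are assumptions, and the loader in reference (9) should be preferred for real use.

      # Hedged sketch: load one subject's recording and cut eyes-closed blocks.
      import numpy as np
      from scipy.io import loadmat

      mat = loadmat("subject_01.mat")  # hypothetical file name
      # Assumes the file holds a single data matrix besides loadmat's header keys.
      data = [v for k, v in mat.items() if not k.startswith("__")][0]

      timestamps = data[:, 0]    # column 1: timestamp of each observation
      eeg = data[:, 1:17]        # columns 2-17: the 16 EEG channels
      trig_closed = data[:, 17]  # column 18: condition 1 (eyes closed) onsets
      trig_open = data[:, 18]    # column 19: condition 2 (eyes open) onsets

      fs = 512  # samples per second, from the description above
      onsets = np.flatnonzero(trig_closed == 1)
      closed_blocks = [eeg[i:i + 10 * fs] for i in onsets]  # 10-second blocks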

    IV. References

    1. Cattan G, Andreev A, Mendoza C, Congedo M. The Impact of Passive Head-Mounted Virtual Reality Devices on the Quality of EEG Signals. In Delft: The Eurographics Association; 2018 [cited 2018 Apr 16]. Available from: https://diglib.eg.org:443/handle/10.2312/vriphys20181064

    2. Pfurtscheller G, Stancák A, Neuper C. Event-related synchronization (ERS) in the alpha band — an electrophysiological correlate of cortical idling: A review. Int J Psychophysiol. 1996 Nov 1;24(1):39–46.

    3. Banquet JP. Spectral analysis of the EEG in meditation. Electroencephalogr Clin Neurophysiol. 1973 Aug 1;35(2):143–51.

    4. Antonenko P, Paas F, Grabner R, van Gog T. Using Electroencephalography to Measure Cognitive Load. Educ Psychol Rev. 2010 Dec 1;22(4):425–38.

    5. Rodrigues PLC, Congedo M, Jutten C. Multivariate Time-Series Analysis Via Manifold Learning. In: 2018 IEEE Statistical Signal Processing Workshop (SSP). 2018. p. 573–7.

    6. Renard Y, Lotte F, Gibert G, Congedo M, Maby E, Delannoy V, et al. OpenViBE: An Open-Source Software Platform to Design, Test, and Use Brain–Computer Interfaces in Real and Virtual Environments. Presence Teleoperators Virtual Environ. 2010 Feb 1;19(1):35–53.

    7. Arrouët C, Congedo M, Marvie J-E, Lamarche F, Lécuyer A, Arnaldi B. Open-ViBE: A Three Dimensional Platform for Real-Time Neuroscience. J Neurother. 2005 Jul 8;9(1):3–25.

    8. Mandal MK. C++ Library for Serial Communication with Arduino [Internet]. 2016 [cited 2018 Dec 15]. Available from: https://github.com/manashmndl/SerialPort

    9. Rodrigues PLC. Alpha-Waves-Dataset [Internet]. Grenoble: GIPSA-lab; 2018. Available from: https://github.com/plcrodrigues/Alpha-Waves-Dataset

  6. Ultra high-density EEG recording of interictal migraine and controls: sensory and rest

    • kilthub.cmu.edu
    Updated Jul 21, 2020
    Cite
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann (2020). Ultra high-density EEG recording of interictal migraine and controls: sensory and rest [Dataset]. http://doi.org/10.1184/R1/12636731
    Dataset provided by
    Carnegie Mellon University
    Authors
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    We used a high-density electroencephalography (HD-EEG) system, with 128 customized electrode locations, to record from 17 individuals with migraine (12 female) in the interictal period, and 18 age- and gender-matched healthy control subjects, during visual (vertical grating pattern) and auditory (modulated tone) stimulation which varied in temporal frequency (4 and 6 Hz), and during rest. This dataset includes the raw EEG data related to the paper Chamanzar, Haigh, Grover, and Behrmann (2020), "Abnormalities in cortical pattern of coherence in migraine detected using ultra high-density EEG". The link to our paper will be made available as soon as it is published online.

  7. EEG datasets of stroke patients

    • figshare.com
    Updated Sep 14, 2023
    Cite
    Haijie Liu; Xiaodong Lv (2023). EEG datasets of stroke patients [Dataset]. http://doi.org/10.6084/m9.figshare.21679035.v5
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Haijie Liu; Xiaodong Lv
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This data set consists of electroencephalography (EEG) data from 50 participants (Subject1 – Subject50) with acute ischemic stroke, aged between 30 and 77 years. The participants included 39 males and 11 females. The time after stroke ranged from 1 day to 30 days. 22 participants had right-hemisphere hemiplegia and 28 participants had left-hemisphere hemiplegia. All participants were originally right-handed. Each participant sat in front of a computer screen, with an arm resting on a pillow on their lap or on a table, and carried out the instructions given on the screen. At the start of each trial, a picture with a text description cueing the left or right hand was presented for 2 s. Participants were asked to focus on the instructed hand motor imagery while a video of ipsilateral hand movement was displayed on the screen for 4 s, followed by a 2 s break.

  8. A dataset of neonatal EEG recordings with seizures annotations

    • zenodo.org
    Updated Jul 2, 2021
    Cite
    Nathan Stevenson; Karoliina Tapani; Leena Lauronen; Sampsa Vanhatalo (2021). A dataset of neonatal EEG recordings with seizures annotations [Dataset]. http://doi.org/10.5281/zenodo.1280684
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Nathan Stevenson; Karoliina Tapani; Leena Lauronen; Sampsa Vanhatalo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Neonatal seizures are a common emergency in the neonatal intensive care unit (NICU). There are many questions yet to be answered regarding the temporal/spatial characteristics of seizures from different pathologies, response to medication, effects on neurodevelopment and optimal detection. This dataset contains EEG recordings from human neonates and the visual interpretation of the EEG by the human expert. Multi-channel EEG was recorded from 79 term neonates admitted to the neonatal intensive care unit (NICU) at the Helsinki University Hospital. The median recording duration was 74 minutes (IQR: 64 to 96 minutes). EEGs were annotated by three experts for the presence of seizures. An average of 460 seizures were annotated per expert in the dataset, 39 neonates had seizures by consensus and 22 were seizure free by consensus. The dataset can be used as a reference set of neonatal seizures, for the development of automated methods of seizure detection and other EEG analysis, as well as for the analysis of inter-observer agreement.

  9. Harvard Electroencephalography Database

    • bdsp.io
    • registry.opendata.aws
    Updated Feb 10, 2025
    Cite
    Sahar Zafar; Tobias Loddenkemper; Jong Woo Lee; Andrew Cole; Daniel Goldenholz; Jurriaan Peters; Alice Lam; Edilberto Amorim; Catherine Chu; Sydney Cash; Valdery Moura Junior; Aditya Gupta; Manohar Ghanta; Marta Fernandes; Haoqi Sun; Jin Jing; M Brandon Westover (2025). Harvard Electroencephalography Database [Dataset]. http://doi.org/10.60508/k85b-fc87
    Authors
    Sahar Zafar; Tobias Loddenkemper; Jong Woo Lee; Andrew Cole; Daniel Goldenholz; Jurriaan Peters; Alice Lam; Edilberto Amorim; Catherine Chu; Sydney Cash; Valdery Moura Junior; Aditya Gupta; Manohar Ghanta; Marta Fernandes; Haoqi Sun; Jin Jing; M Brandon Westover
    License

    https://github.com/bdsp-core/bdsp-license-and-dua

    Description

    The Harvard EEG Database will encompass data gathered from four hospitals affiliated with Harvard University: Massachusetts General Hospital (MGH), Brigham and Women's Hospital (BWH), Beth Israel Deaconess Medical Center (BIDMC), and Boston Children's Hospital (BCH). The EEG data includes three types:

    rEEG: "routine EEGs" recorded in the outpatient setting.
    EMU: recordings obtained in the inpatient setting, within the Epilepsy Monitoring Unit (EMU).
    ICU/LTM: recordings obtained from acutely and critically ill patients within the intensive care unit (ICU).
    
  10. Preprocessed CHB-MIT Scalp EEG Database

    • ieee-dataport.org
    Updated Dec 24, 2024
    Cite
    Mrs Deepa .B (2024). Preprocessed CHB-MIT Scalp EEG Database [Dataset]. https://ieee-dataport.org/open-access/preprocessed-chb-mit-scalp-eeg-database
    Authors
    Mrs Deepa .B
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The ‘Univ. of Bonn’ and ‘CHB-MIT Scalp EEG Database’ datasets are publicly available and among the most sought after by researchers. The Bonn dataset is very small compared to CHB-MIT, but researchers still prefer Bonn as it comes in a simple '.txt' format. The dataset published here is a preprocessed form of CHB-MIT, available in '.csv' format.
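
    A very small pandas sketch of loading one of the preprocessed CSV files; the file name below is a placeholder, since the actual names used in the release are not listed here.

      # Hedged example: read one preprocessed CHB-MIT CSV and inspect it.
      import pandas as pd

      df = pd.read_csv("chb01_preprocessed.csv")  # hypothetical file name
      print(df.shape)
      print(df.head())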

  11. Data from: EEG data for ADHD / Control children

    • ieee-dataport.org
    Updated Oct 15, 2024
    Cite
    Ali Motie Nasrabadi (2024). EEG data for ADHD / Control children [Dataset]. https://ieee-dataport.org/open-access/eeg-data-adhd-control-children
    Authors
    Ali Motie Nasrabadi
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    epilepsy

  12. An EEG dataset recorded during affective music listening

    • openneuro.org
    Updated Apr 23, 2020
    Cite
    Ian Daly; Nicoletta Nicolaou; Duncan Williams; Faustina Hwang; Alexis Kirke; Eduardo Miranda; Slawomir J. Nasuto (2020). An EEG dataset recorded during affective music listening [Dataset]. http://doi.org/10.18112/openneuro.ds002721.v1.0.1
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Ian Daly; Nicoletta Nicolaou; Duncan Williams; Faustina Hwang; Alexis Kirke; Eduardo Miranda; Slawomir J. Nasuto
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    0. Sections

    1. Project
    2. Dataset
    3. Terms of Use
    4. Contents
    5. Method and Processing

    1. PROJECT

    Title: Brain-Computer Music Interface for Monitoring and Inducing Affective States (BCMI-MIdAS)

    Dates: 2012-2017

    Funding organisation: Engineering and Physical Sciences Research Council (EPSRC)

    Grant no.: EP/J003077/1 and EP/J002135/1.

    2. DATASET

    Title: EEG data investigating neural correlates of music-induced emotion.

    Description: This dataset accompanies the publication by Daly et al. (2018) and has been analysed in Daly et al. (2014; 2015a; 2015b) (please see Section 5 for full references). The purpose of the research activity in which the data were collected was to investigate the EEG neural correlates of music-induced emotion. For this purpose 31 healthy adult participants listened to 40 music clips of 12 s duration each, targeting a range of emotional states. The music clips comprised excerpts from film scores spanning a range of styles and rated on induced emotion. The dataset contains unprocessed EEG data from all 31 participants (age range 18-66, 18 female) while listening to the music clips, together with the reported induced emotional responses. The paradigm involved 6 runs of EEG recordings. The first and last runs were resting state runs, during which participants were instructed to sit still and rest for 300 s. The other 4 runs each contained 10 music listening trials.

    Publication Year: 2018

    Creator: Nicoletta Nicolaou, Ian Daly.

    Contributors: Isil Poyraz Bilgin, James Weaver, Asad Malik.

    Principal Investigator: Slawomir Nasuto (EP/J003077/1).

    Co-Investigator: Eduardo Miranda (EP/J002135/1).

    Organisation: University of Reading

    Rights-holders: University of Reading

    Source: The musical stimuli were taken from Eerola & Vuoskoski, “A comparison of the discrete and dimensional models of emotion in music”, Psychol. Music, 39:18-49, 2010 (doi: 10.1177/0305735610362821).

    3. TERMS OF USE

    Copyright University of Reading, 2018. This dataset is licensed by the rights-holder(s) under a Creative Commons Attribution 4.0 International Licence: https://creativecommons.org/licenses/by/4.0/.

    4. CONTENTS

    BIDS File listing: The dataset comprises data from 31 participants, named using the convention: sub_s_number where: s_number is a random participant number from 1 to 31. For example: ‘sub-08’ contains data obtained from participant 8.

    The data is in BIDS format and contains EEG and associated metadata. The sampling rate is 1 kHz and the EEG corresponding to a music clip is 20 s long (the duration of the clips).

    Each data folder contains the following data (please note that the number of runs varies between participants):

    EEG data in .tsv format. Event codes (JSON) and timings (tsv). EEG channel information.
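
    A hedged sketch of reading one participant's files from this layout with pandas; the task label, run number and exact file names are assumptions about the BIDS naming, not checked against the dataset.

      # Hedged example: load one run's EEG samples and event table.
      import pandas as pd

      sub, task, run = "sub-08", "task-music", "run-01"  # labels are assumptions
      eeg = pd.read_csv(f"{sub}/eeg/{sub}_{task}_{run}_eeg.tsv", sep="\t")
      events = pd.read_csv(f"{sub}/eeg/{sub}_{task}_{run}_events.tsv", sep="\t")

      fs = 1000  # 1 kHz sampling rate, from the description above
      print(eeg.shape, events.head())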

    5. METHOD and PROCESSING

    This information is available in the following publications:

    [1] Daly, I., Nicolaou, N., Williams, D., Hwang, F., Kirke, A., Miranda, E., Nasuto, S.J., “Neural and physiological data from participants listening to affective music”, Scientific Data, 2018.
    [2] Daly, I., Malik, A., Hwang, F., Roesch, E., Weaver, J., Kirke, A., Williams, D., Miranda, E. R., Nasuto, S. J., “Neural correlates of emotional responses to music: an EEG study”, Neuroscience Letters, 573: 52-7, 2014; doi: 10.1016/j.neulet.2014.05.003.
    [3] Daly, I., Hallowell, J., Hwang, F., Kirke, A., Malik, A., Roesch, E., Weaver, J., Williams, D., Miranda, E., Nasuto, S.J., “Changes in music tempo entrain movement related brain activity”, Proc. IEEE EMBC 2014, pp. 4595-8; doi: 10.1109/EMBC.2014.6944647.
    [4] Daly, I., Williams, D., Hallowell, J., Hwang, F., Kirke, A., Malik, A., Weaver, J., Miranda, E., Nasuto, S.J., “Music-induced emotions can be predicted from a combination of brain activity and acoustic features”, Brain and Cognition, 101:1-11, 2015b; doi: 10.1016/j.bandc.2015.08.003.

    Please cite these references if you use this dataset in your study.

    Thank you for your interest in our work.

  13. MAMEM EEG SSVEP Dataset I (256 channels, 11 subjects, 5 frequencies presented in isolation)

    • figshare.com
    • zenodo.org
    Updated May 30, 2023
    Cite
    Spiros Nikolopoulos (2023). MAMEM EEG SSVEP Dataset I (256 channels, 11 subjects, 5 frequencies presented in isolation) [Dataset]. http://doi.org/10.6084/m9.figshare.2068677.v6
    Dataset provided by
    figshare
    Authors
    Spiros Nikolopoulos
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    EEG signals with 256 channels captured from 11 subjects executing an SSVEP-based experimental protocol. Five different frequencies (6.66, 7.50, 8.57, 10.00 and 12.00 Hz) were used for the visual stimulation, and the EGI 300 Geodesic EEG System (GES 300), with a 256-channel HydroCel Geodesic Sensor Net (HCGSN) and a sampling rate of 250 Hz, was used for capturing the signals. Check https://www.youtube.com/watch?v=8lGBVvCX5d8&feature=youtu.be for a video demonstrating one trial. Check https://github.com/MAMEM/ssvep-eeg-processing-toolbox for the processing toolbox. Check http://arxiv.org/abs/1602.00904 for the technical report.

  14. CHB-MIT Scalp EEG Database

    • dknet.org
    • neuinfo.org
    Updated May 13, 2025
    Cite
    (2025). CHB-MIT Scalp EEG Database [Dataset]. http://identifiers.org/RRID:SCR_004264
    Description

    THIS RESOURCE IS NO LONGER IN SERVICE. Documented on November 22, 2022. Data set collected at the Children's Hospital Boston, of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention. All signals were sampled at 256 samples per second with 16-bit resolution. Most files contain 23 EEG signals (24 or 26 in a few cases).

  15. Music Listening- Genre EEG dataset (MUSIN-G)

    • openneuro.org
    Updated Aug 24, 2021
    Cite
    Krishna Prasad Miyapuram; Pankaj Pandey; Nashra Ahmad; Bharatesh R Shiraguppi; Esha Sharma; Prashant Lawhatre; Dhananjay Sonawane; Derek Lomas (2021). Music Listening- Genre EEG dataset (MUSIN-G) [Dataset]. http://doi.org/10.18112/openneuro.ds003774.v1.0.0
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Krishna Prasad Miyapuram; Pankaj Pandey; Nashra Ahmad; Bharatesh R Shiraguppi; Esha Sharma; Prashant Lawhatre; Dhananjay Sonawane; Derek Lomas
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    The dataset contains electroencephalography (EEG) responses from 20 Indian participants to 12 songs of different genres (from Indian Classical to Goth Rock). Each session corresponds to one song, indicated by its number.

    For the experiment, the participants were instructed to close their eyes when cued by a single beep, and the song was presented to them on speakers. After listening to each song, a double beep was presented, asking them to open their eyes and rate their familiarity with and enjoyment of the song. The responses were given on a scale of 1 to 5, where 1 meant most familiar or most enjoyable, and 5 meant least familiar or least enjoyable.

  16. Data from: A Resting-state EEG Dataset for Sleep Deprivation

    • openneuro.org
    Updated Apr 27, 2025
    Cite
    Chuqin Xiang; Xinrui Fan; Duo Bai; Ke Lv; Xu Lei (2025). A Resting-state EEG Dataset for Sleep Deprivation [Dataset]. http://doi.org/10.18112/openneuro.ds004902.v1.0.8
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Chuqin Xiang; Xinrui Fan; Duo Bai; Ke Lv; Xu Lei
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    General information

    The dataset provides resting-state EEG data (eyes open, partially eyes closed) from 71 participants who underwent two experiments involving normal sleep (NS, session 1) and sleep deprivation (SD, session 2). The dataset also provides information on participants' sleepiness and mood states. (Please note that Session 1 (NS) and Session 2 (SD) do not reflect the time order; the time order is counterbalanced across participants and is listed in the metadata.)

    Dataset

    Presentation

    The data collection was initiated in March 2019 and was terminated in December 2020. A detailed description of the dataset is currently being prepared by Chuqin Xiang, Xinrui Fan, Duo Bai, Ke Lv and Xu Lei, and will be submitted to Scientific Data for publication.

    EEG acquisition

    • EEG system (Brain Products GmbH, Steingrabenstr., Germany, 61 electrodes)
    • Sampling frequency: 500 Hz
    • Impedances were kept below 5 kΩ

    Contact

     * If you have any questions or comments, please contact:
     * Xu Lei: xlei@swu.edu.cn   
    

    Article

    Xiang, C., Fan, X., Bai, D. et al. A resting-state EEG dataset for sleep deprivation. Sci Data 11, 427 (2024). https://doi.org/10.1038/s41597-024-03268-2

  17. Longitudinal ALS EEG Dataset for Motor Imagery Studies

    • rdr.ucl.ac.uk
    Updated Jan 24, 2025
    Cite
    Rishan Patel; Dai Jiang; Barney Bryson; Tom Carlson; Andreas Demosthenous; Andrew Geronimo (2025). Longitudinal ALS EEG Dataset for Motor Imagery Studies [Dataset]. http://doi.org/10.5522/04/28156016.v1
    Dataset provided by
    University College London
    Authors
    Rishan Patel; Dai Jiang; Barney Bryson; Tom Carlson; Andreas Demosthenous; Andrew Geronimo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset comprises EEG recordings from eight ALS patients aged between 45.5 and 74 years. Patients exhibited revised ALS Functional Rating Scale (ALSFRS-R) scores ranging from 0 to 46, with time since symptom onset (TSSO) varying between 12 and 113 months. Notably, no disease progression was reported during the study period, ensuring stability in clinical conditions. The participants were recruited from the Penn State Hershey Medical Center ALS Clinic and had confirmed ALS diagnoses without significant dementia. This rigorous selection criterion ensured the validity and reliability of the dataset for motor imagery analysis in an ALS population.

    The EEG data were collected using 19 electrodes placed according to the international 10-20 system (FP1, FP2, F7, F3, FZ, F4, F8, T7, C3, CZ, C4, T8, P7, P3, PZ, P4, P8, O1, O2), with signals referenced to linked earlobes and a ground electrode at FPz. Additionally, three electrooculogram (EOG) electrodes were employed to facilitate artifact removal, maintaining impedance levels below 10 kΩ throughout data acquisition. The data were amplified using two g.USBamp systems (g.tec GmbH) and recorded via the BCI2000 software suite, with supplementary preprocessing in MATLAB. All experimental procedures adhered strictly to Penn State University's IRB protocol PRAMSO40647EP, ensuring ethical compliance.

    Each participant underwent four brain-computer interface (BCI) sessions conducted over a period of 1 to 2 months. Each session consisted of four runs, with 10 trials per class (left hand, right hand, and rest) for a total of 40 trials per session. The sessions began with a calibration run to initialize the system, followed by feedback runs during which participants controlled a cursor's movement through motor imagery, specifically imagined grasping movements. The study design, focused on motor imagery (MI), generated a total of 160 trials per participant over two months.

    This dataset holds significance in studying the longitudinal dynamics of motor imagery decoding in ALS patients. To ensure reproducibility of our findings and to promote advancements in the field, we have received explicit permission from Prof. Geronimo of Penn State University to distribute this dataset in the processed format for research purposes. The original publication of this collection can be found below.

    How to use this dataset: the data are structured in MATLAB as a collection of subject-specific structs, where each subject is represented as a single struct. Each struct contains three fields:
    L: trials corresponding to Left Motor Imagery.
    R: trials corresponding to Right Motor Imagery.
    Re: trials corresponding to Rest state.
    Each field contains an array of trials, where each trial is represented as a matrix with rows as timestamps and columns as channels.

    Primary Collection: Geronimo A, Simmons Z, Schiff SJ. Performance predictors of brain-computer interfaces in patients with amyotrophic lateral sclerosis. Journal of Neural Engineering, 2016, 13. doi: 10.1088/1741-2560/13/2/026002.

    All code for any publications with this data has been made publicly available at the following links:
    https://github.com/rishannp/Auto-Adaptive-FBCSP
    https://github.com/rishannp/Motor-Imagery---Graph-Attention-Network
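
    A hedged SciPy sketch of reading the struct layout described above; the .mat file name is a placeholder, and the exact nesting produced by loadmat may differ from this.

      # Hedged example: load one subject's struct and pull out the trial arrays.
      from scipy.io import loadmat

      mat = loadmat("subject_01.mat", squeeze_me=True, struct_as_record=False)  # hypothetical name
      # Assumes a single subject struct besides loadmat's "__"-prefixed keys.
      subject = [v for k, v in mat.items() if not k.startswith("__")][0]

      left_trials = subject.L   # Left Motor Imagery trials
      right_trials = subject.R  # Right Motor Imagery trials
      rest_trials = subject.Re  # Rest trials

      trial = left_trials[0]    # one trial: rows = timestamps, columns = channels
      n_samples, n_channels = trial.shape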

  18. EEG dataset

    • figshare.com
    Updated Dec 6, 2019
    Cite
    minho lee (2019). EEG dataset [Dataset]. http://doi.org/10.6084/m9.figshare.8091242.v1
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    minho lee
    License

    https://www.gnu.org/copyleft/gpl.html

    Description

    This dataset was collected for the study "Robust Detection of Event-Related Potentials in a User-Voluntary Short-Term Imagery Task".

  19. EEG dataset of individuals with intellectual and developmental disorder and healthy controls while observing rest and music stimuli

    • data.mendeley.com
    Updated Apr 11, 2020
    Cite
    Ekansh Sareen (2020). EEG dataset of individuals with intellectual and developmental disorder and healthy controls while observing rest and music stimuli [Dataset]. http://doi.org/10.17632/fshy54ypyh.2
    Authors
    Ekansh Sareen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This data presents a collection of EEG recordings of seven participants with Intellectual and Developmental Disorder (IDD) and seven Typically Developing Controls (TDC). The data is recorded while the participants observe a resting state and a soothing music stimuli. The data was collected using a high-resolution multi-channel dry-electrode system from EMOTIV called EPOC+. This is a 14-channel device with two reference channels and a sampling frequency of 128 Hz. The data was collected in a noise-isolated room. The participants were informed of the experimental procedure, related risks and were asked to keep their eyes closed throughout the experiment. The data is provided in two formats, (1) Raw EEG data and (2) Pre-processed and clean EEG data for both the group of participants. This data can be used to explore the functional brain connectivity of the IDD group. In addition, behavioral information like IQ, SQ, music apprehension and facial expressions (emotion) for IDD participants is provided in file “QualitativeData.xlsx".

    Data Usage: The data is arranged as follows:
    1. Raw Data:
       Data/RawData/RawData_TDC/Music and Rest
       Data/RawData/RawData_IDD/Music and Rest
    2. Clean Data:
       Data/CleanData/CleanData_TDC/Music and Rest
       Data/CleanData/CleanData_IDD/Music and Rest

    The dataset comes along with a fully automated EEG pre-processing pipeline. This pipeline can be used to do batch processing of raw EEG files to obtain clean and pre-processed EEG files. Key features of this pipeline are: (1) bandpass filtering, (2) line-noise removal, (3) channel selection, (4) Independent Component Analysis (ICA), and (5) automatic artifact rejection. All the required files are present in the Pipeline folder.
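
    For orientation, a hedged MNE-Python sketch of the pipeline stages listed above; this is not the authors' pipeline (which ships in the Pipeline folder), and the file name, mains frequency and filter band are assumptions.

      # Hedged sketch of the listed stages: bandpass, line-noise removal,
      # channel selection, and ICA-based artifact handling.
      import mne

      raw = mne.io.read_raw_edf("IDD_subject01_music.edf", preload=True)  # hypothetical file
      raw.pick("eeg")                      # (3) channel selection
      raw.notch_filter(freqs=50.0)         # (2) line-noise removal (assumed 50 Hz mains)
      raw.filter(l_freq=1.0, h_freq=40.0)  # (1) bandpass filtering (assumed band)

      ica = mne.preprocessing.ICA(n_components=14, random_state=0)  # EPOC+ has 14 channels
      ica.fit(raw)                         # (4) ICA
      # (5) mark artifact components (e.g. via ica.plot_components()), then:
      ica.apply(raw)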

    If you use this dataset and/or the fully automated pre-processing pipeline for your research work, kindly cite these two articles linked to this dataset.

    (1) Sareen, E., Singh, L., Varkey, B., Achary, K., Gupta, A. (2020). EEG dataset of individuals with intellectual and developmental disorder and healthy controls under rest and music stimuli. Data in Brief, 105488, ISSN 2352-3409, DOI:https://doi.org/10.1016/j.dib.2020.105488. (2) Sareen, E., Gupta, A., Verma, R., Achary, G. K., Varkey, B (2019). Studying functional brain networks from dry electrode EEG set during music and resting states in neurodevelopment disorder, bioRxiv 759738 [Preprint]. Available from: https://www.biorxiv.org/content/10.1101/759738v1

  20. Human EEG Dataset for Brain-Computer Interface and Meditation

    • figshare.com
    Updated May 30, 2023
    Cite
    James Stieger (2023). Human EEG Dataset for Brain-Computer Interface and Meditation [Dataset]. http://doi.org/10.6084/m9.figshare.13123148.v1
    Dataset provided by
    figshare
    Authors
    James Stieger
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This database includes the de-identified EEG data from 62 healthy individuals who participated in a brain-computer interface (BCI) study. All subjects underwent 7-11 sessions of BCI training, which involved controlling a computer cursor to move in one-dimensional and two-dimensional spaces using the subject's "intent". EEG data were recorded with 62 electrodes. In addition to the EEG data, behavioral data including the online success rate of BCI cursor control are also included.

    This dataset was collected under support from the National Institutes of Health via grants AT009263, EB021027, NS096761, MH114233, RF1MH to Dr. Bin He. Correspondence about the dataset: Dr. Bin He, Carnegie Mellon University, Department of Biomedical Engineering, Pittsburgh, PA 15213. E-mail: bhe1@andrew.cmu.edu

    This dataset has been used and analyzed to study the learning of BCI control and the effects of mind-body awareness training on this process. The results are reported in: Stieger et al., "Mindfulness Improves Brain Computer Interface Performance by Increasing Control over Neural Activity in the Alpha Band," Cerebral Cortex, 2020 (https://doi.org/10.1093/cercor/bhaa234). Please cite this paper if you use any data included in this dataset.
