100+ datasets found
  1. A large and rich EEG dataset for modeling human visual object recognition

    • osf.io
    Updated Aug 30, 2024
    Cite
    Alessandro Thomas Gifford (2024). A large and rich EEG dataset for modeling human visual object recognition [Dataset]. http://doi.org/10.17605/OSF.IO/3JK45
    Dataset provided by
    Center For Open Science
    Authors
    Alessandro Thomas Gifford
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Source, raw, and preprocessed EEG data, resting-state EEG data, image set, DNN feature maps, and code of the paper: "A large and rich EEG dataset for modeling human visual object recognition".

  2. CHB-MIT Scalp EEG Database

    • physionet.org
    Updated Jun 9, 2010
    Cite
    John Guttag (2010). CHB-MIT Scalp EEG Database [Dataset]. http://doi.org/10.13026/C2K01R
    Authors
    John Guttag
    License

    Open Data Commons Attribution License (ODC-By) v1.0: https://www.opendatacommons.org/licenses/by/1.0/
    License information was derived automatically

    Description

    This database, collected at the Children’s Hospital Boston, consists of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention. The recordings are grouped into 23 cases and were collected from 22 subjects (5 males, ages 3–22; and 17 females, ages 1.5–19).
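
    The records are distributed as EDF files on PhysioNet, sampled at 256 Hz. A minimal sketch for inspecting one record with MNE-Python (the path below is illustrative; pick any downloaded record):

    ```python
    # Minimal sketch: open one CHB-MIT record with MNE-Python and inspect it.
    # The path is a placeholder; records are EDF files named like chb01_03.edf.
    import mne

    raw = mne.io.read_raw_edf("chb01/chb01_03.edf", preload=True)
    print(raw.info["sfreq"])  # records are sampled at 256 Hz
    print(raw.ch_names)       # bipolar scalp channels, e.g. "FP1-F7"
    data, times = raw[:, :]   # (n_channels, n_samples) array and time axis
    ```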

  3. EEG Signal Dataset

    • ieee-dataport.org
    Updated Jun 11, 2020
    + more versions
    Cite
    Rahul Kher (2020). EEG Signal Dataset [Dataset]. http://doi.org/10.21227/t5rz-g722
    Dataset provided by
    IEEE Dataport
    Authors
    Rahul Kher
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    EEG signals of various subjects are provided as text files. They can be useful for testing various EEG signal processing algorithms: filtering, linear prediction, abnormality detection, PCA, ICA, etc.

  4. EEG Alpha Waves dataset

    • zenodo.org
    • explore.openaire.eu
    • +1 more
    Formats: bin
    Updated Jan 24, 2020
    Cite
    Grégoire Cattan; Pedro L. C. Rodrigues; Marco Congedo (2020). EEG Alpha Waves dataset [Dataset]. http://doi.org/10.5281/zenodo.2348892
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Grégoire Cattan; Pedro L. C. Rodrigues; Marco Congedo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Summary:

    This dataset contains electroencephalographic recordings of subjects in a simple resting-state eyes open/closed experimental protocol. Data were recorded during a pilot experiment taking place in the GIPSA-lab, Grenoble, France, in 2017 [1]. Python code is available at https://github.com/plcrodrigues/Alpha-Waves-Dataset for manipulating the data.

    Principal Investigators: Eng. Grégoire CATTAN, Eng. Pedro L. C. RODRIGUES
    Scientific Supervisor: Dr. Marco Congedo

    Introduction:

    The occipital dominant rhythm (commonly referred to as occipital ‘Alpha’) is prominent in occipital and parietal regions when a subject receives no visual stimulation, as when keeping the eyes closed (2). In normal subjects its peak frequency is in the range 8–12 Hz. The detection of alpha waves in the ongoing electroencephalogram (EEG) is a useful indicator of the subject’s level of stress, concentration, relaxation or mental load (3,4), and an easy marker to detect in the recorded signals because of its high signal-to-noise ratio. This experiment was conducted to provide a simple yet reliable set of EEG signals carrying very distinct signatures in each experimental condition. It can be useful for researchers and students looking for an EEG dataset on which to test signal processing and machine learning algorithms. An example of an application of this dataset can be seen in (5).

    I. Participants

    A total of 20 volunteers participated in the experiment (7 females), with mean (SD) age 25.8 (5.27) and median 25.5. Eighteen subjects were between 19 and 28 years old; two participants, aged 33 and 44, were outside this range.

    II. Procedures

    EEG signals were acquired using a standard research-grade amplifier (g.USBamp, g.tec, Schiedlberg, Austria) and the EC20 cap equipped with 16 wet electrodes (EasyCap, Herrsching am Ammersee, Germany), placed according to the 10-20 international system. The locations of the electrodes were FP1, FP2, FC5, FC6, FZ, T7, CZ, T8, P7, P3, PZ, P4, P8, O1, Oz, and O2. The reference was placed on the right earlobe and the ground at the AFZ scalp location. The amplifier was linked by USB connection to the PC, where the data were acquired with the OpenViBE software (6,7). Data were acquired with no digital filtering at a sampling rate of 512 Hz. For ensuing analyses, the experimenter was able to tag the EEG signal using an in-house application based on a C/C++ library (8). The tags were sent by the application to the amplifier through the USB port of the PC and were recorded along with the EEG signal as a supplementary channel.

    For each recording we provide the age, gender and fatigue level of each participant. Fatigue was self-evaluated on a scale ranging from 0 to 10, where 10 represents exhaustion. Each participant underwent one session consisting of ten blocks of ten seconds of EEG data recording. Five blocks were recorded while the subject kept their eyes closed (condition 1) and the other five while their eyes were open (condition 2). The two conditions were alternated. Before the onset of each block, the subject was asked to close or open their eyes according to the experimental condition. The experimenter then tagged the EEG signal using the in-house application and started the 10-second countdown of the block.

    III. Organization of the dataset

    For each subject we provide a single .mat file containing the complete recording of the session. The file is a 2D matrix whose rows contain the observations at each time sample. The first column holds the timestamp of each observation, columns 2 to 17 contain the recordings from the 16 EEG electrodes, and columns 18 and 19 contain the triggers for experimental conditions 1 and 2, respectively. The rows in column 18 (resp. 19) are filled with zeros, except at the timestamp corresponding to the beginning of a block for condition 1 (resp. 2), where the value is one.

    We supply an online and open-source example working with Python (9).
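
    Based on this layout, a minimal loading sketch (the file name and the variable lookup are assumptions; the official example in (9) is authoritative):

    ```python
    # Minimal sketch: load one subject's .mat recording and cut out the
    # eyes-closed blocks. "subject_01.mat" is a placeholder file name.
    import numpy as np
    from scipy.io import loadmat

    FS = 512  # sampling rate, Hz

    mat = loadmat("subject_01.mat")
    # take the first non-metadata entry as the (samples x 19) session matrix
    data = next(v for k, v in mat.items() if not k.startswith("__"))

    timestamps = data[:, 0]      # column 1: timestamps
    eeg = data[:, 1:17]          # columns 2-17: the 16 EEG electrodes
    trig_closed = data[:, 17]    # column 18: block onsets, eyes closed
    trig_open = data[:, 18]      # column 19: block onsets, eyes open

    onsets = np.flatnonzero(trig_closed == 1)
    blocks = [eeg[i:i + 10 * FS] for i in onsets]   # ten-second blocks
    print(len(blocks), blocks[0].shape)
    ```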

    IV. References

    1. Cattan G, Andreev A, Mendoza C, Congedo M. The Impact of Passive Head-Mounted Virtual Reality Devices on the Quality of EEG Signals. In Delft: The Eurographics Association; 2018 [cited 2018 Apr 16]. Available from: https://diglib.eg.org:443/handle/10.2312/vriphys20181064

    2. Pfurtscheller G, Stancák A, Neuper C. Event-related synchronization (ERS) in the alpha band — an electrophysiological correlate of cortical idling: A review. Int J Psychophysiol. 1996 Nov 1;24(1):39–46.

    3. Banquet JP. Spectral analysis of the EEG in meditation. Electroencephalogr Clin Neurophysiol. 1973 Aug 1;35(2):143–51.

    4. Antonenko P, Paas F, Grabner R, van Gog T. Using Electroencephalography to Measure Cognitive Load. Educ Psychol Rev. 2010 Dec 1;22(4):425–38.

    5. Rodrigues PLC, Congedo M, Jutten C. Multivariate Time-Series Analysis Via Manifold Learning. In: 2018 IEEE Statistical Signal Processing Workshop (SSP). 2018. p. 573–7.

    6. Renard Y, Lotte F, Gibert G, Congedo M, Maby E, Delannoy V, et al. OpenViBE: An Open-Source Software Platform to Design, Test, and Use Brain–Computer Interfaces in Real and Virtual Environments. Presence Teleoperators Virtual Environ. 2010 Feb 1;19(1):35–53.

    7. Arrouët C, Congedo M, Marvie J-E, Lamarche F, Lécuyer A, Arnaldi B. Open-ViBE: A Three Dimensional Platform for Real-Time Neuroscience. J Neurother. 2005 Jul 8;9(1):3–25.

    8. Mandal MK. C++ Library for Serial Communication with Arduino [Internet]. 2016 [cited 2018 Dec 15]. Available from: https://github.com/manashmndl/SerialPort

    9. Rodrigues PLC. Alpha-Waves-Dataset [Internet]. Grenoble: GIPSA-lab; 2018. Available from: https://github.com/plcrodrigues/Alpha-Waves-Dataset

  5. EEG Dataset for 'Immediate effects of short-term meditation on sensorimotor rhythm-based brain–computer interface performance'

    • figshare.com
    Formats: pdf
    Updated Dec 9, 2022
    Cite
    Jeehyun Kim; Xiyuan Jiang; Dylan Forenzo; Bin He (2022). EEG Dataset for 'Immediate effects of short-term meditation on sensorimotor rhythm-based brain–computer interface performance' [Dataset]. http://doi.org/10.6084/m9.figshare.21644429.v5
    Dataset provided by
    figshare
    Authors
    Jeehyun Kim; Xiyuan Jiang; Dylan Forenzo; Bin He
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This database includes the de-identified EEG data from 37 healthy individuals who participated in a brain-computer interface (BCI) study. All but one subject underwent two sessions of BCI experiments that involved controlling a computer cursor to move in one-dimensional space using their “intent”. EEG data were recorded with 62 electrodes. In addition to the EEG data, behavioral data, including the online success rate and results of BCI cursor control, are also included.

    This dataset was collected under support from the National Institutes of Health via grant AT009263 to Dr. Bin He. Correspondence about the dataset: Dr. Bin He, Carnegie Mellon University, Department of Biomedical Engineering, Pittsburgh, PA 15213. E-mail: bhe1@andrew.cmu.edu

    This dataset has been used and analyzed to study the immediate effect of short meditation on BCI performance. The results are reported in: Kim et al., “Immediate effects of short-term meditation on sensorimotor rhythm-based brain–computer interface performance,” Frontiers in Human Neuroscience, 2022 (https://doi.org/10.3389/fnhum.2022.1019279). Please cite this paper if you use any data included in this dataset.

  6. EEG Datasets for Naturalistic Listening to "Alice in Wonderland" (Version 1)

    • deepblue.lib.umich.edu
    Updated Nov 20, 2018
    + more versions
    Cite
    Brennan, Jonathan R. (2018). EEG Datasets for Naturalistic Listening to "Alice in Wonderland" (Version 1) [Dataset]. http://doi.org/10.7302/Z29C6VNH
    Dataset provided by
    Deep Blue Data
    Authors
    Brennan, Jonathan R.
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    These files contain the raw data and processing parameters to go with the paper "Hierarchical structure guides rapid linguistic predictions during naturalistic listening" by Jonathan R. Brennan and John T. Hale. They include the stimulus (wav files), raw data (Matlab format for the FieldTrip toolbox), data processing parameters (Matlab), and variables used to align the stimuli with the EEG data and for the statistical analyses reported in the paper.

  7. EEG datasets with different levels of fatigue for personal identification

    • ieee-dataport.org
    Updated May 2, 2023
    + more versions
    Cite
    Jianliang Min (2023). EEG datasets with different levels of fatigue for personal identification [Dataset]. http://doi.org/10.21227/6f0t-y338
    Dataset provided by
    IEEE Dataport
    Authors
    Jianliang Min
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Dataset I: This is the original EEG data of twelve healthy subjects for driver fatigue detection. For reasons of personal privacy, digital numbers represent the different participants. The .cnt files were created by a 40-channel Neuroscan amplifier and include the EEG data in two states during driving.

    Dataset II: This project adopted an event-related lane-departure paradigm in a virtual-reality (VR) dynamic driving simulator to quantitatively measure brain EEG dynamics along with the fluctuation of task performance throughout the experiment. All subjects were required to have a driving license. None of the participants had a history of psychological disorders. All participants were instructed to sustain their attention and perform the task during the experiment, and the 32-channel EEG signals and the vehicle position were recorded simultaneously. Prior to the experiment, all participants completed a consent form stating their clear understanding of the experimental protocol, which had been approved by the Institutional Review Board of Taipei Veterans General Hospital, Taiwan.

    Experiment: All subjects participated in the sustained-attention driving experiment for 1.5 hours in the afternoon (13:00-14:00) after lunch, and all of them were asked to keep their attention focused on driving during the entire period. There was no break or resting session. At the beginning of the experiment (without any recordings), a five-minute pre-test was performed to ensure that every subject understood the instructions and did not suffer from simulator-induced nausea. To investigate the effect of kinesthesia on brain activity in the sustained-attention driving task, each subject was asked to participate in at least two driving sessions on different days. Each session lasted about 90 min. One was a driving session with a fixed-base simulator and no kinesthetic feedback, so the subject had to monitor the vehicle deviation visually from the virtual scene. The other driving session involved a motion-based simulator with a six degree-of-freedom Stewart platform to simulate the dynamic response of the vehicle to deviation events or steering. The visual and kinesthetic inputs together aroused the subject to attend to the deviation event and take action to correct the driving trajectory.

    Data requirement: A wired EEG cap with 32 Ag/AgCl electrodes, including 30 EEG electrodes and two reference electrodes (opposite lateral mastoids), was used to record the electrical activity of the brain from the scalp during the driving task. The EEG electrodes were placed according to a modified international 10-20 system. The contact impedance between all electrodes and the skin was kept
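
    Since the Dataset I records are Neuroscan .cnt files, they can be read directly with MNE-Python; a minimal sketch (the file name is hypothetical):

    ```python
    # Minimal sketch: read one Neuroscan .cnt record from Dataset I.
    # "subject01.cnt" is a placeholder for a downloaded file.
    import mne

    raw = mne.io.read_raw_cnt("subject01.cnt", preload=True)
    print(raw.info)           # channels, sampling rate, etc.
    data, times = raw[:, :]   # (n_channels, n_samples) array and time axis
    ```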

  8. EEG datasets of stroke patients

    • figshare.com
    Formats: json
    Updated Sep 14, 2023
    Cite
    Haijie Liu; Xiaodong Lv (2023). EEG datasets of stroke patients [Dataset]. http://doi.org/10.6084/m9.figshare.21679035.v5
    Dataset provided by
    figshare (http://figshare.com/)
    Authors
    Haijie Liu; Xiaodong Lv
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This data set consists of electroencephalography (EEG) data from 50 participants (Subject1 – Subject50) with acute ischemic stroke, aged between 30 and 77 years. The participants included 39 males and 11 females. The time after stroke ranged from 1 day to 30 days. 22 participants had right-hemisphere hemiplegia and 28 participants had left-hemisphere hemiplegia. All participants were originally right-handed. Each participant sat in front of a computer screen with an arm resting on a pillow on their lap or on a table, and carried out the instructions given on the computer screen. At the start of each trial, a picture with a text cue (left or right hand) was presented for 2 s. Participants were asked to focus on the instructed hand motor imagery while a video of the corresponding ipsilateral hand movement was displayed on the computer screen for 4 s, followed by a 2 s break.

  9. BED: Biometric EEG dataset

    • zenodo.org
    • producciocientifica.uv.es
    • +1 more
    Updated Apr 20, 2022
    + more versions
    Cite
    Pablo Arnau-González; Stamos Katsigiannis; Miguel Arevalillo-Herráez; Naeem Ramzan (2022). BED: Biometric EEG dataset [Dataset]. http://doi.org/10.5281/zenodo.4309472
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Pablo Arnau-González; Stamos Katsigiannis; Miguel Arevalillo-Herráez; Naeem Ramzan
    Description

    The BED dataset

    Version 1.0.0

    Please cite as: Arnau-González, P., Katsigiannis, S., Arevalillo-Herráez, M., Ramzan, N., "BED: A new dataset for EEG-based biometrics", IEEE Internet of Things Journal, vol. 8, no. 15, pp. 12219 - 12230, 2021.

    Disclaimer

    While every care has been taken to ensure the accuracy of the data included in the BED dataset, the authors and the University of the West of Scotland, Durham University, and Universitat de València do not provide any guarantees and disclaim all responsibility and all liability (including, without limitation, liability in negligence) for all expenses, losses, damages (including indirect or consequential damage) and costs which you might incur as a result of the provided data being inaccurate or incomplete in any way and for any reason. 2020, University of the West of Scotland, Scotland, United Kingdom.

    Contact

    For inquiries regarding the BED dataset, please contact:

    1. Dr Pablo Arnau-González, arnau.pablo [*AT*] gmail.com
    2. Dr Stamos Katsigiannis, stamos.katsigiannis [*AT*] durham.ac.uk
    3. Prof. Miguel Arevalillo-Herráez, miguel.arevalillo [*AT*] uv.es
    4. Prof. Naeem Ramzan, Naeem.Ramzan [*AT*] uws.ac.uk

    Dataset summary

    BED (Biometric EEG Dataset) is a dataset specifically designed to test EEG-based biometric approaches that use relatively inexpensive consumer-grade devices, more specifically the Emotiv EPOC+ in this case. This dataset includes EEG responses from 21 subjects to 12 different stimuli, across 3 different chronologically disjointed sessions. We have also considered stimuli aimed to elicit different affective states, so as to facilitate future research on the influence of emotions on EEG-based biometric tasks. In addition, we provide a baseline performance analysis to outline the potential of consumer-grade EEG devices for subject identification and verification. It must be noted that, in this work, EEG data were acquired in a controlled environment in order to reduce the variability in the acquired data stemming from external conditions.

    The stimuli include:

    • Images selected to elicit specific emotions
    • Mathematical computations (2-digit additions)
    • Resting-state with eyes closed
    • Resting-state with eyes open
    • Visual Evoked Potentials at 2, 5, 7, 10 Hz - Standard checker-board pattern with pattern reversal
    • Visual Evoked Potentials at 2, 5, 7, 10 Hz - Flashing with a plain colour, set as black

    For more details regarding the experimental protocol and the design of the dataset, please refer to the associated publication: Arnau-González, P., Katsigiannis, S., Arevalillo-Herráez, M., Ramzan, N., "BED: A new dataset for EEG-based biometrics", IEEE Internet of Things Journal, vol. 8, no. 15, pp. 12219 - 12230, 2021.

    Dataset structure and contents

    The BED dataset contains EEG recordings from 21 subjects, acquired during 3 similar sessions for each subject. The sessions were spaced one week apart from each other.

    The BED dataset includes:

    • The raw EEG recordings with no pre-processing and the log files of the experimental procedure, in text format
    • The EEG recordings with no pre-processing, segmented, structured and annotated according to the presented stimuli, in Matlab format
    • The features extracted from each EEG segment, as described in the associated publication

    The dataset is organised in 3 folders:

    • RAW
    • RAW_PARSED
    • Features

    RAW/ Contains the RAW files
    RAW/sN/ Contains the RAW files associated with subject N
    Each folder sN is composed of the following files:
    - sN_s1.csv, sN_s2.csv, sN_s3.csv -- Files containing the EEG recordings for subject N and sessions 1, 2, and 3, respectively. These files contain 39 columns:
    COUNTER INTERPOLATED F3 FC5 AF3 F7 T7 P7 O1 O2 P8 T8 F8 AF4 FC6 F4 ...UNUSED DATA... UNIX_TIMESTAMP
    - subject_N_session_1_time_X.log, subject_N_session_2_time_X.log, subject_N_session_3_time_X.log -- Log files containing the sequence of events for subject N and sessions 1, 2, and 3, respectively.

    RAW_PARSED/
    Contains Matlab files named sN_sM.mat, holding the recordings for subject N in session M. Each file is composed of two variables:
    - recording: size (time@256Hz x 17), Columns: COUNTER INTERPOLATED F3 FC5 AF3 F7 T7 P7 O1 O2 P8 T8 F8 AF4 FC6 F4 UNIX_TIMESTAMP
    - events: cell array with size (events x 3): START_UNIX END_UNIX ADDITIONAL_INFO
    START_UNIX is the UNIX timestamp at which the event starts
    END_UNIX is the UNIX timestamp at which the event ends
    ADDITIONAL_INFO contains a struct with additional information regarding the specific event: in the case of the images, the expected score and the voted score; in the case of the cognitive task, the input; in the case of the VEP, the pattern and the frequency; etc.

    Features/
    Features/Identification
    Features/Identification/[ARRC|MFCC|SPEC]/: Each of these folders contains the extracted features, ready for classification, for each of the stimuli. Each file is composed of two variables: "feat", the feature matrix, and "Y", the label matrix.
    - feat: N x number of features
    - Y: N x 2 (the #subject and the #session)
    - INFO: Contains details about the event, same as ADDITIONAL_INFO
    Features/Verification: This folder contains 3 files, each with a different set of extracted features. Each file is composed of one struct array with the fields:
    - data: the time-series features, as described in the paper
    - y: the #subject
    - stimuli: the stimuli by name
    - session: the #session
    - INFO: Contains details about the event

    The features provided are in sequential order, so index 1, index 2, etc. are sequential in time if they belong to the same stimulus.
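
    A minimal sketch of how a RAW_PARSED session could be segmented by its events, under the layout described above (the file path is illustrative, and the exact object types returned by loadmat may need adjusting):

    ```python
    # Minimal sketch: load one RAW_PARSED session and cut the recording into
    # event-aligned segments via the UNIX timestamps. Path is a placeholder.
    from scipy.io import loadmat

    mat = loadmat("RAW_PARSED/s1_s1.mat", squeeze_me=True)
    recording = mat["recording"]   # (samples x 17); last column = UNIX_TIMESTAMP
    events = mat["events"]         # (n_events x 3): start, end, additional info

    ts = recording[:, -1]
    segments = []
    for start_unix, end_unix, info in events:
        mask = (ts >= start_unix) & (ts <= end_unix)
        segments.append(recording[mask, 2:16])   # the 14 EEG columns, F3..F4
    print(len(segments))
    ```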

    Additional information

    For additional information regarding the creation of the BED dataset, please refer to the associated publication: Arnau-González, P., Katsigiannis, S., Arevalillo-Herráez, M., Ramzan, N., "BED: A new dataset for EEG-based biometrics", IEEE Internet of Things Journal, vol. 8, no. 15, pp. 12219 - 12230, 2021.

  10. Ultra high-density EEG recording of interictal migraine and controls: sensory and rest

    • kilthub.cmu.edu
    Formats: txt
    Updated Jul 21, 2020
    Cite
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann (2020). Ultra high-density EEG recording of interictal migraine and controls: sensory and rest [Dataset]. http://doi.org/10.1184/R1/12636731
    Dataset provided by
    Carnegie Mellon University
    Authors
    Alireza Chaman Zar; Sarah Haigh; Pulkit Grover; Marlene Behrmann
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    We used a high-density electroencephalography (HD-EEG) system, with 128 customized electrode locations, to record from 17 individuals with migraine (12 female) in the interictal period, and 18 age- and gender-matched healthy control subjects, during visual (vertical grating pattern) and auditory (modulated tone) stimulation which varied in temporal frequency (4 and 6 Hz), and during rest. This dataset includes the raw EEG data related to the paper Chamanzar, Haigh, Grover, and Behrmann (2020), "Abnormalities in cortical pattern of coherence in migraine detected using ultra high-density EEG". The link to our paper will be made available as soon as it is published online.

  11. EMG-EEG dataset for Upper-Limb Gesture Classification

    • ieee-dataport.org
    Updated Jun 22, 2023
    Cite
    BOREOM LEE (2023). EMG-EEG dataset for Upper-Limb Gesture Classification [Dataset]. http://doi.org/10.21227/5ztn-4k41
    Dataset provided by
    IEEE Dataport
    Authors
    BOREOM LEE
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Electromyography (EMG) has limitations in human-machine interfaces due to disturbances like electrode shift, fatigue, and subject variability. A potential solution to prevent model degradation is to combine multi-modal data such as EMG and electroencephalography (EEG). This study presents an EMG-EEG dataset for enhancing the development of upper-limb assistive rehabilitation devices. The dataset, acquired from thirty-three volunteers without neuromuscular dysfunction or disease using commercial biosensors, is easily replicable and deployable. It consists of seven distinct gestures chosen to maximize performance on the Toronto Rehabilitation Institute hand function test and the Jebsen-Taylor hand function test. The authors aim for this dataset to benefit the research community in creating intelligent and neuro-inspired upper-limb assistive rehabilitation devices.

  12. EEG and audio dataset for auditory attention decoding

    • zenodo.org
    Formats: bin, zip
    Updated Jan 31, 2020
    Cite
    Søren A. Fuglsang; Daniel D.E. Wong; Jens Hjortkjær (2020). EEG and audio dataset for auditory attention decoding [Dataset]. http://doi.org/10.5281/zenodo.1199011
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Søren A. Fuglsang; Daniel D.E. Wong; Jens Hjortkjær
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Description

    This dataset contains EEG recordings from 18 subjects listening to one of two competing speech audio streams. Continuous speech in trials of ~50 sec. was presented to normal hearing listeners in simulated rooms with different degrees of reverberation. Subjects were asked to attend one of two spatially separated speakers (one male, one female) and ignore the other. Repeated trials with presentation of a single talker were also recorded. The data were recorded in a double-walled soundproof booth at the Technical University of Denmark (DTU) using a 64-channel Biosemi system and digitized at a sampling rate of 512 Hz. Full details can be found in:

    • Søren A. Fuglsang, Torsten Dau & Jens Hjortkjær (2017): Noise-robust cortical tracking of attended speech in real-life environments. NeuroImage, 156, 435-444

    and

    • Daniel D.E. Wong, Søren A. Fuglsang, Jens Hjortkjær, Enea Ceolini, Malcolm Slaney & Alain de Cheveigné: A Comparison of Temporal Response Function Estimation Methods for Auditory Attention Decoding. Frontiers in Neuroscience, https://doi.org/10.3389/fnins.2018.00531

    The data is organized in the format of the publicly available COCOHA Matlab Toolbox. The preproc_script.m demonstrates how to import and align the EEG and audio data. The script also demonstrates some EEG preprocessing steps as used in the Wong et al. paper above. AUDIO.zip contains wav files with the speech audio used in the experiment. EEG.zip contains MAT files with the EEG/EOG data for each subject. The EEG/EOG data are found in data.eeg with the following channels:

    • channels 1-64: scalp EEG electrodes
    • channel 65: right mastoid electrode
    • channel 66: left mastoid electrode
    • channel 67: vertical EOG below right eye
    • channel 68: horizontal EOG right eye
    • channel 69: vertical EOG above right eye
    • channel 70: vertical EOG below left eye
    • channel 71: horizontal EOG left eye
    • channel 72: vertical EOG above left eye

    The expinfo table contains information about the experimental conditions, including which speaker the listener was attending to in different trials. It contains the following fields:

    • attend_mf: attended speaker (1=male, 2=female)
    • attend_lr: spatial position of the attended speaker (1=left, 2=right)
    • acoustic_condition: type of acoustic room (1= anechoic, 2= mild reverberation, 3= high reverberation, see Fuglsang et al. for details)
    • n_speakers: number of speakers presented (1 or 2)
    • wavfile_male: name of presented audio wav-file for the male speaker
    • wavfile_female: name of presented audio wav-file for the female speaker (if any)
    • trigger: trigger event value for each trial also found in data.event.eeg.value

    DATA_preproc.zip contains the preprocessed EEG and audio data as output from preproc_script.m.

    The dataset was created within the COCOHA Project: Cognitive Control of a Hearing Aid
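
    A minimal sketch of how the channel layout above could be used to split one subject's matrix into EEG and EOG groups (the file name and the exact in-file structure are assumptions; inspect what loadmat returns for the COCOHA format before relying on the indexing):

    ```python
    # Minimal sketch: split a subject's data into scalp EEG, mastoid and EOG
    # channels following the layout above. File name and structure assumed;
    # data.eeg may in practice be a cell array of trials rather than one matrix.
    import numpy as np
    from scipy.io import loadmat

    mat = loadmat("EEG/S1.mat", squeeze_me=True, struct_as_record=False)
    eeg_all = np.asarray(mat["data"].eeg)   # assumed (samples x 72)

    scalp = eeg_all[:, :64]        # channels 1-64: scalp EEG electrodes
    mastoids = eeg_all[:, 64:66]   # channels 65-66: right/left mastoid
    eog = eeg_all[:, 66:72]        # channels 67-72: vertical/horizontal EOG
    ```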

  13. Harvard Electroencephalography Database

    • bdsp.io
    • registry.opendata.aws
    Updated Dec 12, 2024
    + more versions
    Cite
    Sahar Zafar; Tobias Loddenkemper; Jong Woo Lee; Andrew Cole; Daniel Goldenholz; Jurriaan Peters; Alice Lam; Edilberto Amorim; Catherine Chu; Sydney Cash; Valdery Moura Junior; Aditya Gupta; Manohar Ghanta; Marta Fernandes; Haoqi Sun; Jin Jing; M Brandon Westover (2024). Harvard Electroencephalography Database [Dataset]. http://doi.org/10.60508/xbsx-vr76
    Authors
    Sahar Zafar; Tobias Loddenkemper; Jong Woo Lee; Andrew Cole; Daniel Goldenholz; Jurriaan Peters; Alice Lam; Edilberto Amorim; Catherine Chu; Sydney Cash; Valdery Moura Junior; Aditya Gupta; Manohar Ghanta; Marta Fernandes; Haoqi Sun; Jin Jing; M Brandon Westover
    License

    https://github.com/bdsp-core/bdsp-license-and-dua

    Description

    The Harvard EEG Database will encompass data gathered from four hospitals affiliated with Harvard University: Massachusetts General Hospital (MGH), Brigham and Women's Hospital (BWH), Beth Israel Deaconess Medical Center (BIDMC), and Boston Children's Hospital (BCH). The EEG data includes three types:

    • rEEG: "routine EEGs" recorded in the outpatient setting.
    • EMU: recordings obtained in the inpatient setting, within the Epilepsy Monitoring Unit (EMU).
    • ICU/LTM: recordings obtained from acutely and critically ill patients within the intensive care unit (ICU).
    
  14. Preprocessed CHB-MIT Scalp EEG Database

    • ieee-dataport.org
    Updated Jan 24, 2023
    Cite
    Deepa B (2023). Preprocessed CHB-MIT Scalp EEG Database [Dataset]. http://doi.org/10.21227/awcw-mn88
    Dataset provided by
    IEEE Dataport
    Authors
    Deepa B
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Recent advances in the availability of computational power and in cloud computing have prompted extensive research in epileptic seizure detection and prediction. The EEG (electroencephalogram) datasets from the Dept. of Epileptology, Univ. of Bonn, and the CHB-MIT Scalp EEG Database are the publicly available datasets most sought after amongst researchers. The Bonn dataset is very small compared to CHB-MIT, but researchers often prefer it because it comes in a simple '.txt' format. The dataset published here is a preprocessed form of CHB-MIT, provided in '.csv' format, which makes machine learning and deep learning models easy to implement.
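
    Because the records are plain .csv, they drop straight into a pandas/scikit-learn workflow; a minimal sketch (the file and column names are hypothetical; check the actual headers after download):

    ```python
    # Minimal sketch: load one preprocessed .csv record into a feature matrix.
    # "chb01_preprocessed.csv" and the "label" column are placeholders.
    import pandas as pd

    df = pd.read_csv("chb01_preprocessed.csv")
    X = df.drop(columns=["label"]).to_numpy()   # EEG-derived features
    y = df["label"].to_numpy()                  # seizure / non-seizure annotation
    print(X.shape, y.shape)
    ```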

  15. CHB-MIT Dataset

    • paperswithcode.com
    Updated Aug 2, 2016
    + more versions
    Cite
    (2016). CHB-MIT Dataset [Dataset]. https://paperswithcode.com/dataset/chb-mit
    Description

    The CHB-MIT dataset is a dataset of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention. The dataset contains 23 patients divided among 24 cases (one patient has 2 recordings, 1.5 years apart). It comprises 969 hours of scalp EEG recordings with 173 seizures of various types (clonic, atonic, tonic). The diversity of patients (male and female, 10-22 years old) and seizure types makes the dataset well suited for assessing the performance of automatic seizure detection methods in realistic settings.

  16. Turkish Epilepsy EEG Dataset

    • kaggle.com
    Updated Jan 26, 2023
    Cite
    BURAK TAŞCI (2023). Turkish Epilepsy EEG Dataset [Dataset]. https://www.kaggle.com/datasets/buraktaci/turkish-epilepsy
    Formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    BURAK TAŞCI
    Description

    Cite: Tasci I, Tasci B, Barua PD, Dogan S, Tuncer T, Palmer EE, et al. Epilepsy detection in 121 patient populations using hypercube pattern from EEG signals. Information Fusion. 2023. https://doi.org/10.1016/j.inffus.2023.03.022

    The dataset consists of 71 Healthy Control and 50 Epilepsy EEG signals. The sampling frequency is 500 Hz. Channel name abbreviations: parietal (P), temporal (T), frontal (F), occipital (O), frontopolar (Fp), central (C), auricular (A). Even numbers represent the right hemisphere; odd numbers represent the left hemisphere. Note: the thirty-sixth channel was not used in the study.

    | No | Channel | No | Channel |
    |----|---------|----|---------|
    | 1  | FP1A1   | 19 | T6A2    |
    | 2  | FP2A2   | 20 | FP1F7   |
    | 3  | F3A1    | 21 | F7T3    |
    | 4  | F4A2    | 22 | T3T5    |
    | 5  | FZA2    | 23 | T5O1    |
    | 6  | C3A1    | 24 | FP1F3   |
    | 7  | C4A2    | 25 | F3C3    |
    | 8  | CZA1    | 26 | C3P3    |
    | 9  | P3A1    | 27 | P3O1    |
    | 10 | P4A2    | 28 | P2P4    |
    | 11 | PZA2    | 29 | F4C4    |
    | 12 | O1A1    | 30 | C4P4    |
    | 13 | O2A2    | 31 | P4O2    |
    | 14 | F7A1    | 32 | FP2F8   |
    | 15 | F8A1    | 33 | F8T4    |
    | 16 | T3A1    | 34 | T4T6    |
    | 17 | T4A2    | 35 | T6O2    |
    | 18 | T5A1    | 36 | X1A1    |

  17. An EEG Recordings Dataset for Mental Stress Detection

    • data.mendeley.com
    Updated Apr 3, 2023
    Cite
    Megha Mane (2023). An EEG Recordings Dataset for Mental Stress Detection [Dataset]. http://doi.org/10.17632/wnshbvdxs2.1
    Authors
    Megha Mane
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This article presents an EEG dataset collected using the EMOTIV EEG 5-Channel Sensor kit during four types of stimulation (complex mathematical problem solving, the Trier mental challenge test, the Stroop colour word test, and horror video stimulation) plus a relaxed state induced by listening to relaxing music. The dataset consists of EEG recordings from 22 subjects for complex mathematical problem solving, 24 for the Trier mental challenge test, 24 for the Stroop colour word test, 22 for horror video stimulation, and 20 for relaxed-state recordings. The data were collected to investigate the neural correlates of stress and to develop models for stress detection based on EEG data. The dataset can be used for various applications, including stress management, healthcare, and workplace safety, and provides a valuable resource for researchers and practitioners working on EEG-based stress detection.

  18. EEG dataset of 7-day Motor Imagery BCI

    • ieee-dataport.org
    Updated Sep 4, 2023
    Cite
    Qing Zhou (2023). EEG dataset of 7-day Motor Imagery BCI [Dataset]. http://doi.org/10.21227/f1c7-7x89
    Dataset provided by
    IEEE Dataport
    Authors
    Qing Zhou
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this dataset, we performed a seven-day motor imagery (MI) based BCI experiment, without feedback training, on 20 healthy subjects. The MI tasks comprised left hand, right hand, feet, and an idle task.

  19. Auditory evoked potential EEG-Biometric dataset

    • physionet.org
    Updated Dec 1, 2021
    Cite
    Nibras Abo Alzahab; Angelo Di Iorio; Luca Apollonio; Muaaz Alshalak; Alessandro Gravina; Luca Antognoli; Marco Baldi; Lorenzo Scalise; Bilal Alchalabi (2021). Auditory evoked potential EEG-Biometric dataset [Dataset]. http://doi.org/10.13026/ps31-fc50
    Authors
    Nibras Abo Alzahab; Angelo Di Iorio; Luca Apollonio; Muaaz Alshalak; Alessandro Gravina; Luca Antognoli; Marco Baldi; Lorenzo Scalise; Bilal Alchalabi
    License

    https://github.com/MIT-LCP/license-and-dua/tree/master/drafts

    Description

    This data set consists of over 240 two-minute EEG recordings obtained from 20 volunteers. Resting-state and auditory stimuli experiments are included. The goal is to develop an EEG-based biometric system.

    The data includes resting-state EEG signals in two conditions: eyes open and eyes closed. The auditory stimuli part consists of six experiments: three with in-ear auditory stimuli and three with bone-conducting auditory stimuli. The three stimuli for each case are a native song, a non-native song, and neutral music.

  20. SRM Resting-state EEG

    • openneuro.org
    Updated Aug 25, 2021
    + more versions
    Cite
    Christoffer Hatlestad-Hall; Trine Waage Rygvold; Stein Andersson (2021). SRM Resting-state EEG [Dataset]. http://doi.org/10.18112/openneuro.ds003775.v1.0.0
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Christoffer Hatlestad-Hall; Trine Waage Rygvold; Stein Andersson
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    SRM Resting-state EEG

    Introduction

    This EEG dataset contains resting-state EEG extracted from the experimental paradigm used in the Stimulus-Selective Response Modulation (SRM) project at the Dept. of Psychology, University of Oslo, Norway.

    The data were recorded with a BioSemi ActiveTwo system, using 64 electrodes following the positional scheme of the extended 10-20 system (10-10). Each data file comprises four minutes of uninterrupted EEG acquired while the subjects were resting with their eyes closed. The dataset includes EEG from 111 healthy control subjects (the "t1" session), of whom a number underwent an additional EEG recording at a later date (the "t2" session). Thus, some subjects have one associated EEG file, whereas others have two.

    Disclaimer

    The dataset is provided "as is". Hereunder, the authors take no responsibility with regard to data quality. The user is solely responsible for ascertaining that the data used for publications or in other contexts fulfil the required quality criteria.

    The data

    Raw data files

    The raw EEG signals are re-referenced to the average reference. Other than that, no operations have been performed on the data. The files contain no events; the whole continuous segment is resting-state data. The signals are unfiltered (recorded in Europe, so the line-noise frequency is 50 Hz). The time points for each subject's EEG recording(s) are listed in the *_scans.tsv file (particularly relevant for subjects with two recordings).

    Please note that the quality of the raw data has not been carefully assessed. While most data files are of high quality, a few might be of poorer quality. The data files are provided "as is", and it is the user's responsibility to ascertain the quality of the individual data file.

    /derivatives/cleaned_data

    For convenience, a cleaned dataset is provided. The files in this derived dataset have been preprocessed with a basic, fully automated pipeline (see /code/s2_preprocess.m for details). The derived files are stored as EEGLAB .set files in a directory structure identical to that of the raw files. Please note that the *_channels.tsv files associated with the derived files have been updated with status information about each channel ("good" or "bad"). The "bad" channels are – for the sake of consistency – interpolated, and thus still present in the data. It might be advisable to remove these channels in some analyses, as they (per definition) add no information to the EEG data. The cleaned data signals are referenced to the average reference (including the interpolated channels).

    Please mind the automatic nature of the employed pipeline. It might not perform optimally on all data files (e.g. over-/underestimating proportion of bad channels). For publications, we recommend implementing a more sensitive cleaning pipeline.
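
    A minimal sketch for working with the cleaned derivatives in MNE-Python (the paths are hypothetical; the .set files and *_channels.tsv sidecars follow the BIDS-style layout described above):

    ```python
    # Minimal sketch: read one cleaned EEGLAB .set file and drop the channels
    # marked "bad" in the BIDS channels sidecar. Paths are placeholders.
    import mne
    import pandas as pd

    raw = mne.io.read_raw_eeglab(
        "derivatives/cleaned_data/sub-001/eeg/sub-001_task-rest_eeg.set",
        preload=True,
    )

    chans = pd.read_csv("sub-001_task-rest_channels.tsv", sep="\t")
    bads = chans.loc[chans["status"] == "bad", "name"].tolist()
    # "bad" channels were interpolated for consistency; removing them loses nothing
    raw.drop_channels([ch for ch in bads if ch in raw.ch_names])
    ```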

    Demographic and cognitive test data

    The participants.tsv file in the root folder contains the variables age, sex, and a range of cognitive test scores. See the sidecar participants.json for more information on the behavioural measures. Please note that these measures were collected in connection with the "t1" session recording.

    How to cite

    All use of this dataset in a publication context requires the following paper to be cited:

    Rygvold, T. W., Hatlestad‐Hall, C., Elvsåshagen, T., Moberget, T., & Andersson, S. (2021). Do visual and auditory stimulus‐specific response modulation reflect different mechanisms of neocortical plasticity?. European Journal of Neuroscience, 53(4), 1072-1085. DOI: http://dx.doi.org/10.1111/ejn.14964

    Contact

    Questions regarding the EEG data may be addressed to Christoffer Hatlestad-Hall (chr.hh@pm.me).

    Questions regarding the project in general may be addressed to Stein Andersson (stein.andersson@psykologi.uio.no) or Trine W. Rygvold (t.w.rygvold@psykologi.uio.no).
