CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset was obtained from The Open MEG Archive (OMEGA, https://omega.bic.mni.mcgill.ca).
You are free to use all data in OMEGA for research purposes; please acknowledge its authors and cite the following reference in your publications if you have used data from OMEGA:
Niso G., Rogers C., Moreau J.T., Chen L.Y., Madjar C., Das S., Bock E., Tadel F., Evans A.C., Jolicoeur P., Baillet S. (2016). OMEGA: The Open MEG Archive. NeuroImage 124, 1182-1187. doi: https://doi.org/10.1016/j.neuroimage.2015.04.028. OMEGA is available at: https://omega.bic.mni.mcgill.ca
Experiment
MEG acquisition
Head shape and fiducial points
Subject anatomy
BIDS
The data in this dataset have been organized according to the MEG-BIDS specification (Brain Imaging Data Structure, http://bids.neuroimaging.io) (Niso et al. 2018).
Niso G., Gorgolewski K.J., Bock E., Brooks T.L., Flandin G., Gramfort A., Henson R.N., Jas M., Litvak V., Moreau J., Oostenveld R., Schoffelen J.M., Tadel F., Wexler J., Baillet S. (2018). MEG-BIDS: an extension to the Brain Imaging Data Structure for magnetoencephalography. Scientific Data; 5, 180110. https://doi.org/10.1038/sdata.2018.110
Release history:
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This archive contains sample output files for the sample data accompanying the Princeton Handbook for Reproducible Neuroimaging. Outputs include the NIfTI images converted using HeuDiConv (v0.5.dev1) and organized according to the BIDS standard, quality control evaluation using MRIQC (v0.10.4), data preprocessed using fMRIPrep (v1.4.1rc1), and other auxiliary files. All outputs were created according to the procedures outlined in the handbook and are intended to serve as a didactic reference for use with the handbook. The sample data from which the outputs are derived were acquired (with informed consent) using the ReproIn naming convention on a Siemens Skyra 3T MRI scanner. The sample data include a T1-weighted anatomical image, four functional runs with the “prettymouth” spoken story stimulus, and one functional run with a block-design emotional faces task, as well as auxiliary scans (e.g., scout, soundcheck). The “prettymouth” story stimulus was created by Yeshurun et al., 2017 and is available as part of the Narratives collection; the emotional faces task is similar to Chai et al., 2015. The brain data are contributed by author S.A.N. and are authorized for non-anonymized distribution.
The underlying mechanisms of recovery of motor function after stroke are an important study target for enabling individual rehabilitation strategies and improving patients' motor outcomes. To this end, longitudinal studies of stroke patients are crucial to further increase our understanding of such mechanisms. The present dataset provides longitudinal magnetic resonance imaging data from 36 ischemic stroke patients and 15 healthy controls. Data were acquired at 3-5, 30-40, 85-95 and 340-380 days post stroke onset for the patient group. The dataset was brought into BIDS structure and annotated following the openMINDS standard to prepare it for the automated preprocessing pipeline tailored for stroke data (Bey et al. 2024).
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This is an eye-tracking dataset in the BIDS format (http://bids.neuroimaging.io/).
Please find the details on the study here:
Gagl B. (2016) Blue hypertext is a good design decision: no perceptual disadvantage in reading and successful highlighting of relevant information. PeerJ 4:e2467 https://doi.org/10.7717/peerj.2467
We present processed multimodal empirical data from a study with The Virtual Brain (TVB). Structural and functional data have been prepared in accordance with Brain Imaging Data Structure (BIDS) standards and annotated according to the openMINDS metadata framework. Simultaneous electroencephalography (EEG) - functional magnetic resonance imaging (fMRI) resting-state data, diffusion-weighted MRI (dwMRI), and structural MRI were acquired for 50 healthy adult subjects (18-80 years of age, mean 41.24±18.33; 31 females, 19 males) at the Berlin Center for Advanced Imaging, Charité University Medicine, Berlin, Germany. We constructed personalized models from these multimodal data of 50 healthy individuals with TVB in a previous study (Triebkorn et al. 2024). We present this large, comprehensive processed dataset in an annotated and structured format following BIDS standards for derivatives of MRI and the BIDS Extension Proposal for computational modeling data. We describe how we processed and converted the diverse data sources to make them reusable. In its current form, this dataset can be reused for further research and provides ready-to-use data at various levels of processing for a large cohort of healthy subjects with a wide age range.
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset contains raw and pre-processed EEG data from a mobile EEG study investigating the effects of cognitive task demands, motor demands, and environmental complexity on attentional processing (see below for experiment details).
All preprocessing and analysis code is deposited in the code directory. The entire MATLAB pipeline can be reproduced by executing the run_pipeline.m script. To run these scripts, you will need the required MATLAB toolboxes and R packages on your system. You will also need to adapt def_local.m to specify local paths to MATLAB and EEGLAB. Descriptive statistics and mixed-effects models can be reproduced in R by running the stat_analysis.R script.
See below for software details.
In addition to citing this dataset, please cite the original manuscript reporting data collection and experimental procedures.
For more information, see the dataset_description.json file.
ODC Open Database License (ODbL). For more information, see the LICENCE file.
Dataset is formatted according to the EEG-BIDS extension (Pernet et al., 2019) and the BIDS extension proposal for common electrophysiological derivatives (BEP021) v0.0.1, which can be found here:
Note that BEP021 is still a work in progress as of 2021-03-01.
Generally, you can find data in the .tsv files and descriptions in the accompanying .json files.
An important BIDS definition to consider is the "Inheritance Principle" (see 3.5 in the BIDS specification: http://bids.neuroimaging.io/bids_spec.pdf), which states:
Any metadata file (.json, .bvec, .tsv, etc.) may be defined at any directory level. The values from the top level are inherited by all lower levels unless they are overridden by a file at the lower level.
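The Inheritance Principle can be illustrated with a short, self-contained sketch (the filenames and metadata fields below are hypothetical examples; real BIDS tooling such as pybids implements this resolution for you):

```python
# Minimal sketch of the BIDS Inheritance Principle: metadata defined at the
# top level applies to all lower levels unless a lower-level file overrides it.
import json
import tempfile
from pathlib import Path

def inherited_metadata(files):
    """Merge sidecar JSONs ordered from dataset root downward; deeper files win."""
    merged = {}
    for f in files:
        merged.update(json.loads(Path(f).read_text()))
    return merged

root = Path(tempfile.mkdtemp())
(root / "sub-01" / "eeg").mkdir(parents=True)
# Top-level sidecar: applies to every subject...
(root / "task-oddball_eeg.json").write_text(
    json.dumps({"SamplingFrequency": 500, "PowerLineFrequency": 50}))
# ...but sub-01's own sidecar overrides the power line frequency.
(root / "sub-01" / "eeg" / "sub-01_task-oddball_eeg.json").write_text(
    json.dumps({"PowerLineFrequency": 60}))

meta = inherited_metadata([
    root / "task-oddball_eeg.json",
    root / "sub-01" / "eeg" / "sub-01_task-oddball_eeg.json",
])
print(meta)  # SamplingFrequency inherited, PowerLineFrequency overridden
```

Note that the merge is shallow and key-by-key, matching the quoted rule: values absent at the lower level are inherited, present ones are overridden.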
Forty-four healthy adults aged 18-40 performed an oddball task involving complex tone (piano and horn) stimuli in three settings: (1) sitting in a quiet room in the lab (LAB); (2) walking around a sports field (FIELD); (3) navigating a route through a university campus (CAMPUS).
Participants performed each environmental condition twice: once while attending to oddball stimuli (i.e. counting the number of presented deviant tones; COUNT), and once while ignoring the tone stimuli (IGNORE).
EEG signals were recorded from 32 active electrodes using a Brain Vision LiveAmp 32 amplifier. See manuscript for further details.
MATLAB Version: 9.7.0.1319299 (R2019b) Update 5 MATLAB License Number: 678256 Operating System: Microsoft Windows 10 Enterprise Version 10.0 (Build 18363) Java Version: Java 1.8.0_202-b08 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
The following toolboxes/helper functions were also used:
R version 3.6.2 (2019-12-12)
Platform: x86_64-w64-mingw32/x64 (64-bit)
locale: LC_COLLATE=English_Australia.1252, LC_CTYPE=English_Australia.1252, LC_MONETARY=English_Australia.1252, LC_NUMERIC=C and LC_TIME=English_Australia.1252
attached base packages:
other attached packages:
loaded via a namespace (and not attached):
This archive contains a raw DICOM dataset acquired (with informed consent) using the ReproIn naming convention on a Siemens Skyra 3T MRI scanner. The dataset includes a T1-weighted anatomical image, four functional runs with the “prettymouth” spoken story stimulus, and one functional run with a block-design emotional faces task, as well as auxiliary scans (e.g., scout, soundcheck). The “prettymouth” story stimulus was created by Yeshurun et al., 2017 and is available as part of the Narratives collection; the emotional faces task is similar to Chai et al., 2015. These data are intended for use with the Princeton Handbook for Reproducible Neuroimaging. The handbook provides guidelines for BIDS conversion and execution of BIDS apps (e.g., fMRIPrep, MRIQC). The brain data are contributed by author S.A.N. and are authorized for non-anonymized distribution.
We present simulation results from a study with The Virtual Brain (TVB). Structural, functional and simulated data have been prepared in accordance with Brain Imaging Data Structure (BIDS) standards and annotated according to the openMINDS metadata framework. Simultaneous electroencephalography (EEG) - functional magnetic resonance imaging (fMRI) resting-state data, diffusion-weighted MRI (dwMRI), and structural MRI were acquired for 50 healthy adult subjects (18-80 years of age, mean 41.24±18.33; 31 females, 19 males) at the Berlin Center for Advanced Imaging, Charité University Medicine, Berlin, Germany. We constructed personalized models from these multimodal data of 50 healthy individuals with TVB. We calculated the optimal parameters on an individual basis that predict multiple empirical features in fMRI and EEG, e.g. dynamic functional connectivity and bimodality in the alpha band power, and analyzed inter-individual differences with respect to optimized parameters and structural as well as functional connectivity in a previous study (Triebkorn et al. 2024). We present this large, comprehensive empirical and simulated dataset in an annotated and structured format following the BIDS Extension Proposal for computational modeling data. We describe how we processed and converted the diverse data sources to make them reusable. In its current form, this dataset can be reused for further research and provides ready-to-use data at various levels of processing, including the brain simulation results inferred from them, for a large cohort of healthy subjects with a wide age range.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains unprocessed functional MRI (fMRI) data acquired in common marmosets (Callithrix jacchus). The data were obtained during a continuous infusion of the sedative medetomidine, supplemented with a low concentration of isoflurane. All experiments were carried out in accordance with the guidelines of Directive 2010/63/EU of the European Parliament on the protection of animals used for scientific purposes.
Related paper
This dataset supplements the following manuscript.
Preserving functional network structure under anesthesia in the marmoset monkey brain
M Ortiz-Rios, N Sirmpilatze, J Koenig, S Boretius - bioRxiv, 2023
doi: https://doi.org/10.1101/2023.11.21.568138
Data structure
The main data files are organized into eight zipped folders (sub-02.zip through sub-09.zip), each constituting a dataset formatted according to the Brain Imaging Data Structure specifications (BIDS v1.6.0).
BIDS-formatted datasets
The basic characteristics of the datasets are given below. More details can be found in the preprint.
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Synopsis
This is the GX dataset formatted to comply with the BIDS standard.
The tES/EEG/CTT/Vigilance experiment contains 19 unique participants (some repeated experiments). Over a 70 min period, EEG/ECG/EOG were recorded concurrently with a CTT in which participants maintained a ball at the center of the screen and were periodically stimulated (with low-intensity noninvasive brain stimulation) for 30 s with combinations of 9 stimulation montages.
For the raw data please see: https://zenodo.org/record/4456079
For methodological details please see corresponding article titled: Dataset of concurrent EEG, ECG, and behavior with multiple doses of transcranial Electrical Stimulation
Data Descriptor Abstract
We present a dataset combining human-participant high-density electroencephalography (EEG) with physiological and continuous behavioral metrics during transcranial electrical stimulation (tES). Data include within-participant application of nine High-Definition tES (HD-tES) types, targeting three cortical regions (frontal, motor, parietal) with three stimulation waveforms (DC, 5 Hz, 30 Hz); more than 783 total stimulation trials over 62 sessions with EEG, physiological (ECG, EOG), and continuous behavioral vigilance/alertness metrics. Experiments 1 and 2 consisted of participants performing a continuous vigilance/alertness task over three 70-minute and two 70.5-minute sessions, respectively. Demographic data were collected, as well as self-reported wellness questionnaires before and after each session. Participants received all 9 stimulation types in Experiment 1, with each session including three stimulation types and 4 trials per type. Participants received 2 stimulation types in Experiment 2, with 20 trials of a given stimulation type per session. Within-participant reliability was tested by repeating select sessions. This unique dataset supports a range of hypothesis testing, including interactions of tDCS/tACS location and frequency, brain state, physiology, fatigue, and cognitive performance.
For more details please see the full data descriptor article.
Code used to import and process this dataset can be found here: GitHub : https://github.com/ngebodh/GX_tES_EEG_Physio_Behavior
For downsampled data please see: Experiment 1 : https://doi.org/10.5281/zenodo.3840615 Experiment 2 : https://doi.org/10.5281/zenodo.3840617
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
DANDI is a public archive of neurophysiology datasets, including raw and processed data and associated software containers. Datasets are shared under Creative Commons CC0 or CC BY licenses. The archive provides a broad range of cellular neurophysiology data, including electrode and optical recordings and associated imaging data, using a set of community standards: NWB:N (Neurodata Without Borders: Neurophysiology), BIDS (Brain Imaging Data Structure), and NIDM (Neuro Imaging Data Model). Development of DANDI is supported by the National Institute of Mental Health.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The recent and growing focus on reproducibility in neuroimaging studies has led many major academic centers to use cloud-based imaging databases for storing, analyzing, and sharing complex imaging data. Flywheel is one such database platform that offers easily accessible, large-scale data management, along with a framework for reproducible analyses through containerized pipelines. The Brain Imaging Data Structure (BIDS) is the de facto standard for neuroimaging data, but curating neuroimaging data into BIDS can be a challenging and time-consuming task. In particular, standard solutions for BIDS curation are limited on Flywheel. To address these challenges, we developed “FlywheelTools,” a software toolbox for reproducible data curation and manipulation on Flywheel. FlywheelTools includes two elements: fw-heudiconv, for heuristic-driven curation of data into BIDS, and flaudit, which audits and inventories projects on Flywheel. Together, these tools accelerate reproducible neuroscience research on the widely used Flywheel platform.
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset was acquired using various EPI protocols across multiple subjects, sites, and MRI vendors and models, in order to develop a method to automate the time-consuming segmentation of the spinal cord for fMRI. The list of subjects is available in participants.tsv.
This dataset follows the BIDS convention. The contributors have the necessary ethics & permissions to share the data publicly.
The dataset does not include any identifiable personal health information, such as names, zip codes, dates of birth, or facial features.
Each participant's data is in one subdirectory, which contains the mean of motion-corrected volumes (the mean image that was used to draw the spinal cord mask) as well as the associated metadata. Spinal cord masks that were generated based on mean of motion-corrected volumes are found under derivatives/label/sub-subjectID/sub-subjectID_task-rest_desc-spinalcordmask.nii.gz.
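As a small illustration, the mask path described above can be assembled for a given participant like this (the dataset root and subject ID below are hypothetical placeholders, not values from this dataset):

```python
# Sketch: build the derivatives path for a subject's spinal cord mask,
# following the pattern derivatives/label/sub-<ID>/sub-<ID>_task-rest_desc-spinalcordmask.nii.gz
from pathlib import Path

def spinalcord_mask_path(dataset_root, subject_id):
    """Return the expected mask location for one subject."""
    sub = f"sub-{subject_id}"
    return (Path(dataset_root) / "derivatives" / "label" / sub /
            f"{sub}_task-rest_desc-spinalcordmask.nii.gz")

# Hypothetical root and subject ID, for demonstration only.
p = spinalcord_mask_path("/data/spine-fmri", "amu01")
print(p)
```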
If you reference this dataset in your publications, please cite the following publication: Link to be added. Should you have any questions about this data set, please contact mkaptan@stanford.edu and banerjee.rohan98@gmail.com
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset contains fMRI, behavioural and eye-tracking data from an experiment testing the involvement of Medial Temporal Lobe (MTL) subregions in category-specific associative inference in memory. The dataset follows the Brain Imaging Data Structure (BIDS) format.
├── dataset_description.json
├── participants.json
├── participants.tsv
├── README.md
├── derivatives/
│ ├── AIMDeconvolveOutput
│ │ └── sub-[subjectID]/
│ ├── LocalDeconvolveOutput
│ │ └── sub-[subjectID]/
│ └── ROIs
│ └── sub-[subjectID]/
└── sub-[subjectID]/
├── anat/
│ ├── sub-[subjectID]_T1w.nii.gz
│ └── sub-[subjectID]_T2w.nii.gz
└── func/
├── sub-[subjectID]_task-aim_run-[01-08]_bold.nii.gz
├── sub-[subjectID]_task-aim_run-[01-08]_bold.json
├── sub-[subjectID]_task-local_bold.nii.gz
├── sub-[subjectID]_task-local_bold.json
├── sub-[subjectID]_task-aim_events.tsv
├── sub-[subjectID]_task-aim_events.json
├── sub-[subjectID]_task-local_events.tsv
├── sub-[subjectID]_task-local_events.json
├── sub-[subjectID]_task-aim_eyetracking.tsv
└── sub-[subjectID]_task-aim_eyetracking.json
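The filenames in the tree above follow the BIDS entity-value convention (sub-XX_task-YY_run-ZZ_suffix.ext). A minimal standard-library sketch of how such names can be parsed (a toy illustration only; libraries such as pybids do this robustly):

```python
# Sketch: split a BIDS filename into its entity-value pairs plus the suffix.
def parse_bids_name(filename):
    """Return a dict of BIDS entities (sub, task, run, ...) and the suffix."""
    name = filename.split(".")[0]            # strip extensions (.nii.gz, .tsv, ...)
    parts = name.split("_")
    entities = dict(p.split("-", 1) for p in parts[:-1] if "-" in p)
    entities["suffix"] = parts[-1]           # e.g. bold, events, eyetracking
    return entities

print(parse_bids_name("sub-07_task-aim_run-03_bold.nii.gz"))
# {'sub': '07', 'task': 'aim', 'run': '03', 'suffix': 'bold'}
```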
Associative Inference Memory (AIM) Task
Local Task
AIMDeconvolveOutput
LocalDeconvolveOutput
ROIs
antsRegistrationSynQuick in ANTS
antsApplyTransform.sh in ANTS
3dWarp in AFNI
3dAllineate in AFNI
dataset_description.json: Dataset-level metadata
participants.tsv: Participant information
participants.json: Description of participant-level variables
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains unprocessed task-free functional MRI (fMRI) data acquired in three different mammalian species: long-tailed macaques (Macaca fascicularis), common marmosets (Callithrix jacchus), and rats (Rattus norvegicus, Wistar strain). The data were obtained during isoflurane anesthesia, with the animals intubated and mechanically ventilated. All experiments were carried out in accordance with the guidelines of Directive 2010/63/EU of the European Parliament on the protection of animals used for scientific purposes.
Related paper
This dataset supplements the following preprint:
Sirmpilatze N, Mylius J, Ortiz-Rios M, Baudewig J, Paasonen J, Golkowski D, Ranft A, Ilg R, Gröhn O, Boretius S. 2021. Spatial signatures of anesthesia-induced burst-suppression differ between primates and rodents. bioRxiv. doi:10.1101/2021.10.15.464515
Data structure
The main data files are organized into four zipped folders (Macaque.zip, Marmoset.zip, Rat1.zip, Rat2.zip), each constituting a dataset formatted according to the Brain Imaging Data Structure specifications (BIDS v1.6.0).
BIDS-formatted datasets
The basic characteristics of the datasets are given below. More details can be found in the preprint.
Example data
Before you commit to downloading the BIDS-formatted datasets, we encourage you to examine the example data that we provide in the root folder. These include one anatomical (structural MRI) and one functional (fMRI) scan from each of the four datasets (Rat2 contains functional scans only), with their respective .json sidecars. A preview of these example scans is provided by 0_preview.pdf.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Neuroscience studies entail the generation of massive collections of heterogeneous data (e.g. demographics, clinical records, medical images). Integration and analysis of such data in research centers is pivotal for elucidating disease mechanisms and improving clinical outcomes. However, data collection in clinics often relies on non-standardized methods, such as paper-based documentation. Moreover, diverse data types are collected in different departments, hindering efficient data organization, secure sharing and compliance with the FAIR (Findable, Accessible, Interoperable, Reusable) principles. In this manuscript we therefore present a specialized data management system designed to enhance research workflows in Deep Brain Stimulation (DBS), a state-of-the-art neurosurgical procedure employed to treat symptoms of movement and psychiatric disorders. The system leverages REDCap to promote accurate data capture in hospital settings and secure sharing with research institutes, the Brain Imaging Data Structure (BIDS) as the image storage standard, and a DBS-specific SQLite database as a comprehensive data store and unified interface to all data types. A self-developed Python tool automates the data flow between these three components, ensuring their full interoperability. The proposed framework has already been successfully employed for capturing and analyzing data from 107 patients at 2 medical institutions. It effectively addresses the challenges of managing, sharing and retrieving diverse data types, fostering advancements in data quality, organization, analysis, and collaboration among medical and research institutions.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
GX Dataset downsampled - Experiment 1
The GX Dataset is a dataset of combined tES, EEG, physiological, and behavioral signals from human subjects.
Here the GX Dataset for Experiment 1 is downsampled to 1 kHz and saved in .MAT format, which can be used in both MATLAB and Python.
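As a sketch of the Python side of that workflow, scipy.io can round-trip MATLAB-style .mat files. The variable names below ("EEG", "fs") are hypothetical and not the GX dataset's actual field names; consult the data descriptor for the real file structure.

```python
# Toy round-trip of a "downsampled EEG" array through a .mat file.
# Field names are illustrative placeholders, not the GX dataset's own.
import tempfile
from pathlib import Path

import numpy as np
from scipy.io import loadmat, savemat

fs = 1000                           # Hz, the downsampled rate mentioned above
eeg = np.random.randn(32, fs * 2)   # 32 channels, 2 s of toy data

path = Path(tempfile.mkdtemp()) / "example.mat"
savemat(str(path), {"EEG": eeg, "fs": fs})

data = loadmat(str(path))           # also readable in MATLAB via load()
print(data["EEG"].shape)            # (32, 2000)
```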
Publication
A full data descriptor is published in Nature Scientific Data. Please cite this work as:
Gebodh, N., Esmaeilpour, Z., Datta, A. et al. Dataset of concurrent EEG, ECG, and behavior with multiple doses of transcranial electrical stimulation. Sci Data 8, 274 (2021). https://doi.org/10.1038/s41597-021-01046-y
Descriptions
A dataset combining high-density electroencephalography (EEG) with physiological and continuous behavioral metrics during transcranial electrical stimulation (tES). Data include within-subject application of nine High-Definition tES (HD-tES) types targeting three brain regions (frontal, motor, parietal) with three waveforms (DC, 5 Hz, 30 Hz), with more than 783 total stimulation trials over 62 sessions with EEG, physiological (ECG, EOG), and continuous behavioral vigilance/alertness metrics.
Acknowledgments
Portions of this study were funded by X (formerly Google X), the Moonshot Factory. The funding source had no influence on the conduct of the study or the evaluation of results. MB is further supported by grants from the National Institutes of Health: R01NS101362, R01NS095123, R01NS112996, R01MH111896, R01MH109289, and (to NG) NIH-G-RISE T32GM136499.
Extras
Back to Full GX Dataset : https://doi.org/10.5281/zenodo.4456079
For downsampled data (1 kHz) please see (in .mat format):
Code used to import, process, and plot this dataset can be found here:
Additional figures for this project have been shared on Figshare. Trial-wise figures can be found here:
The full dataset is also provided in BIDS format here:
Data License
Creative Commons Attribution 4.0 (CC BY 4.0)
NOTE
Please email ngebodh01@citymail.cuny.edu with any questions.
Updates
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Abstract: Influential accounts claim that violent video games (VVG) decrease players' emotional empathy by desensitizing them to both virtual and real-life violence. However, scientific evidence for this claim is inconclusive and controversially debated. To assess the causal effect of VVGs on the behavioral and neural correlates of empathy and emotional reactivity to violence, we conducted a prospective experimental study using functional magnetic resonance imaging (fMRI). We recruited eighty-nine male participants without prior VVG experience. Over the course of two weeks, participants played either a highly violent video game, or a non-violent version of the same game. Before and after this period, participants completed an fMRI experiment with paradigms measuring their empathy for pain and emotional reactivity to violent images. Applying a Bayesian analysis approach throughout enabled us to find substantial evidence for the absence of an effect of VVGs on the behavioral and neural correlates of empathy. Moreover, participants in the VVG group were not desensitized to images of real-world violence. These results imply that short and controlled exposure to VVGs does not numb empathy or responses to real-world violence. We discuss the implications of our findings regarding the potential and limitations of experimental research on the causal effects of VVGs. While VVGs might not have a discernible effect on the investigated subpopulation within our carefully controlled experimental setting, our results cannot preclude that effects could be found in special vulnerable subpopulations, or in settings with higher ecological validity. Dataset: This dataset contains the fMRI data collected for the study in the BIDS format (https://bids.neuroimaging.io/):
functional neuroimaging (*_bold.nii.gz) data of 89 human participants, collected during two tasks:
Empathy-for-Pain paradigm (Sessions 1 & 2)
Emotional Reactivity paradigm (Session 2)
associated event files (*_events.tsv) containing event onsets, durations, and behavioral covariates
metadata
fMRI BOLD timeseries are fully preprocessed, as described in the manuscript. Additional data, such as behavioral data in a simpler format, can be accessed at https://osf.io/yx423/
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
List of currently available BIDS Apps.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Neuroscience studies require considerable bioinformatic support and expertise. Numerous high-dimensional and multimodal datasets must be preprocessed and integrated to create robust and reproducible analysis pipelines. We describe common data elements and a scalable data management infrastructure that allows multiple analytics workflows to facilitate preprocessing, analysis and sharing of large-scale multi-level data. The process uses the Brain Imaging Data Structure (BIDS) format and supports MRI, fMRI, EEG, clinical, and laboratory data. The infrastructure provides support for other datasets such as Fitbit data and flexibility for developers to customize the integration of new data types. Exemplar results from 200+ participants and 11 different pipelines demonstrate the utility of the infrastructure.
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset was obtained from The Open MEG Archive (OMEGA, https://omega.bic.mni.mcgill.ca).
You are free to use all data in OMEGA for research purposes; please acknowledge its authors and cite the following reference in your publications if you have used data from OMEGA:
Niso G., Rogers C., Moreau J.T., Chen L.Y., Madjar C., Das S., Bock E., Tadel F., Evans A.C., Jolicoeur P., Baillet S. (2016). OMEGA: The Open MEG Archive. NeuroImage 124, 1182-1187. doi: https://doi.org/10.1016/j.neuroimage.2015.04.028. OMEGA is available at: https://omega.bic.mni.mcgill.ca
Experiment
MEG acquisition
Head shape and fiducial points
Subject anatomy
BIDS
The data in this dataset have been organized according to the MEG-BIDS specification (Brain Imaging Data Structure, http://bids.neuroimaging.io) (Niso et al. 2018).
Niso G., Gorgolewski K.J., Bock E., Brooks T.L., Flandin G., Gramfort A., Henson R.N., Jas M., Litvak V., Moreau J., Oostenveld R., Schoffelen J.M., Tadel F., Wexler J., Baillet S. (2018). MEG-BIDS: an extension to the Brain Imaging Data Structure for magnetoencephalography. Scientific Data; 5, 180110. https://doi.org/10.1038/sdata.2018.110
Release history: