51 datasets found
  1. MEG-BIDS OMEGA RestingState_sample

    • openneuro.org
    Updated Apr 24, 2024
    Cite
    Guiomar Niso; Jeremy Moreau; Elizabeth Bock; Francois Tadel; Sylvain Baillet (2024). MEG-BIDS OMEGA RestingState_sample [Dataset]. http://doi.org/10.18112/openneuro.ds000247.v1.0.2
    Dataset updated
    Apr 24, 2024
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Guiomar Niso; Jeremy Moreau; Elizabeth Bock; Francois Tadel; Sylvain Baillet
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    OMEGA - Resting State Sample Dataset

    License

    • This dataset was obtained from The Open MEG Archive (OMEGA, https://omega.bic.mni.mcgill.ca).

    • You are free to use all data in OMEGA for research purposes; please acknowledge its authors and cite the following reference in your publications if you have used data from OMEGA:

    • Niso G., Rogers C., Moreau J.T., Chen L.Y., Madjar C., Das S., Bock E., Tadel F., Evans A.C., Jolicoeur P., Baillet S. (2016). OMEGA: The Open MEG Archive. NeuroImage 124, 1182-1187. doi: https://doi.org/10.1016/j.neuroimage.2015.04.028. OMEGA is available at: https://omega.bic.mni.mcgill.ca

    Description

    Experiment

    • 5 subjects x 5 minute resting sessions, eyes open

    MEG acquisition

    • Recorded at the Montreal Neurological Institute in 2012-2016
    • Acquisition with a CTF 275 MEG system at a 2400 Hz sampling rate
    • Anti-aliasing low-pass filter at 600 Hz; files may be saved with or without the CTF 3rd-order gradient compensation
    • Recorded channels (at least 297) include:
      • 26 MEG reference sensors (#2-#27)
      • 270 MEG axial gradiometers (#28-#297)
      • 1 ECG bipolar (EEG057/#298) - Not available in the empty room recordings
      • 1 vertical EOG bipolar (EEG058/#299) - Not available in the empty room recordings
      • 1 horizontal EOG bipolar (EEG059/#300) - Not available in the empty room recordings

    Head shape and fiducial points

    • 3D digitization using a Polhemus Fastrak device driven by Brainstorm. The .pos files contain:
      • The center of the CTF coils
      • The anatomical references we use in Brainstorm: nasion and ears as illustrated here
      • Around 100 head points distributed on the hard parts of the head (no soft tissues).

    Subject anatomy

    • Structural T1 image (defaced for anonymization purposes)
    • Processed with FreeSurfer 5.3
    • The anatomical fiducials (NAS, LPA, RPA) have already been marked and saved in the files fiducials.m

    BIDS

    • The data in this dataset has been organized according to the MEG-BIDS specification (Brain Imaging Data Structure, http://bids.neuroimaging.io) (Niso et al. 2018)

    • Niso G., Gorgolewski K.J., Bock E., Brooks T.L., Flandin G., Gramfort A., Henson R.N., Jas M., Litvak V., Moreau J., Oostenveld R., Schoffelen J.M., Tadel F., Wexler J., Baillet S. (2018). MEG-BIDS: an extension to the Brain Imaging Data Structure for magnetoencephalography. Scientific Data; 5, 180110. https://doi.org/10.1038/sdata.2018.110

    Release history:

    • 2016-12-01: initial release
    • 2018-07-18: release OpenNeuro ds000247 (00001 and 00002)
  2. Princeton Handbook for Reproducible Neuroimaging: Sample Output

    • zenodo.org
    • data.niaid.nih.gov
    application/gzip
    Updated Dec 10, 2020
    Cite
    Elizabeth A. McDevitt; Anne C. Mennen; Paula P. Brooks; Samuel A. Nastase (2020). Princeton Handbook for Reproducible Neuroimaging: Sample Output [Dataset]. http://doi.org/10.5281/zenodo.3727775
    Dataset updated
    Dec 10, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Elizabeth A. McDevitt; Anne C. Mennen; Paula P. Brooks; Samuel A. Nastase
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    This archive contains sample output files for the sample data accompanying the Princeton Handbook for Reproducible Neuroimaging. Outputs include the NIfTI images converted using HeuDiConv (v0.5.dev1) and organized according to the BIDS standard, quality control evaluation using MRIQC (v0.10.4), data preprocessed using fMRIPrep (v1.4.1rc1), and other auxiliary files. All outputs were created according to the procedures outlined in the handbook and are intended to serve as a didactic reference for use with the handbook. The sample data from which the outputs are derived were acquired (with informed consent) using the ReproIn naming convention on a Siemens Skyra 3T MRI scanner. The sample data include a T1-weighted anatomical image, four functional runs with the “prettymouth” spoken story stimulus, and one functional run with a block-design emotional faces task, as well as auxiliary scans (e.g., scout, soundcheck). The “prettymouth” story stimulus was created by Yeshurun et al. (2017) and is available as part of the Narratives collection; the emotional faces task is similar to Chai et al. (2015). The brain data are contributed by author S.A.N. and are authorized for non-anonymized distribution.

  3. Longitudinal UKE multimodal magnetic resonance imaging data for ischemic...

    • search.kg.ebrains.eu
    Updated Mar 16, 2025
    Cite
    Patrik Bey; Kiret Dhindsa; Amrit Kashyap; Michael Schirner; Jan Feldheim; Marlene Bönstrup; Robert Schulz; Bastian Cheng; Götz Thomalla; Christian Gerloff; Petra Ritter (2025). Longitudinal UKE multimodal magnetic resonance imaging data for ischemic stroke patients and healthy controls prepared as BIDS & openMINDS data set by CHARITÉ [Dataset]. http://doi.org/10.25493/3VPN-ZY7
    Dataset updated
    Mar 16, 2025
    Authors
    Patrik Bey; Kiret Dhindsa; Amrit Kashyap; Michael Schirner; Jan Feldheim; Marlene Bönstrup; Robert Schulz; Bastian Cheng; Götz Thomalla; Christian Gerloff; Petra Ritter
    Description

    The underlying mechanisms of recovery of motor function after stroke are an important study target to enable individual rehabilitation strategies and improve motor outcome of patients. To this end longitudinal studies of stroke patients are crucial to further increase our understanding of such mechanisms. The present data set provides longitudinal magnetic resonance imaging data of 36 ischemic stroke patients and 15 healthy controls. Data were acquired at 3-5, 30-40, 85-95 and 340-380 days post stroke onset for the patient group. The data set was brought into BIDS structure and annotated following the openMINDS standard to prepare it for the automated preprocessing pipeline tailored for stroke data (Bey et al. 2024).

  4. Reading hyperlinks

    • zenodo.org
    zip
    Updated Jan 24, 2020
    Cite
    Benjamin Gagl (2020). Reading hyperlinks [Dataset]. http://doi.org/10.5281/zenodo.1219677
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Benjamin Gagl
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    This is an eye-tracking dataset in the BIDS format (http://bids.neuroimaging.io/).

    Please find the details on the study here:

    Gagl B. (2016) Blue hypertext is a good design decision: no perceptual disadvantage in reading and successful highlighting of relevant information. PeerJ 4:e2467 https://doi.org/10.7717/peerj.2467

  5. Structural and functional connectomes and region-average fMRI from 50...

    • search.kg.ebrains.eu
    Updated Mar 17, 2025
    + more versions
    Cite
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter (2025). Structural and functional connectomes and region-average fMRI from 50 healthy participants, age range 18-80 years [Dataset]. http://doi.org/10.25493/6CKF-MJS
    Dataset updated
    Mar 17, 2025
    Authors
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter
    Description

    We present processed multimodal empirical data from a study with The Virtual Brain (TVB). Structural and functional data have been prepared in accordance with Brain Imaging Data Structure (BIDS) standards and annotated according to the openMINDS metadata framework. Simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) resting-state data, diffusion-weighted MRI (dwMRI), and structural MRI were acquired for 50 healthy adult subjects (18-80 years of age, mean 41.24±18.33; 31 females, 19 males) at the Berlin Center for Advanced Imaging, Charité University Medicine, Berlin, Germany. We constructed personalized models from these multimodal data of 50 healthy individuals with TVB in a previous study (Triebkorn et al. 2024). We present this large, comprehensive processed data set in an annotated and structured format following BIDS standards for derivatives of MRI and the BIDS Extension Proposal for computational modeling data. We describe how we processed and converted the diverse data sources to make them reusable. In its current form, this dataset can be reused for further research and provides ready-to-use data at various levels of processing for a large cohort of healthy subjects with a wide age range.

  6. Runabout: A mobile EEG study of auditory oddball processing in laboratory...

    • openneuro.org
    Updated Nov 9, 2021
    Cite
    Magnus Liebherr; Andrew W. Corcoran; Phillip M. Alday; Scott Coussens; Valeria Bellan; Caitlin A. Howlett; Maarten A. Immink; Mark Kohler; Matthias Schlesewsky; Ina Bornkessel-Schlesewsky (2021). Runabout: A mobile EEG study of auditory oddball processing in laboratory and real-world conditions [Dataset]. http://doi.org/10.18112/openneuro.ds003620.v1.1.1
    Dataset updated
    Nov 9, 2021
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Magnus Liebherr; Andrew W. Corcoran; Phillip M. Alday; Scott Coussens; Valeria Bellan; Caitlin A. Howlett; Maarten A. Immink; Mark Kohler; Matthias Schlesewsky; Ina Bornkessel-Schlesewsky
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Overview

    This dataset contains raw and pre-processed EEG data from a mobile EEG study investigating the effects of cognitive task demands, motor demands, and environmental complexity on attentional processing (see below for experiment details).

    All preprocessing and analysis code is deposited in the code directory. The entire MATLAB pipeline can be reproduced by executing the run_pipeline.m script. In order to run these scripts, you will need to ensure you have the required MATLAB toolboxes and R packages on your system. You will also need to adapt def_local.m to specify local paths to MATLAB and EEGLAB. Descriptive statistics and mixed-effects models can be reproduced in R by running the stat_analysis.R script.

    See below for software details.

    Citing this dataset

    In addition to citing this dataset, please cite the original manuscript reporting data collection and experimental procedures. For more information, see the dataset_description.json file.

    License

    ODC Open Database License (ODbL). For more information, see the LICENCE file.

    Format

    Dataset is formatted according to the EEG-BIDS extension (Pernet et al., 2019) and the BIDS extension proposal for common electrophysiological derivatives (BEP021) v0.0.1, which can be found here:

    https://docs.google.com/document/d/1PmcVs7vg7Th-cGC-UrX8rAhKUHIzOI-uIOh69_mvdlw/edit#heading=h.mqkmyp254xh6

    Note that BEP021 is still a work in progress as of 2021-03-01.

    Generally, you can find data in the .tsv files and descriptions in the accompanying .json files.

    An important BIDS definition to consider is the "Inheritance Principle" (see 3.5 in the BIDS specification: http://bids.neuroimaging.io/bids_spec.pdf), which states:

    Any metadata file (.json, .bvec, .tsv, etc.) may be defined at any directory level. The values from the top level are inherited by all lower levels unless they are overridden by a file at the lower level.
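
    As a rough illustration (not part of this dataset), the principle can be sketched in Python: sidecar JSON files are merged from the dataset root downward, with lower-level keys overriding inherited higher-level ones. The file names below are hypothetical; real tooling such as pybids implements the full inheritance rules.

```python
import json
from pathlib import Path

def resolve_sidecar(root, candidates):
    """Merge JSON sidecars from the dataset root down to the data file,
    per the BIDS Inheritance Principle: values defined at the top level
    are inherited unless overridden by a lower-level file. `candidates`
    is an ordered list of relative sidecar paths, top level first."""
    merged = {}
    for rel in candidates:
        sidecar = Path(root) / rel
        if sidecar.exists():
            # Lower-level keys overwrite values inherited from above.
            merged.update(json.loads(sidecar.read_text()))
    return merged
```

    For example, a `SamplingFrequency` defined in a subject-level sidecar would override the dataset-level value, while keys defined only at the top level still apply.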

    Details about the experiment

    Forty-four healthy adults aged 18-40 performed an oddball task involving complex tone (piano and horn) stimuli in three settings: (1) sitting in a quiet room in the lab (LAB); (2) walking around a sports field (FIELD); (3) navigating a route through a university campus (CAMPUS).

    Participants performed each environmental condition twice: once while attending to oddball stimuli (i.e. counting the number of presented deviant tones; COUNT), and once while disregarding or ignoring the tone stimuli (IGNORE).

    EEG signals were recorded from 32 active electrodes using a Brain Vision LiveAmp 32 amplifier. See manuscript for further details.

    MATLAB software details

    MATLAB Version: 9.7.0.1319299 (R2019b) Update 5
    MATLAB License Number: 678256
    Operating System: Microsoft Windows 10 Enterprise Version 10.0 (Build 18363)
    Java Version: Java 1.8.0_202-b08 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode

    • MATLAB (v9.7)
    • Simulink (v10.0)
    • Curve Fitting Toolbox (v3.5.10)
    • DSP System Toolbox (v9.9)
    • Image Processing Toolbox (v11.0)
    • MATLAB Compiler (v7.1)
    • MATLAB Compiler SDK (v6.7)
    • Parallel Computing Toolbox (v7.1)
    • Signal Processing Toolbox (v8.3)
    • Statistics and Machine Learning Toolbox (v11.6)
    • Symbolic Math Toolbox (v8.4)
    • Wavelet Toolbox (v5.3)

    The following toolboxes/helper functions were also used:

    • EEGLAB (v2019.1)
    • ERPLAB (v8.10)
    • ICLabel (v1.3)
    • clean_rawdata (v2.3)
    • bids-matlab-tools (v5.2)
    • dipfit (v3.4)
    • firfilt (v2.4)
    • export_fig (v3.12)
    • ColorBrewer (v3.1.0)

    R software details

    R version 3.6.2 (2019-12-12)

    Platform: x86_64-w64-mingw32/x64 (64-bit)

    locale: LC_COLLATE=English_Australia.1252, LC_CTYPE=English_Australia.1252, LC_MONETARY=English_Australia.1252, LC_NUMERIC=C, LC_TIME=English_Australia.1252

    attached base packages:

    • stats
    • graphics
    • grDevices
    • utils
    • datasets
    • methods
    • base

    other attached packages:

    • sjPlot(v.2.8.7)
    • emmeans(v.1.5.1)
    • car(v.3.0-10)
    • carData(v.3.0-4)
    • lme4(v.1.1-23)
    • Matrix(v.1.2-18)
    • data.table(v.1.13.0)
    • forcats(v.0.5.0)
    • stringr(v.1.4.0)
    • dplyr(v.1.0.2)
    • purrr(v.0.3.4)
    • readr(v.1.4.0)
    • tidyr(v.1.1.2)
    • tibble(v.3.0.4)
    • ggplot2(v.3.3.2)
    • tidyverse(v.1.3.0)

    loaded via a namespace (and not attached):

    • nlme(v.3.1-149)
    • pbkrtest(v.0.4-8.6)
    • fs(v.1.5.0)
    • lubridate(v.1.7.9)
    • insight(v.0.12.0)
    • httr(v.1.4.2)
    • numDeriv(v.2016.8-1.1)
    • tools(v.3.6.2)
    • backports(v.1.1.10)
    • utf8(v.1.1.4)
    • R6(v.2.4.1)
    • sjlabelled(v.1.1.7)
    • DBI(v.1.1.0)
    • colorspace(v.1.4-1)
    • withr(v.2.3.0)
    • tidyselect(v.1.1.0)
    • curl(v.4.3)
    • compiler(v.3.6.2)
    • performance(v.0.5.0)
    • cli(v.2.1.0)
    • rvest(v.0.3.6)
    • xml2(v.1.3.2)
    • sandwich(v.3.0-0)
    • labeling(v.0.3)
    • bayestestR(v.0.7.2)
    • scales(v.1.1.1)
    • mvtnorm(v.1.1-1)
    • digest(v.0.6.25)
    • foreign(v.0.8-76)
    • minqa(v.1.2.4)
    • rio(v.0.5.16)
    • pkgconfig(v.2.0.3)
    • dbplyr(v.1.4.4)
    • rlang(v.0.4.8)
    • readxl(v.1.3.1)
    • rstudioapi(v.0.11)
    • farver(v.2.0.3)
    • generics(v.0.0.2)
    • zoo(v.1.8-8)
    • jsonlite(v.1.7.1)
    • zip(v.2.1.1)
    • magrittr(v.1.5)
    • parameters(v.0.8.6)
    • Rcpp(v.1.0.5)
    • munsell(v.0.5.0)
    • fansi(v.0.4.1)
    • abind(v.1.4-5)
    • lifecycle(v.0.2.0)
    • stringi(v.1.4.6)
    • multcomp(v.1.4-14)
    • MASS(v.7.3-53)
    • plyr(v.1.8.6)
    • grid(v.3.6.2)
    • blob(v.1.2.1)
    • parallel(v.3.6.2)
    • sjmisc(v.2.8.6)
    • crayon(v.1.3.4)
    • lattice(v.0.20-41)
    • ggeffects(v.0.16.0)
    • haven(v.2.3.1)
    • splines(v.3.6.2)
    • pander(v.0.6.3)
    • sjstats(v.0.18.1)
    • hms(v.0.5.3)
    • knitr(v.1.30)
    • pillar(v.1.4.6)
    • boot(v.1.3-25)
    • estimability(v.1.3)
    • effectsize(v.0.3.3)
    • codetools(v.0.2-16)
    • reprex(v.0.3.0)
    • glue(v.1.4.2)
    • modelr(v.0.1.8)
    • vctrs(v.0.3.4)
    • nloptr(v.1.2.2.2)
    • cellranger(v.1.1.0)
    • gtable(v.0.3.0)
    • assertthat(v.0.2.1)
    • xfun(v.0.18)
    • openxlsx(v.4.2.2)
    • xtable(v.1.8-4)
    • broom(v.0.7.1)
    • coda(v.0.19-4)
    • survival(v.3.2-7)
    • lmerTest(v.3.1-3)
    • statmod(v.1.4.34)
    • TH.data(v.1.0-10)
    • ellipsis(v.0.3.1)
  7. Princeton Handbook for Reproducible Neuroimaging: Sample Data

    • explore.openaire.eu
    • data.niaid.nih.gov
    • +1 more
    Updated Feb 20, 2020
    Cite
    Samuel A. Nastase; Anne C. Mennen; Paula P. Brooks; Elizabeth A. McDevitt (2020). Princeton Handbook for Reproducible Neuroimaging: Sample Data [Dataset]. http://doi.org/10.5281/zenodo.3677090
    Dataset updated
    Feb 20, 2020
    Authors
    Samuel A. Nastase; Anne C. Mennen; Paula P. Brooks; Elizabeth A. McDevitt
    Description

    This archive contains a raw DICOM dataset acquired (with informed consent) using the ReproIn naming convention on a Siemens Skyra 3T MRI scanner. The dataset includes a T1-weighted anatomical image, four functional runs with the “prettymouth” spoken story stimulus, and one functional run with a block-design emotional faces task, as well as auxiliary scans (e.g., scout, soundcheck). The “prettymouth” story stimulus was created by Yeshurun et al. (2017) and is available as part of the Narratives collection; the emotional faces task is similar to Chai et al. (2015). These data are intended for use with the Princeton Handbook for Reproducible Neuroimaging. The handbook provides guidelines for BIDS conversion and execution of BIDS apps (e.g., fMRIPrep, MRIQC). The brain data are contributed by author S.A.N. and are authorized for non-anonymized distribution.

  8. Brain network simulations derived from fMRI and structural MRI from 50...

    • search.kg.ebrains.eu
    Updated Mar 17, 2025
    Cite
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter (2025). Brain network simulations derived from fMRI and structural MRI from 50 healthy participants, age range 18-80 years [Dataset]. http://doi.org/10.25493/R7DJ-3NQ
    Dataset updated
    Mar 17, 2025
    Authors
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter
    Description

    We present simulation results from a study with The Virtual Brain (TVB). Structural, functional, and simulated data have been prepared in accordance with Brain Imaging Data Structure (BIDS) standards and annotated according to the openMINDS metadata framework. Simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) resting-state data, diffusion-weighted MRI (dwMRI), and structural MRI were acquired for 50 healthy adult subjects (18-80 years of age, mean 41.24±18.33; 31 females, 19 males) at the Berlin Center for Advanced Imaging, Charité University Medicine, Berlin, Germany. We constructed personalized models from these multimodal data of 50 healthy individuals with TVB. We calculated the optimal parameters on an individual basis that predict multiple empirical features in fMRI and EEG (e.g. dynamic functional connectivity and bimodality in the alpha band power), and analyzed inter-individual differences with respect to optimized parameters and structural as well as functional connectivity in a previous study (Triebkorn et al. 2024). We present this large, comprehensive empirical and simulated data set in an annotated and structured format following the BIDS Extension Proposal for computational modeling data. We describe how we processed and converted the diverse data sources to make them reusable. In its current form, this dataset can be reused for further research and provides ready-to-use data at various levels of processing, including the brain simulation results inferred from them, for a large cohort of healthy subjects with a wide age range.

  9. Functional MRI data from medetomidine-isoflurane anesthetized marmosets

    • zenodo.org
    csv, zip
    Updated May 2, 2024
    Cite
    Zenodo (2024). Functional MRI data from medetomidine-isoflurane anesthetized marmosets [Dataset]. http://doi.org/10.5281/zenodo.11093635
    Dataset updated
    May 2, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Time period covered
    Apr 30, 2024
    Description

    This dataset contains unprocessed functional MRI (fMRI) data acquired in common marmosets (Callithrix jacchus). The data were obtained during a continuous infusion of the sedative medetomidine, supplemented with a low concentration of isoflurane. All experiments were carried out in accordance with the guidelines of Directive 2010/63/EU of the European Parliament on the protection of animals used for scientific purposes.

    Related paper

    This dataset supplements the following manuscript.

    Preserving functional network structure under anesthesia in the marmoset monkey brain

    M Ortiz-Rios, N Sirmpilatze, J Koenig, S Boretius - bioRxiv, 2023

    doi: https://doi.org/10.1101/2023.11.21.568138

    Data structure

    The main data files are organized into eight zipped folders (sub-02.zip, …, sub-09.zip), each constituting a dataset formatted according to the Brain Imaging Data Structure specifications (BIDS v1.6.0).

    • Each BIDS-formatted dataset contains subfolders for individual sessions (e.g. ses-0074). Additionally, a text file, participants.tsv, provides essential information about the subjects (e.g. age, weight, sex).
    • Each subject-specific folder contains subfolders named func and anat, storing fMRI and structural MRI data respectively. The (f)MRI data are provided in NIfTI format (suffixed with .nii.gz).
    • The func files are named {sub-id}_{ses-id}_{run-id}_{task-id}_BOLD.nii.gz. The task ID indicates whether the run was acquired at rest (resting-state) or during visual stimulation.
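
    As a minimal sketch (a hypothetical helper, not shipped with the dataset), the naming scheme above can be split into its key-value entities in Python; a library such as pybids offers robust parsing of full BIDS names.

```python
def parse_func_name(name):
    """Split a func file name following the scheme described above
    ({sub-id}_{ses-id}_{run-id}_{task-id}_BOLD.nii.gz) into a dict of
    entities. Illustrative only; assumes the exact pattern above."""
    stem = name.replace("_BOLD.nii.gz", "")
    # Each underscore-separated part is a key-value pair joined by "-".
    return dict(part.split("-", 1) for part in stem.split("_"))
```

    For example, `parse_func_name("sub-02_ses-0074_run-01_task-rest_BOLD.nii.gz")` yields the subject, session, run, and task labels as a dictionary.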

    BIDS-formatted datasets

    The basic characteristics of the datasets are given below. More details can be found in the preprint.

    1. Marmoset
      • Institution: German Primate Center (Deutsches Primatenzentrum GmbH - Leibniz-Institut für Primatenforschung), Göttingen, Germany
      • MR system: Bruker BioSpec 9.4 T, equipped with a B-GA 20S gradient
      • Anatomical MRI scan: Proton density-weighted (PDw) with magnetization transfer (MT) pulse, 1 per subject
      • fMRI scan: GE-EPI, several runs per subject (between 6 and 9 runs), duration 330 s for visual runs and 600 s for resting-state runs, all with a TR of 2 s and a resolution of 0.4 mm isotropic.
      • Subjects: 8 Callithrix jacchus
      • Age range: 3 - 10 years
      • Weight range: 382 - 505 g
      • Sex: 5 males and 3 females
      • Ethics oversight: Lower Saxony State Office for Consumer Protection and Food Safety, Hannover, Germany (approval numbers 33.19-42502-04-17/2496 and 33.19-42502-04-17/2535)
  10. Dataset of Concurrent EEG, ECG, and Behavior with Multiple Doses of...

    • openneuro.org
    Updated Jun 17, 2021
    Cite
    Nigel Gebodh (2021). Dataset of Concurrent EEG, ECG, and Behavior with Multiple Doses of transcranial Electrical Stimulation - BIDS [Dataset]. http://doi.org/10.18112/openneuro.ds003670.v1.0.0
    Dataset updated
    Jun 17, 2021
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Nigel Gebodh
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Synopsis: This is the GX dataset formatted to comply with the BIDS standard.

    The tES/EEG/CTT/Vigilance experiment includes 19 unique participants (some with repeated experiments). Over a 70 min period, EEG/ECG/EOG were recorded concurrently with a CTT in which participants kept a ball at the center of the screen and were periodically stimulated (with low-intensity noninvasive brain stimulation) for 30 s with combinations of 9 stimulation montages.

    For the raw data please see: https://zenodo.org/record/4456079

    For methodological details please see corresponding article titled: Dataset of concurrent EEG, ECG, and behavior with multiple doses of transcranial Electrical Stimulation

    Data Descriptor Abstract We present a dataset combining human-participant high-density electroencephalography (EEG) with physiological and continuous behavioral metrics during transcranial electrical stimulation (tES). Data include within participant application of nine High-Definition tES (HD-tES) types, targeting three cortical regions (frontal, motor, parietal) with three stimulation waveforms (DC, 5 Hz, 30 Hz); more than 783 total stimulation trials over 62 sessions with EEG, physiological (ECG, EOG), and continuous behavioral vigilance/alertness metrics. Experiment 1 and 2 consisted of participants performing a continuous vigilance/alertness task over three 70-minute and two 70.5-minute sessions, respectively. Demographic data were collected, as well as self-reported wellness questionnaires before and after each session. Participants received all 9 stimulation types in Experiment 1, with each session including three stimulation types, with 4 trials per type. Participants received 2 stimulation types in Experiment 2, with 20 trials of a given stimulation type per session. Within-participant reliability was tested by repeating select sessions. This unique dataset supports a range of hypothesis testing including interactions of tDCS/tACS location and frequency, brain-state, physiology, fatigue, and cognitive performance.

    For more details please see the full data descriptor article.

    Code used to import and process this dataset can be found here: GitHub : https://github.com/ngebodh/GX_tES_EEG_Physio_Behavior

    For downsampled data please see:

    • Experiment 1: https://doi.org/10.5281/zenodo.3840615
    • Experiment 2: https://doi.org/10.5281/zenodo.3840617

    • Nigel Gebodh (May 26th, 2021)
  11. Distributed Archives for Neurophysiology Data Integration (DANDI)

    • registry.opendata.aws
    Updated Sep 21, 2020
    Cite
    DANDI Archive (2020). Distributed Archives for Neurophysiology Data Integration (DANDI) [Dataset]. https://registry.opendata.aws/dandiarchive/
    Dataset updated
    Sep 21, 2020
    Dataset provided by
    DANDI Archive (https://about.dandiarchive.org/team)
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    DANDI is a public archive of neurophysiology datasets, including raw and processed data and associated software containers. Datasets are shared under Creative Commons CC0 or CC-BY licenses. The archive provides a broad range of cellular neurophysiology data, including electrode and optical recordings and associated imaging data, using a set of community standards: NWB:N (NWB:Neurophysiology), BIDS (Brain Imaging Data Structure), and NIDM (Neuro Imaging Data Model). Development of DANDI is supported by the National Institute of Mental Health.

  12. Table_1_FlywheelTools: Data Curation and Manipulation on the Flywheel...

    • frontiersin.figshare.com
    docx
    Updated Jun 10, 2023
    Cite
    Tinashe M. Tapera; Matthew Cieslak; Max Bertolero; Azeez Adebimpe; Geoffrey K. Aguirre; Ellyn R. Butler; Philip A. Cook; Diego Davila; Mark A. Elliott; Sophia Linguiti; Kristin Murtha; William Tackett; John A. Detre; Theodore D. Satterthwaite (2023). Table_1_FlywheelTools: Data Curation and Manipulation on the Flywheel Platform.DOCX [Dataset]. http://doi.org/10.3389/fninf.2021.678403.s001
    Dataset updated
    Jun 10, 2023
    Dataset provided by
    Frontiers
    Authors
    Tinashe M. Tapera; Matthew Cieslak; Max Bertolero; Azeez Adebimpe; Geoffrey K. Aguirre; Ellyn R. Butler; Philip A. Cook; Diego Davila; Mark A. Elliott; Sophia Linguiti; Kristin Murtha; William Tackett; John A. Detre; Theodore D. Satterthwaite
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    The recent and growing focus on reproducibility in neuroimaging studies has led many major academic centers to use cloud-based imaging databases for storing, analyzing, and sharing complex imaging data. Flywheel is one such database platform that offers easily accessible, large-scale data management, along with a framework for reproducible analyses through containerized pipelines. The Brain Imaging Data Structure (BIDS) is the de facto standard for neuroimaging data, but curating neuroimaging data into BIDS can be a challenging and time-consuming task. In particular, standard solutions for BIDS curation are limited on Flywheel. To address these challenges, we developed “FlywheelTools,” a software toolbox for reproducible data curation and manipulation on Flywheel. FlywheelTools includes two elements: fw-heudiconv, for heuristic-driven curation of data into BIDS, and flaudit, which audits and inventories projects on Flywheel. Together, these tools accelerate reproducible neuroscience research on the widely used Flywheel platform.

  13. Spinal Cord fMRI Segmentation Database (Multi-subject)

    • openneuro.org
    Updated Nov 28, 2024
    Cite
    Rohan Banerjee; Merve Kaptan; Alexandra Tinnermann; Ali Khatibi; Alice Dabbagh; Christian Buechel; Christian Kündig; Christine S.W. Law; Dario Pfyffer; David J. Lythgoe; Dimitra Tsivaka; Dimitri Van De Ville; Falk Eippert; Fauziyya Muhammad; Gary H. Glover; Gergely David; Grace Haynes; Jan Haaker; Jonathan C. W. Brooks; Julien Doyon; Jürgen Finsterbusch; Katherine T. Martucci; Kimberly J. Hemmerling; Mahdi Mobarak-Abadi; Mark A. Hoggarth; Matthew A. Howard; Molly G. Bright; Nawal Kinany; Olivia S. Kowalczyk; Ovidiu Lungu; Patrick Freund; Rangaprakash Deshpande; Robert L. Barry; Sean Mackey; Shahabeddin Vahdat; Simon Schading; Sonia Medina; Stephen B. McMahon; Steven C. R. Williams; Todd B. Parrish; Véronique Marchand-Pauvert; Yasin Dhaher; Yufen Chen; Zachary A. Smith; Kenneth A. Weber II; Benjamin De Leener; Julien Cohen-Adad (2024). Spinal Cord fMRI Segmentation Database (Multi-subject) [Dataset]. http://doi.org/10.18112/openneuro.ds005143.v1.3.0
    Explore at:
    Dataset updated
    Nov 28, 2024
    Dataset provided by
OpenNeuro (https://openneuro.org/)
    Authors
    Rohan Banerjee; Merve Kaptan; Alexandra Tinnermann; Ali Khatibi; Alice Dabbagh; Christian Buechel; Christian Kündig; Christine S.W. Law; Dario Pfyffer; David J. Lythgoe; Dimitra Tsivaka; Dimitri Van De Ville; Falk Eippert; Fauziyya Muhammad; Gary H. Glover; Gergely David; Grace Haynes; Jan Haaker; Jonathan C. W. Brooks; Julien Doyon; Jürgen Finsterbusch; Katherine T. Martucci; Kimberly J. Hemmerling; Mahdi Mobarak-Abadi; Mark A. Hoggarth; Matthew A. Howard; Molly G. Bright; Nawal Kinany; Olivia S. Kowalczyk; Ovidiu Lungu; Patrick Freund; Rangaprakash Deshpande; Robert L. Barry; Sean Mackey; Shahabeddin Vahdat; Simon Schading; Sonia Medina; Stephen B. McMahon; Steven C. R. Williams; Todd B. Parrish; Véronique Marchand-Pauvert; Yasin Dhaher; Yufen Chen; Zachary A. Smith; Kenneth A. Weber II; Benjamin De Leener; Julien Cohen-Adad
    License

CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

This dataset was acquired with various EPI protocols across multiple subjects, sites, and MRI vendors and models, in order to develop a method that automates the time-consuming segmentation of the spinal cord in fMRI. The list of subjects is available in participants.tsv.

    This dataset follows the BIDS convention. The contributors have the necessary ethics & permissions to share the data publicly.

The dataset does not include any identifiable personal health information such as names, zip codes, dates of birth, or facial features.

Each participant's data is in one subdirectory, which contains the mean of the motion-corrected volumes (the mean image that was used to draw the spinal cord mask) as well as the associated metadata. Spinal cord masks generated from these mean images are found under derivatives/label/sub-subjectID/sub-subjectID_task-rest_desc-spinalcordmask.nii.gz.

If you reference this dataset in your publications, please cite the following publication: Link to be added. Should you have any questions about this dataset, please contact mkaptan@stanford.edu and banerjee.rohan98@gmail.com.
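As a quick, hedged sketch of how the files described above fit together (not part of the dataset itself): the spinal cord mask is a binary volume aligned with the mean image, so extracting the cord signal is a boolean-indexing step. Toy arrays stand in here for the real NIfTI files, which would be loaded with nibabel.

```python
import numpy as np

# With the real data, both images would be loaded via nibabel, e.g.:
#   mean_img = nib.load(".../sub-XX/func/<mean image>.nii.gz").get_fdata()
#   mask = nib.load("derivatives/label/sub-XX/sub-XX_task-rest_desc-spinalcordmask.nii.gz").get_fdata() > 0
# Toy arrays of the same rank stand in so the snippet runs on its own.
rng = np.random.default_rng(0)
mean_img = rng.normal(loc=100.0, scale=5.0, size=(8, 8, 16))  # mean of motion-corrected volumes

mask = np.zeros((8, 8, 16), dtype=bool)
mask[3:5, 3:5, :] = True  # a small central "cord" region

cord_mean = float(mean_img[mask].mean())  # average signal inside the mask
print(round(cord_mean, 1))
```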

  14. Category-specific Associative Inference in Memory Dataset

    • openneuro.org
    Updated Apr 11, 2025
    Mrinmayi Kulkarni; Lydia Jiang; Jessica Robin; Jung Won Choi; Bradley R. Buchsbaum; Rosanna K. Olsen (2025). Category-specific Associative Inference in Memory Dataset [Dataset]. http://doi.org/10.18112/openneuro.ds006039.v1.0.2
    Explore at:
    Dataset updated
    Apr 11, 2025
    Dataset provided by
OpenNeuro (https://openneuro.org/)
    Authors
    Mrinmayi Kulkarni; Lydia Jiang; Jessica Robin; Jung Won Choi; Bradley R. Buchsbaum; Rosanna K. Olsen
    License

CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Category-specific Associative Inference in Memory Dataset

This dataset contains fMRI, behavioural, and eye-tracking data from an experiment testing the involvement of Medial Temporal Lobe (MTL) subregions in category-specific associative inference in memory. The dataset follows the Brain Imaging Data Structure (BIDS) format.

    Dataset Structure

    ├── dataset_description.json
    ├── participants.json
    ├── participants.tsv
    ├── README.md
    ├── derivatives/
    │  ├── AIMDeconvolveOutput
    │  │  └── sub-[subjectID]/
    │  ├── LocalDeconvolveOutput
    │  │  └── sub-[subjectID]/
    │  └── ROIs
    │    └── sub-[subjectID]/
    └── sub-[subjectID]/
      ├── anat/
      │  ├── sub-[subjectID]_T1w.nii.gz
      │  └── sub-[subjectID]_T2w.nii.gz
      └── func/
       ├── sub-[subjectID]_task-aim_run-[01-08]_bold.nii.gz
       ├── sub-[subjectID]_task-aim_run-[01-08]_bold.json
       ├── sub-[subjectID]_task-local_bold.nii.gz
       ├── sub-[subjectID]_task-local_bold.json
       ├── sub-[subjectID]_task-aim_events.tsv
       ├── sub-[subjectID]_task-aim_events.json
       ├── sub-[subjectID]_task-local_events.tsv
       ├── sub-[subjectID]_task-local_events.json
       ├── sub-[subjectID]_task-aim_eyetracking.tsv
       └── sub-[subjectID]_task-aim_eyetracking.json
    
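The layout above can be traversed with the Python standard library alone; a minimal sketch (a miniature copy of the tree is synthesized here so the snippet is self-contained, and the subject IDs are illustrative):

```python
import glob
import os
import tempfile

# Synthesize a miniature copy of the layout above (subject IDs illustrative).
root = tempfile.mkdtemp()
for sub in ("sub-1003", "sub-1004"):
    func = os.path.join(root, sub, "func")
    os.makedirs(func)
    for run in range(1, 9):  # 8 AIM runs per subject
        open(os.path.join(func, f"{sub}_task-aim_run-{run:02d}_bold.nii.gz"), "w").close()
    open(os.path.join(func, f"{sub}_task-local_bold.nii.gz"), "w").close()

# Collect every task-aim BOLD run across subjects.
aim_runs = sorted(glob.glob(os.path.join(root, "sub-*", "func", "*_task-aim_run-*_bold.nii.gz")))
print(len(aim_runs))  # 16 (2 subjects x 8 runs)
```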

    Dataset Contents

    Anatomical Data

    • T1-weighted structural scans (defaced)
    • T2-weighted structural scans (slab collected perpendicular to the long-axis of the hippocampus)

    Raw Functional Data

    1. Associative Inference Memory (AIM) Task

      • 8 runs of BOLD data
      • Event files containing task timing and conditions
      • Eye tracking data
      • Associated JSON metadata files
    2. Local Task

      • Single run of BOLD data
      • Event files containing task timing and conditions
      • Associated JSON metadata files

Derivatives Data

    1. AIM Deconvolve Output (AIMDeconvolveOutput)
      • Whole-brain maps containing results from the GLM analysis: sub-[subjectID]_StudyIndirectSubMemTestAcc_StrictAcc.nii.gz
      • Design matrix generated by 3dDeconvolve in AFNI (v21.2.10): sub-[subjectID]_StudyIndirectSubMemTestAcc_StrictAcc_glm_design1.1D
      • 3dREMLfit command generated by 3dDeconvolve in AFNI (v21.2.10): sub-[subjectID]_StudyIndirectSubMemTestAcc_StrictAcc+tlrc.REML_cmd
    2. Localiser Deconvolve Output (LocalDeconvolveOutput)
      • Whole-brain maps containing results from the GLM analysis: sub-[subjectID]_LocCat.nii.gz
      • Design matrix generated by 3dDeconvolve in AFNI (v21.2.10): sub-[subjectID]_LocCat_glm_design1.1D
      • 3dREMLfit command generated by 3dDeconvolve in AFNI (v21.2.10): sub-[subjectID]_LocCat+tlrc.REML_cmd
    3. Regions of Interest (ROIs)
      • Manually segmented MTL subregion ROIs in T1 space: sub-1004_space-T1w_desc-HandSeg_mask.nii.gz
The manual segmentation was conducted on the raw high-resolution T2w images. These were transformed to T1w space using the following steps:
    1. The raw T2w image was aligned to the T1w image using antsRegistrationSyNQuick in ANTs.
    2. The transform computed in the previous step was applied to the manually segmented ROIs in raw T2w space using antsApplyTransform.sh in ANTs.
      • Manually segmented MTL subregion ROIs resampled to BOLD space: sub-1004_space-T1w_desc-HandSeg_mask_Resampled.nii.gz
Because of differences in obliquity between the BOLD and T1 scans following preprocessing in fMRIPrep, the following steps were undertaken to resample the ROIs from T1 space to BOLD space:
        1. The difference in obliquity between the preprocessed T1w image and the preprocessed BOLD image was computed using 3dWarp in AFNI.
        2. The obliquity difference was applied to the ROIs in T1w space using 3dAllineate in AFNI.
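The two ANTs registration steps can be sketched as command lines. This is a dry-run illustration only: all file names are hypothetical, the standard ANTs binaries antsRegistrationSyNQuick.sh and antsApplyTransforms are assumed to be what the notes above refer to, and the commands are printed, never executed.

```python
# Dry-run sketch: build (but do not execute) the two ANTs calls described above.
# All file names are hypothetical placeholders.
prefix = "t2_to_t1_"

register = [
    "antsRegistrationSyNQuick.sh",
    "-d", "3",
    "-f", "sub-1004_T1w.nii.gz",   # fixed image
    "-m", "sub-1004_T2w.nii.gz",   # moving image
    "-o", prefix,                  # output prefix for the computed transforms
]

# NearestNeighbor interpolation keeps the ROI labels discrete.
apply = [
    "antsApplyTransforms",
    "-d", "3",
    "-i", "sub-1004_desc-HandSeg_mask_T2w.nii.gz",
    "-r", "sub-1004_T1w.nii.gz",
    "-t", prefix + "1Warp.nii.gz",
    "-t", prefix + "0GenericAffine.mat",
    "-n", "NearestNeighbor",
    "-o", "sub-1004_space-T1w_desc-HandSeg_mask.nii.gz",
]

print(" ".join(register))
print(" ".join(apply))
```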

    Metadata Files

    • dataset_description.json: Dataset-level metadata
    • participants.tsv: Participant information
    • participants.json: Description of participant-level variables

    Notes

    • All anatomical images have been defaced for privacy
• The dataset includes 41 participants who completed the AIM scan and 38 participants who completed the functional localiser scan (sub-1003 through sub-1041)
    • Eye tracking data is available for the AIM task
    • Scripts used to analyse this dataset are available on Github
  15. Functional MRI data from isoflurane-anesthetized macaques, marmosets, and...

    • zenodo.org
    application/gzip +3
    Updated Jul 17, 2024
    Nikoloz Sirmpilatze; Nikoloz Sirmpilatze; Judith Mylius; Jaakko Paasonen; Jaakko Paasonen; Olli Gröhn; Olli Gröhn; Susann Boretius; Judith Mylius; Susann Boretius (2024). Functional MRI data from isoflurane-anesthetized macaques, marmosets, and rats [Dataset]. http://doi.org/10.5281/zenodo.5565306
    Explore at:
Available download formats: json, zip, application/gzip, pdf
    Dataset updated
    Jul 17, 2024
    Dataset provided by
Zenodo (http://zenodo.org/)
    Authors
    Nikoloz Sirmpilatze; Nikoloz Sirmpilatze; Judith Mylius; Jaakko Paasonen; Jaakko Paasonen; Olli Gröhn; Olli Gröhn; Susann Boretius; Judith Mylius; Susann Boretius
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

This dataset contains unprocessed task-free functional MRI (fMRI) data acquired in three different mammalian species: long-tailed macaques (Macaca fascicularis), common marmosets (Callithrix jacchus), and rats (Rattus norvegicus, Wistar strain). The data were obtained during isoflurane anesthesia, with the animals intubated and mechanically ventilated. All experiments were carried out in accordance with the guidelines from Directive 2010/63/EU of the European Parliament on the protection of animals used for scientific purposes.

    Related paper

    This dataset supplements the following preprint:

    Sirmpilatze N, Mylius J, Ortiz-Rios M, Baudewig J, Paasonen J, Golkowski D, Ranft A, Ilg R, Gröhn O, Boretius S. 2021. Spatial signatures of anesthesia-induced burst-suppression differ between primates and rodents. bioRxiv. doi:10.1101/2021.10.15.464515

    Data structure

    The main data files are organized into four zipped folders - Macaque.zip, Marmoset.zip, Rat1.zip, Rat2.zip - each constituting a dataset formatted according to the Brain Imaging Data Structure specifications (BIDS v1.6.0).

    • Each BIDS-formatted dataset contains subfolders for individual subjects (e.g. sub-01, sub-02, etc.), as well as a tab-separated text file, participants.tsv, with some essential information about the subjects (e.g. age, weight, sex).
    • Each subject-specific folder contains subfolders named func and anat, storing fMRI and structural MRI data respectively. The (f)MRI data are provided in NIfTI format (suffixed with .nii.gz). Each NIfTI file is accompanied by a .json sidecar holding metadata.
    • The func subfolders also include tab-separated text files named as {sub-id}_scans.tsv (e.g. sub-01_scans.tsv). These files provide additional information on the fMRI runs within the func subfolder, such as the isoflurane concentration during the acquisition of the fMRI run, duration of the run, etc.
    • The column names in participants.tsv and {sub-id}_scans.tsv files are explained in accompanying participants.json and {sub-id}_scans.json files.
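Since participants.tsv and {sub-id}_scans.tsv are plain tab-separated text, they can be read without any neuroimaging tooling; a sketch with illustrative column names (the real names are defined in the accompanying .json files):

```python
import csv
import io

# An illustrative scans.tsv; the real column names are defined in {sub-id}_scans.json.
scans_tsv = (
    "filename\tisoflurane_percent\tduration_s\n"
    "func/sub-01_task-rest_run-1_bold.nii.gz\t1.3\t600\n"
    "func/sub-01_task-rest_run-2_bold.nii.gz\t2.0\t600\n"
)

rows = list(csv.DictReader(io.StringIO(scans_tsv), delimiter="\t"))
for row in rows:
    print(row["filename"], row["isoflurane_percent"], row["duration_s"])
```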

    BIDS-formatted datasets

    The basic characteristics of the datasets are given below. More details can be found in the preprint.

    1. Macaque
      • Institution: German Primate Center (Deutsches Primatenzentrum GmbH - Leibniz-Institut für Primatenforschung), Göttingen, Germany
      • MR system: Siemens MAGNETOM Prisma 3T
      • Anatomical MRI scan: T1-weighted (MPRAGE), 1 per subject
      • fMRI scan: GE-EPI, 1 or 2 runs per subject, run duration 600 - 1200 s
      • Subjects: 13 Macaca fascicularis
      • Age range: 6.8 - 19.8 years
      • Weight range: 3.6 - 8.1 kg
      • Sex: all females
      • Ethics oversight: Lower Saxony State Office for Consumer Protection and Food Safety, Hannover, Germany (approval number 33.19-42502-04-16/2278)
    2. Marmoset
      • Institution: German Primate Center (Deutsches Primatenzentrum GmbH - Leibniz-Institut für Primatenforschung), Göttingen, Germany
  • MR system: Bruker BioSpec 9.4 T, equipped with B-GA 20S gradient
  • Anatomical MRI scan: Proton density-weighted (PDw) with magnetization transfer (MT) pulse, 1 per subject
      • fMRI scan: GE-EPI, 1 run per subject, run duration 600 s
      • Subjects: 20 Callithrix jacchus
      • Age range: 1.9 - 14.2 years
      • Weight range: 337 - 517 g
      • Sex: 10 females
      • Ethics oversight: Lower Saxony State Office for Consumer Protection and Food Safety, Hannover, Germany (approval number 33.19-42502-04-17/2496)
    3. Rat1
      • Institution: German Primate Center (Deutsches Primatenzentrum GmbH - Leibniz-Institut für Primatenforschung), Göttingen, Germany
  • MR system: Bruker BioSpec 9.4 T, equipped with B-GA 12S2 gradient
  • Anatomical MRI scan: T2-weighted (TurboRARE), 1 per subject
      • fMRI scan: GE-EPI, 6 runs per subject (except for sub-10: 4 runs), run duration 720 s
      • Subjects: 11 Rattus norvegicus, Wistar strain
      • Weight range: 350 - 450 g
      • Sex: all females
      • Ethics oversight: Lower Saxony State Office for Consumer Protection and Food Safety, Hannover, Germany (approval number 33.19-42502-04-15/2042)
    4. Rat2
  • Institution: A.I. Virtanen Institute for Molecular Sciences, University of Eastern Finland, Kuopio, Finland
      • MR system: Bruker PharmaScan 7 T
      • Anatomical MRI scan: NOT provided
      • fMRI scan: GE-EPI, 6 runs per subject (except for sub-10: 4 runs), run duration 720 s
      • Subjects: 6 Rattus norvegicus, Wistar strain
      • Weight range: 265 - 350 g
      • Sex: all males
      • Ethics oversight: Animal Ethics Committee of the Provincial Government of Southern Finland

    Example data

Before you commit to downloading the BIDS-formatted datasets, we encourage you to examine the example data that we provide in the root folder. These include one anatomical (structural MRI) and one functional (fMRI) scan from each of the four datasets (Rat2 contains functional scans only), with their respective .json sidecars. A preview of these example scans is provided by 0_preview.pdf.

  16. Data_Sheet_1_Optimizing neuroscience data management by combining REDCap,...

    • frontiersin.figshare.com
    docx
    Updated Sep 5, 2024
    Marc Stawiski; Vittoria Bucciarelli; Dorian Vogel; Simone Hemm (2024). Data_Sheet_1_Optimizing neuroscience data management by combining REDCap, BIDS and SQLite: a case study in Deep Brain Stimulation.docx [Dataset]. http://doi.org/10.3389/fninf.2024.1435971.s001
    Explore at:
Available download formats: docx
    Dataset updated
    Sep 5, 2024
    Dataset provided by
    Frontiers
    Authors
    Marc Stawiski; Vittoria Bucciarelli; Dorian Vogel; Simone Hemm
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Neuroscience studies entail the generation of massive collections of heterogeneous data (e.g. demographics, clinical records, medical images). Integration and analysis of such data in research centers is pivotal for elucidating disease mechanisms and improving clinical outcomes. However, data collection in clinics often relies on non-standardized methods, such as paper-based documentation. Moreover, diverse data types are collected in different departments hindering efficient data organization, secure sharing and compliance to the FAIR (Findable, Accessible, Interoperable, Reusable) principles. Henceforth, in this manuscript we present a specialized data management system designed to enhance research workflows in Deep Brain Stimulation (DBS), a state-of-the-art neurosurgical procedure employed to treat symptoms of movement and psychiatric disorders. The system leverages REDCap to promote accurate data capture in hospital settings and secure sharing with research institutes, Brain Imaging Data Structure (BIDS) as image storing standard and a DBS-specific SQLite database as comprehensive data store and unified interface to all data types. A self-developed Python tool automates the data flow between these three components, ensuring their full interoperability. The proposed framework has already been successfully employed for capturing and analyzing data of 107 patients from 2 medical institutions. It effectively addresses the challenges of managing, sharing and retrieving diverse data types, fostering advancements in data quality, organization, analysis, and collaboration among medical and research institutions.

  17. Dataset of Concurrent EEG, ECG, and Behavior with Multiple Doses of...

    • zenodo.org
    • data.niaid.nih.gov
    bin
    Updated Jun 3, 2025
    Nigel Gebodh; Nigel Gebodh (2025). Dataset of Concurrent EEG, ECG, and Behavior with Multiple Doses of transcranial Electrical Stimulation-Exp1-Data Downsampled [Dataset]. http://doi.org/10.5281/zenodo.15572614
    Explore at:
Available download formats: bin
    Dataset updated
    Jun 3, 2025
    Dataset provided by
Zenodo (http://zenodo.org/)
    Authors
    Nigel Gebodh; Nigel Gebodh
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    GX Dataset downsampled - Experiment 1

    The GX Dataset is a dataset of combined tES, EEG, physiological, and behavioral signals from human subjects.
Here, the GX Dataset for Experiment 1 is downsampled to 1 kHz and saved in .MAT format, which can be used in both MATLAB and Python.
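A hedged sketch of reading such .MAT files in Python (the file and variable names below are hypothetical): files saved as MATLAB v7.3 are HDF5 containers and require h5py, while earlier versions load directly with scipy.io.loadmat. A toy round-trip shows the scipy path.

```python
import os
import tempfile

import numpy as np
from scipy.io import loadmat, savemat

# Round-trip a toy array to show the pre-v7.3 path with scipy.
# Files saved as MATLAB >v7.3 are HDF5 containers; those need h5py
# instead (names below are hypothetical):
#   with h5py.File("GX_Exp1.mat", "r") as f:
#       eeg = f["EEG"][...]
path = os.path.join(tempfile.mkdtemp(), "toy.mat")
savemat(path, {"eeg": np.arange(6.0).reshape(2, 3)})
data = loadmat(path)
print(data["eeg"].shape)  # (2, 3)
```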

    Publication

    A full data descriptor is published in Nature Scientific Data. Please cite this work as:

    Gebodh, N., Esmaeilpour, Z., Datta, A. et al. Dataset of concurrent EEG, ECG, and behavior with multiple doses of transcranial electrical stimulation. Sci Data 8, 274 (2021). https://doi.org/10.1038/s41597-021-01046-y

    Descriptions

A dataset combining high-density electroencephalography (EEG) with physiological and continuous behavioral metrics during transcranial electrical stimulation (tES). The data include within-subject application of nine High-Definition tES (HD-tES) types, targeting three brain regions (frontal, motor, parietal) with three waveforms (DC, 5 Hz, 30 Hz), for more than 783 stimulation trials over 62 sessions with EEG, physiological (ECG, EOG), and continuous behavioral vigilance/alertness metrics.

    Acknowledgments

    Portions of this study were funded by X (formerly Google X), the Moonshot Factory. The funding source had no influence on study conduction or result evaluation. MB is further supported by grants from the National Institutes of Health: R01NS101362, R01NS095123, R01NS112996, R01MH111896, R01MH109289, and (to NG) NIH-G-RISE T32GM136499.

    Extras

    Back to Full GX Dataset : https://doi.org/10.5281/zenodo.4456079

For downsampled data (1 kHz) please see (in .mat format):

    Code used to import, process, and plot this dataset can be found here:

    Additional figures for this project have been shared on Figshare. Trial-wise figures can be found here:

    The full dataset is also provided in BIDS format here:

    Data License
Creative Commons Attribution 4.0 (CC BY 4.0)

    NOTE

    Please email ngebodh01@citymail.cuny.edu with any questions.


    Updates

    • Version 2.1.0
  • Behavioral data (ptracker) adjusted for all files. The previous version had one participant's data overwriting all behavioral data when downsampled. MAT format is now explicitly >v7.3.
    • Version 2
  • Stimulation trigger labels now adjusted. Previous labels were mismatched for Experiment 1's data.

  18. Data from: Grand Theft Empathy? Evidence for the absence of effects of...

    • data.niaid.nih.gov
    • zenodo.org
    Updated Nov 3, 2023
    Lengersdorff, Lukas Leopold (2023). Data from: Grand Theft Empathy? Evidence for the absence of effects of violent video games on empathy for pain and emotional reactivity to violence [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_10057632
    Explore at:
    Dataset updated
    Nov 3, 2023
    Dataset authored and provided by
    Lengersdorff, Lukas Leopold
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

Abstract: Influential accounts claim that violent video games (VVG) decrease players' emotional empathy by desensitizing them to both virtual and real-life violence. However, scientific evidence for this claim is inconclusive and controversially debated. To assess the causal effect of VVGs on the behavioral and neural correlates of empathy and emotional reactivity to violence, we conducted a prospective experimental study using functional magnetic resonance imaging (fMRI). We recruited eighty-nine male participants without prior VVG experience. Over the course of two weeks, participants played either a highly violent video game or a non-violent version of the same game. Before and after this period, participants completed an fMRI experiment with paradigms measuring their empathy for pain and emotional reactivity to violent images. Applying a Bayesian analysis approach throughout enabled us to find substantial evidence for the absence of an effect of VVGs on the behavioral and neural correlates of empathy. Moreover, participants in the VVG group were not desensitized to images of real-world violence. These results imply that short and controlled exposure to VVGs neither numbs empathy nor dulls responses to real-world violence. We discuss the implications of our findings regarding the potential and limitations of experimental research on the causal effects of VVGs. While VVGs might not have a discernible effect on the investigated subpopulation within our carefully controlled experimental setting, our results cannot preclude that effects could be found in special vulnerable subpopulations, or in settings with higher ecological validity. Dataset: This dataset contains the fMRI data collected for the study in BIDS format (https://bids.neuroimaging.io/).

The dataset includes:

    • Functional neuroimaging (*_bold.nii.gz) data of 89 human participants, collected during two tasks: the Empathy-for-Pain paradigm (Sessions 1 & 2) and the Emotional Reactivity paradigm (Session 2)
    • Associated event files (*_events.tsv) containing event onsets, durations, and behavioral covariates
    • Metadata

    fMRI BOLD timeseries are fully preprocessed, as described in the manuscript. Additional data, such as behavioral data in a simpler format, can be accessed at https://osf.io/yx423/

  19. List of currently available BIDS Apps.

    • plos.figshare.com
    xls
    Updated Jun 1, 2023
    Krzysztof J. Gorgolewski; Fidel Alfaro-Almagro; Tibor Auer; Pierre Bellec; Mihai Capotă; M. Mallar Chakravarty; Nathan W. Churchill; Alexander Li Cohen; R. Cameron Craddock; Gabriel A. Devenyi; Anders Eklund; Oscar Esteban; Guillaume Flandin; Satrajit S. Ghosh; J. Swaroop Guntupalli; Mark Jenkinson; Anisha Keshavan; Gregory Kiar; Franziskus Liem; Pradeep Reddy Raamana; David Raffelt; Christopher J. Steele; Pierre-Olivier Quirion; Robert E. Smith; Stephen C. Strother; Gaël Varoquaux; Yida Wang; Tal Yarkoni; Russell A. Poldrack (2023). List of currently available BIDS Apps. [Dataset]. http://doi.org/10.1371/journal.pcbi.1005209.t001
    Explore at:
Available download formats: xls
    Dataset updated
    Jun 1, 2023
    Dataset provided by
PLOS (http://plos.org/)
    Authors
    Krzysztof J. Gorgolewski; Fidel Alfaro-Almagro; Tibor Auer; Pierre Bellec; Mihai Capotă; M. Mallar Chakravarty; Nathan W. Churchill; Alexander Li Cohen; R. Cameron Craddock; Gabriel A. Devenyi; Anders Eklund; Oscar Esteban; Guillaume Flandin; Satrajit S. Ghosh; J. Swaroop Guntupalli; Mark Jenkinson; Anisha Keshavan; Gregory Kiar; Franziskus Liem; Pradeep Reddy Raamana; David Raffelt; Christopher J. Steele; Pierre-Olivier Quirion; Robert E. Smith; Stephen C. Strother; Gaël Varoquaux; Yida Wang; Tal Yarkoni; Russell A. Poldrack
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    List of currently available BIDS Apps.

  20. Data_Sheet_1_Common Data Elements, Scalable Data Management Infrastructure,...

    • frontiersin.figshare.com
    docx
    Updated Jun 3, 2023
    Rayus Kuplicki; James Touthang; Obada Al Zoubi; Ahmad Mayeli; Masaya Misaki; NeuroMAP-Investigators; Robin L. Aupperle; T. Kent Teague; Brett A. McKinney; Martin P. Paulus; Jerzy Bodurka (2023). Data_Sheet_1_Common Data Elements, Scalable Data Management Infrastructure, and Analytics Workflows for Large-Scale Neuroimaging Studies.docx [Dataset]. http://doi.org/10.3389/fpsyt.2021.682495.s001
    Explore at:
Available download formats: docx
    Dataset updated
    Jun 3, 2023
    Dataset provided by
    Frontiers
    Authors
    Rayus Kuplicki; James Touthang; Obada Al Zoubi; Ahmad Mayeli; Masaya Misaki; NeuroMAP-Investigators; Robin L. Aupperle; T. Kent Teague; Brett A. McKinney; Martin P. Paulus; Jerzy Bodurka
    License

Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Neuroscience studies require considerable bioinformatic support and expertise. Numerous high-dimensional and multimodal datasets must be preprocessed and integrated to create robust and reproducible analysis pipelines. We describe a common data elements and scalable data management infrastructure that allows multiple analytics workflows to facilitate preprocessing, analysis and sharing of large-scale multi-level data. The process uses the Brain Imaging Data Structure (BIDS) format and supports MRI, fMRI, EEG, clinical, and laboratory data. The infrastructure provides support for other datasets such as Fitbit and flexibility for developers to customize the integration of new types of data. Exemplar results from 200+ participants and 11 different pipelines demonstrate the utility of the infrastructure.

Guiomar Niso; Jeremy Moreau; Elizabeth Bock; Francois Tadel; Sylvain Baillet (2024). MEG-BIDS OMEGA RestingState_sample [Dataset]. http://doi.org/10.18112/openneuro.ds000247.v1.0.2

MEG-BIDS OMEGA RestingState_sample

Explore at:
2 scholarly articles cite this dataset
Dataset updated
Apr 24, 2024
Dataset provided by
OpenNeuro (https://openneuro.org/)
Authors
Guiomar Niso; Jeremy Moreau; Elizabeth Bock; Francois Tadel; Sylvain Baillet
License

CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
License information was derived automatically

Description

OMEGA - Resting State Sample Dataset

License

  • This dataset was obtained from The Open MEG Archive (OMEGA, https://omega.bic.mni.mcgill.ca).

  • You are free to use all data in OMEGA for research purposes; please acknowledge its authors and cite the following reference in your publications if you have used data from OMEGA:

  • Niso G., Rogers C., Moreau J.T., Chen L.Y., Madjar C., Das S., Bock E., Tadel F., Evans A.C., Jolicoeur P., Baillet S. (2016). OMEGA: The Open MEG Archive. NeuroImage 124, 1182-1187. doi: https://doi.org/10.1016/j.neuroimage.2015.04.028. OMEGA is available at: https://omega.bic.mni.mcgill.ca

Description

Experiment

  • 5 subjects × 5-minute resting-state sessions, eyes open

MEG acquisition

  • Recorded at the Montreal Neurological Institute in 2012-2016
  • Acquisition with a CTF 275 MEG system at a 2400 Hz sampling rate
  • Anti-aliasing low-pass filter at 600 Hz; files may be saved with or without the CTF 3rd-order gradient compensation
  • Recorded channels (at least 297) include:
    • 26 MEG reference sensors (#2-#27)
    • 270 MEG axial gradiometers (#28-#297)
    • 1 ECG bipolar (EEG057/#298) - Not available in the empty room recordings
    • 1 vertical EOG bipolar (EEG058/#299) - Not available in the empty room recordings
    • 1 horizontal EOG bipolar (EEG059/#300) - Not available in the empty room recordings
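A quick arithmetic check of the channel numbering above, using the inclusive index ranges from the list:

```python
# Inclusive channel index ranges as listed above.
ref_sensors = range(2, 28)     # 26 MEG reference sensors, #2-#27
gradiometers = range(28, 298)  # 270 MEG axial gradiometers, #28-#297
bipolar = range(298, 301)      # ECG + vertical/horizontal EOG, #298-#300

print(len(ref_sensors), len(gradiometers), len(bipolar))    # 26 270 3
print(len(ref_sensors) + len(gradiometers) + len(bipolar))  # 299
```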

Head shape and fiducial points

  • 3D digitization using a Polhemus Fastrak device driven by Brainstorm. The .pos files contain:
    • The center of the CTF coils
    • The anatomical references used in Brainstorm: nasion and ears
    • Around 100 head points distributed on the hard parts of the head (no soft tissues).

Subject anatomy

  • Structural T1 image (defaced for anonymization purposes)
  • Processed with FreeSurfer 5.3
  • The anatomical fiducials (NAS, LPA, RPA) have already been marked and saved in the files fiducials.m

BIDS

  • The data in this dataset has been organized according to the MEG-BIDS specification (Brain Imaging Data Structure, http://bids.neuroimaging.io) (Niso et al. 2018)

  • Niso G., Gorgolewski K.J., Bock E., Brooks T.L., Flandin G., Gramfort A., Henson R.N., Jas M., Litvak V., Moreau J., Oostenveld R., Schoffelen J.M., Tadel F., Wexler J., Baillet S. (2018). MEG-BIDS: an extension to the Brain Imaging Data Structure for magnetoencephalography. Scientific Data; 5, 180110. https://doi.org/10.1038/sdata.2018.110

Release history:

  • 2016-12-01: initial release
  • 2018-07-18: release OpenNeuro ds000247 (00001 and 00002)