100+ datasets found
  1. BIDS dataset for BIDS Manager-Pipeline

    • figshare.com
    zip
    Updated May 31, 2023
    Cite
    Aude Jegou; Nicolas Roehri; Samuel Medina Villalon (2023). BIDS dataset for BIDS Manager-Pipeline [Dataset]. http://doi.org/10.6084/m9.figshare.19046345.v1
    Dataset updated
    May 31, 2023
    Dataset provided by
    figshare (http://figshare.com/)
    Authors
    Aude Jegou; Nicolas Roehri; Samuel Medina Villalon
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This folder contains data organised in BIDS format to test BIDS Manager-Pipeline (https://github.com/Dynamap/BIDS_Manager/tree/dev).

  2. BIDS Phenotype Segregation Example Dataset

    • openneuro.org
    Updated Jun 4, 2022
    + more versions
    Cite
    Samuel Guay; Eric Earl; Hao-Ting Wang; Remi Gau; Dorota Jarecka; David Keator; Melissa Kline Struhl; Satra Ghosh; Louis De Beaumont; Adam G. Thomas (2022). BIDS Phenotype Segregation Example Dataset [Dataset]. http://doi.org/10.18112/openneuro.ds004129.v1.0.0
    Dataset updated
    Jun 4, 2022
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Samuel Guay; Eric Earl; Hao-Ting Wang; Remi Gau; Dorota Jarecka; David Keator; Melissa Kline Struhl; Satra Ghosh; Louis De Beaumont; Adam G. Thomas
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    BIDS Phenotype Segregation Example: a copy of "The NIMH Healthy Research Volunteer Dataset" (ds003982)

    Modality-agnostic files were copied over and the CHANGES file was updated. Data were segregated using:

    python phenotype.py segregate subject -i ds003982 -o segregated_subject

    phenotype.py came from the GitHub repository: https://github.com/ericearl/bids-phenotype
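As a rough illustration of what such a segregation step does, here is a minimal sketch (not the actual phenotype.py; the helper name and the `audit_total` column are made up for the example) that splits one combined phenotype TSV into per-subject groups keyed by the BIDS `participant_id` column:

```python
# Illustrative sketch only -- NOT the real phenotype.py from
# github.com/ericearl/bids-phenotype. It shows the general idea of
# segregating a combined phenotype table by subject; only the
# "participant_id" column name follows the BIDS convention.
import csv
import io

def segregate_by_subject(tsv_text):
    """Group the rows of a combined phenotype TSV by participant_id."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    grouped = {}
    for row in reader:
        grouped.setdefault(row["participant_id"], []).append(row)
    return grouped

# Tiny made-up example table with one measure column.
combined = "participant_id\taudit_total\nsub-01\t3\nsub-02\t7\n"
per_subject = segregate_by_subject(combined)
# per_subject["sub-02"][0]["audit_total"] == "7"
```

The real tool additionally writes each group out as per-subject phenotype files inside the output BIDS directory.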

    THE ORIGINAL DATASET ds003982 README FOLLOWS

    A comprehensive clinical, MRI, and MEG collection characterizing healthy research volunteers, collected at the National Institute of Mental Health (NIMH) Intramural Research Program (IRP) in Bethesda, Maryland. It comprises medical and mental health assessments, diagnostic and dimensional measures of mental health, cognitive and neuropsychological testing, structural and functional magnetic resonance imaging (MRI), diffusion tensor imaging (DTI), and a comprehensive magnetoencephalography (MEG) battery.

    In addition, blood samples of healthy volunteers are banked for future genetic analyses. All data collected in this protocol are broadly shared in the OpenNeuro repository in the Brain Imaging Data Structure (BIDS) format, and task paradigms and basic pre-processing scripts are shared on GitHub. This dataset is unique in its depth of characterization of a healthy population in terms of brain health and will contribute to a wide array of secondary investigations of non-clinical and clinical research questions.

    This dataset is licensed under the Creative Commons Zero (CC0) v1.0 License.

    Recruitment

    Inclusion criteria for the study require that participants are adults aged 18 years or older, in good health, with the ability to read, speak, understand, and provide consent in English. All participants provided electronic informed consent for online screening and written informed consent for all other procedures. Exclusion criteria include:

    • A history of significant or unstable medical or mental health condition requiring treatment
    • Current self-injury, suicidal thoughts or behavior
    • Current illicit drug use by history or urine drug screen
    • Abnormal physical exam or laboratory result at the time of in-person assessment
    • Less than an 8th grade education or IQ below 70
    • Current NIMH employees, or first-degree relatives of NIMH employees

    Study participants are recruited through direct mailings, bulletin boards and listservs, outreach exhibits, print advertisements, and electronic media.

    Clinical Measures

    All potential volunteers first visit the study website (https://nimhresearchvolunteer.ctss.nih.gov), check a box indicating consent, and complete preliminary self-report screening questionnaires. The study website is HIPAA compliant and therefore does not collect PII; instead, participants are instructed to contact the study team to provide their identity and contact information. The questionnaires cover demographics, clinical history including medications, disability status (WHODAS 2.0), mental health symptoms (modified DSM-5 Self-Rated Level 1 Cross-Cutting Symptom Measure), substance use (DSM-5 Level 2), alcohol use (AUDIT), handedness (Edinburgh Handedness Inventory), and perceived health ratings. At the conclusion of the questionnaires, participants are again prompted to email the study team. Survey results, supplemented by review of NIH medical records (if present), are reviewed by the study team, who determine whether the participant is likely eligible for the protocol. Eligible participants are then scheduled for an in-person assessment. Follow-up phone screenings were also used to determine eligibility for in-person screening.

    In-person Assessments

    At this visit, participants undergo a comprehensive clinical evaluation to determine final eligibility for inclusion as a healthy research volunteer. The mental health evaluation consists of a psychiatric diagnostic interview (Structured Clinical Interview for DSM-5 Disorders, SCID-5), along with self-report surveys of mood (Beck Depression Inventory-II, BDI-II) and anxiety (Beck Anxiety Inventory, BAI) symptoms. An intelligence quotient (IQ) estimate is obtained with the Kaufman Brief Intelligence Test, Second Edition (KBIT-2), a brief (20-30 minute) assessment of intellectual functioning administered by a trained examiner. There are three subtests: verbal knowledge, riddles, and matrices.

    Medical Evaluation

    Medical evaluation includes medical history elicitation and systematic review of systems. Biological and physiological measures include vital signs (blood pressure, pulse), as well as weight, height, and BMI. Blood and urine samples are taken and a complete blood count, acute care panel, hepatic panel, thyroid stimulating hormone, viral markers (HCV, HBV, HIV), C-reactive protein, creatine kinase, urine drug screen and urine pregnancy tests are performed. In addition, blood samples that can be used for future genomic analysis, development of lymphoblastic cell lines or other biomarker measures are collected and banked with the NIMH Repository and Genomics Resource (Infinity BiologiX). The Family Interview for Genetic Studies (FIGS) was later added to the assessment in order to provide better pedigree information; the Adverse Childhood Events (ACEs) survey was also added to better characterize potential risk factors for psychopathology. The entirety of the in-person assessment not only collects information relevant for eligibility determination, but it also provides a comprehensive set of standardized clinical measures of volunteer health that can be used for secondary research.

    MRI Scan

    Participants are given the option to consent for a magnetic resonance imaging (MRI) scan, which can serve as a baseline clinical scan to determine normative brain structure, and also as a research scan with the addition of functional sequences (resting state and diffusion tensor imaging). The MR protocol used was initially based on the ADNI-3 basic protocol, but was later modified to include portions of the ABCD protocol in the following manner:

    1. The T1 scan from ADNI3 was replaced by the T1 scan from the ABCD protocol.
    2. The Axial T2 2D FLAIR acquisition from ADNI2 was added, and fat saturation turned on.
    3. Fat saturation was turned on for the pCASL acquisition.
    4. The high-resolution in-plane hippocampal 2D T2 scan was removed and replaced with the whole brain 3D T2 scan from the ABCD protocol (which is resolution and bandwidth matched to the T1 scan).
    5. The slice-select gradient reversal method was turned on for DTI acquisition, and reconstruction interpolation turned off.
    6. Scans for distortion correction were added (reversed-blip scans for DTI and resting state scans).
    7. The 3D FLAIR sequence was made optional and replaced by one where the prescription and other acquisition parameters provide resolution and geometric correspondence between the T1 and T2 scans.

    At the time of the MRI scan, volunteers are administered a subset of tasks from the NIH Toolbox Cognition Battery. The four tasks include:

    1. Flanker inhibitory control and attention task assesses the constructs of attention and executive functioning.
    2. Executive functioning is also assessed using a dimensional change card sort test.
    3. Episodic memory is evaluated using a picture sequence memory test.
    4. Working memory is evaluated using a list sorting test.

    MEG

    An optional MEG study was added to the protocol approximately one year after the study was initiated; there are therefore relatively fewer MEG recordings than MRI datasets. MEG studies are performed on a 275-channel CTF MEG system (CTF MEG, Coquitlam, BC, Canada). The position of the head was localized at the beginning and end of each recording using three fiducial coils, placed 1.5 cm above the nasion and at each ear, 1.5 cm from the tragus on a line between the tragus and the outer canthus of the eye. For 48 participants (as of 2/1/2022), photographs were taken of the three coils and used to mark the points on the T1-weighted structural MRI scan for co-registration. For the remaining participants (n=16 as of 2/1/2022), a Brainsight neuronavigation system (Rogue Research, Montréal, Québec, Canada) was used to co-register the MRI and fiducial localizer coils in real time prior to MEG data acquisition.

    Specific Measures within Dataset

    Online and In-person behavioral and clinical measures, along with the corresponding phenotype file name, sorted first by measurement location and then by file name.

    Location | Measure                                           | File Name
    Online   | Alcohol Use Disorders Identification Test (AUDIT) | audit
    Online   | Demographics                                      | demographics
    Online   | DSM-5 Level 2 Substance Use - Adult               | drug_use
    Online   | Edinburgh Handedness Inventory (EHI)              | ehi
    Online   | Health History Form                               | health_history_questions
    Online   | Perceived Health Rating - self                    | health_rating
    Online   | DSM-5
  3. PREVENT-AD open data in BIDS format

    • portal.conp.ca
    Updated Jan 20, 2021
    + more versions
    Cite
    StoP-AD Center - Douglas Mental Health University Institute (2021). PREVENT-AD open data in BIDS format [Dataset]. https://portal.conp.ca/dataset?id=projects/preventad-open-bids
    Dataset updated
    Jan 20, 2021
    Dataset authored and provided by
    StoP-AD Center - Douglas Mental Health University Institute
    License

    https://openpreventad.loris.ca/images/Open_PREVENT-AD_Terms_of_Use.png

    Description

    Longitudinal study of pre-symptomatic Alzheimer's Disease

  4. BIDS CHB-MIT Scalp EEG Database

    • zenodo.org
    zip
    Updated Dec 5, 2023
    Cite
    Dan Jonathan; Ali Shoeb (2023). BIDS CHB-MIT Scalp EEG Database [Dataset]. http://doi.org/10.5281/zenodo.10259996
    Dataset updated
    Dec 5, 2023
    Dataset provided by
    EPFL
    Authors
    Dan Jonathan; Ali Shoeb
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Jun 9, 2010
    Description

    This dataset is a BIDS-compatible version of the CHB-MIT Scalp EEG Database. It reorganizes the file structure to comply with the BIDS specification. To this effect:

    • The data from subject chb21 was moved to sub-01/ses-02.
    • Metadata was organized according to BIDS.
    • Data in the EEG edf files was modified to keep only the 18 channels from a double banana bipolar montage.
    • Annotations were formatted as BIDS-score compatible `tsv` files.
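For reference, the 18 channels of a longitudinal bipolar ("double banana") montage can be listed as follows. This is a sketch using common 10-20 nomenclature; the exact channel labels in the released files may differ:

```python
# Illustrative 18-channel longitudinal bipolar ("double banana") montage
# in 10-20 nomenclature. Treat as a sketch, not the dataset's exact labels.
DOUBLE_BANANA = [
    "Fp1-F7", "F7-T7", "T7-P7", "P7-O1",  # left temporal chain
    "Fp1-F3", "F3-C3", "C3-P3", "P3-O1",  # left parasagittal chain
    "Fp2-F4", "F4-C4", "C4-P4", "P4-O2",  # right parasagittal chain
    "Fp2-F8", "F8-T8", "T8-P8", "P8-O2",  # right temporal chain
    "Fz-Cz", "Cz-Pz",                     # midline chain
]
assert len(DOUBLE_BANANA) == 18  # the 18 channels kept in the EDF files
```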

    Details related to access to the data

    License

    The dataset is released under the Open Data Commons Attribution License v1.0.

    Contact person

    The original Physionet CHB-MIT Scalp EEG Database was published by Ali Shoeb. This BIDS-compatible version of the dataset was published by Jonathan Dan.

    Practical information to access the data

    The original Physionet CHB-MIT Scalp EEG Database is available on the Physionet website.

    Overview

    Project name

    CHB-MIT Scalp EEG Database


    Year that the project ran

    2010

    Brief overview of the tasks in the experiment

    This database, collected at the Children's Hospital Boston, consists of EEG recordings from pediatric subjects with intractable seizures. Subjects were monitored for up to several days following withdrawal of anti-seizure medication in order to characterize their seizures and assess their candidacy for surgical intervention.

    Description of the contents of the dataset

    Each folder (sub-01, sub-02, etc.) contains between 9 and 42 continuous .edf files from a single subject. Hardware limitations resulted in gaps between consecutively numbered .edf files, during which the signals were not recorded; in most cases the gaps are 10 seconds or less, but occasionally there are much longer gaps. To protect the privacy of the subjects, all protected health information (PHI) in the original .edf files has been replaced with surrogate information in the files provided here. Dates in the original .edf files have been replaced by surrogate dates, but the time relationships between the individual files belonging to each case have been preserved. In most cases, the .edf files contain exactly one hour of digitized EEG signals, although those belonging to case sub-10 are two hours long, and those belonging to cases sub-04, sub-06, sub-07, sub-09, and sub-23 are four hours long; occasionally, files in which seizures are recorded are shorter.

    The EEG is recorded at 256 Hz with a 16-bit resolution. The recordings are referenced in a double banana bipolar montage with 18 channels from the 10-20 electrode system.

    The dataset also contains seizure annotations as start and stop times.

    The dataset contains 664 `.edf` recordings, 129 of which contain one or more seizures. In all, these records include 198 seizures.
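A minimal sketch of how such start/stop annotations might be tallied from a BIDS-style events TSV. The column names (`onset`, `duration`, `eventType`), the `sz` seizure label, and the `count_seizures` helper are assumptions for illustration, not the dataset's documented schema:

```python
# Hedged sketch: count seizure annotations in a tab-separated events
# file. Column names and the "sz" label are assumptions -- check the
# dataset's own annotation files before relying on them.
import csv
import io

def count_seizures(events_tsv):
    """Count rows whose event type marks a seizure."""
    reader = csv.DictReader(io.StringIO(events_tsv), delimiter="\t")
    return sum(1 for row in reader if row["eventType"].startswith("sz"))

# Made-up two-row example: one background segment, one seizure.
example = (
    "onset\tduration\teventType\n"
    "0.0\t3600.0\tbckg\n"
    "2996.0\t40.0\tsz\n"
)
# count_seizures(example) == 1
```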

    Methods

    Subjects

    23 pediatric subjects with intractable seizures (5 males, ages 3–22; 17 females, ages 1.5–19; 1 n/a).

    Apparatus

    Recordings were performed at the Children's Hospital Boston using the International 10-20 system of EEG electrode positions. Signals were sampled at 256 samples per second with 16-bit resolution.

  5. THINGS-data: MEG BIDS raw dataset

    • plus.figshare.com
    bin
    Updated Jun 1, 2023
    + more versions
    Cite
    Martin Hebart; Oliver Contier; Lina Teichmann; Adam Rockter; Charles Zheng; Alexis Kidder; Anna Corriveau; Maryam Vaziri-Pashkam; Chris Baker (2023). THINGS-data: MEG BIDS raw dataset [Dataset]. http://doi.org/10.25452/figshare.plus.20563800.v1
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Figshare+
    Authors
    Martin Hebart; Oliver Contier; Lina Teichmann; Adam Rockter; Charles Zheng; Alexis Kidder; Anna Corriveau; Maryam Vaziri-Pashkam; Chris Baker
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    MEG raw dataset in BIDS format.

    Part of THINGS-data: A multimodal collection of large-scale datasets for investigating object representations in brain and behavior.

    See related materials in Collection at: https://doi.org/10.25452/figshare.plus.c.6161151

  6. Converted dataset PROJECT_DAYS_P3_NUMBERS in BIDS standard.

    • doi.gin.g-node.org
    Updated May 26, 2021
    Cite
    Roman Mouček; Lukáš Vařeka; Petr Brůha (2021). Converted dataset PROJECT_DAYS_P3_NUMBERS in BIDS standard. [Dataset]. http://doi.org/10.12751/g-node.5rkqr4
    Dataset updated
    May 26, 2021
    Dataset provided by
    Faculty of Applied Sciences, University of West Bohemia
    Authors
    Roman Mouček; Lukáš Vařeka; Petr Brůha
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Converted dataset PROJECT_DAYS_P3_NUMBERS in BIDS standard. The data were converted from BrainVision format to BIDS format using a tool created during a master's thesis at the University of West Bohemia (ZČU).

  7. The Krakow Paradigm - fMRI datasets in BIDS format

    • data.niaid.nih.gov
    Updated Jan 1, 2021
    Cite
    Ewa Beldzik; Aleksandra Domagalik (2021). The Krakow Paradigm - fMRI datasets in BIDS format [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3555188
    Dataset updated
    Jan 1, 2021
    Dataset provided by
    Jagiellonian University
    Authors
    Ewa Beldzik; Aleksandra Domagalik
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Kraków
    Description

    Participants

    Forty-nine participants (mean age, 24.2 ± 3.7 years; 16 males) met the following experiment requirements: no contraindication for MRI scanning; normal or corrected-to-normal vision; no reported physical or psychiatric disorders; drug-free. To ensure sufficient experience with the environment, subjects had to have been Krakow residents for at least one year; they had lived in Krakow for an average of 9.1 years (SD 8.1).

    Participants were informed about the procedure and goals of the study and they gave written consent. The study was approved by the bioethics commission at the Polish Military Institute of Aviation Medicine and was conducted in accordance with ethical standards described in the Declaration of Helsinki. The study was a part of a larger registered project (ISRCTN 18109340).

    Experimental Task

    A novel place recognition task, the Krakow Paradigm, was prepared and generated using E-Prime 2.0 (©Psychology Software Tools). The task comprised two stages: a training session and an fMRI session. Before the training session, subjects were presented with a map of Krakow on which a thick red line marked the city “center” area and were asked to familiarize themselves with its borders.

    Each trial comprised a stimulus (4.5 sec duration) and two response screens (each 1.0 sec duration), all separated by blank screens (each 0.5 sec duration). The stimulus was a photograph taken in Krakow (resolution 640 x 428), presenting either a characteristic landmark (e.g. the Old Square) or an uncharacteristic outdoor place (e.g. a playground near a housing estate). The photograph was presented centrally on a light-gray background and covered 60% of the screen. On the first response screen, the question “Krakow Center?” appeared with three possible answers (‘yes’, ‘no’, ‘I don’t know’) given by pressing a button on a key-pad with the right-hand index, middle, or ring finger, respectively. On the second response screen, the question “Have you seen it in real life?” appeared with two possible answers (‘yes’, ‘no’) given by pressing a button with the index or middle finger, respectively. For both questions, responses were recorded for 1.5 sec. Between stimuli, a fixation point (a hash sign) was presented for a varying interval between 2.4 and 6.6 sec (in 0.7 sec steps); on average, total trial length was 12 sec. Total scan time was less than 13 minutes.
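The stated 12 sec average trial length can be checked against the component durations above, assuming two blank screens per trial and the mean fixation interval of (2.4 + 6.6)/2 = 4.5 sec:

```python
# Sanity check of the trial timing described above (all values in
# seconds). The "two blanks per trial" breakdown is an assumption
# consistent with the stimulus/response sequence as described.
stimulus = 4.5
blanks = 2 * 0.5          # one after the stimulus, one between responses
responses = 2 * 1.0       # two response screens
mean_fixation = (2.4 + 6.6) / 2
trial_length = stimulus + blanks + responses + mean_fixation
# trial_length is 12.0, matching the reported average trial length
```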

    The training session was conducted to ensure timely responses. It was comprised of 7 trials, different than those used in the fMRI session, and was presented on regular computer screen. The fMRI session included 60 trials and was presented using the VisualSystem HD (NordicNeuroLab, Bergen, Norway) binocular apparatus. 50% of the photos were taken in the “center” and 50% outside of it. Characteristic and uncharacteristic places were counterbalanced across both location possibilities. At the end of the task, a feedback information was given to participants informing them the percentage of correctly classified places. Because participants were instructed to wait until the response screen appeared before making a response, reaction times are not informative and were not reported. The rationale for this procedure was to promote accuracy rather than speed, and to encourage response preparation, i.e. memory retrieval, while looking at a photo.

    MRI Data Acquisition

    MRI was performed using a 3T scanner (Magnetom Skyra, Siemens) with a 20-channel head/neck coil. High-resolution, whole-brain anatomical images were acquired using a T1-MPRAGE sequence. A total of 176 sagittal slices were obtained (voxel size 1×1×1.1 mm3; TR = 2300 ms, TE = 2.98 ms, flip angle = 9°) for co-registration with the fMRI data. Next, a B0 inhomogeneity gradient fieldmap (magnitude and phase images) was acquired with a dual-echo gradient-echo sequence, matched spatially with fMRI scans (TE1 = 4.92 ms, TE2 = 7.38 ms, TR = 400 ms).

    Functional T2*-weighted images were acquired using a whole-brain echo planar imaging (EPI) pulse sequence with the following parameters: 3 mm isotropic voxel; TR = 2070 ms; TE = 30 ms; flip angle = 90°; FOV 224 × 224 mm2; GRAPPA acceleration factor 2; and phase encoding A/P. Due to magnetic saturation effects, the first four volumes (dummy scans) of each session were discarded, resulting in 360 volumes acquired for each participant.
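The dummy-scan discard amounts to dropping the first four time points of the 4D EPI series. A minimal sketch, where the spatial dimensions are stand-ins and only the 360 retained volumes come from the text above:

```python
# Sketch of discarding the four dummy volumes from a 4D EPI time
# series. Spatial dimensions (64 x 64 x 36) are stand-in values.
import numpy as np

epi = np.zeros((64, 64, 36, 364))  # x, y, z, time (as acquired)
epi_trimmed = epi[..., 4:]         # drop the four dummy scans
assert epi_trimmed.shape[-1] == 360  # the 360 volumes reported above
```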

  8. Example DWI Dataset including minimally preprocessed and co-registered data

    • zenodo.org
    zip
    Updated Apr 27, 2020
    Cite
    Gregory Kiar (2020). Example DWI Dataset including minimally preprocessed and co-registered data [Dataset]. http://doi.org/10.5281/zenodo.3767048
    Dataset updated
    Apr 27, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Gregory Kiar
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Includes a minimally preprocessed and co-registered dataset for an example subject, containing both diffusion-weighted and T1-weighted MR images in BIDS format.

    The dataset in the root directory (i.e. starting with /sub-) should be used as input to many end-to-end pipelines.

    The dataset in the preprocessed directory (i.e. starting with /derivatives/preproc/) should be used as input to modelling pipelines such as tractometry or connectivity analysis.

  9. Social Observation EEG raw data

    • openneuro.org
    Updated Aug 12, 2025
    Cite
    Yaner Su (2025). Social Observation EEG raw data [Dataset]. http://doi.org/10.18112/openneuro.ds006554.v1.0.0
    Dataset updated
    Aug 12, 2025
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Yaner Su
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    README

    WARNING

    Below is a template to write a README file for this BIDS dataset. If this message is still present, it means that the person exporting the file has decided not to update the template. If you are the researcher editing this README file, please remove this warning section.

    The README is usually the starting point for researchers using your data and serves as a guidepost for users of your data. A clear and informative README makes your data much more usable.

    In general you can include information in the README that is not captured by some other files in the BIDS dataset (dataset_description.json, events.tsv, ...). It can also be useful to include information that might already be present in another file of the dataset but might be important for users to be aware of before preprocessing or analysing the data.

    If the README gets too long you have the possibility to create a /doc folder and add it to the .bidsignore file to make sure it is ignored by the BIDS validator. More info here: https://neurostars.org/t/where-in-a-bids-dataset-should-i-put-notes-about-individual-mri-acqusitions/17315/3

    Details related to access to the data

    • [ ] Data user agreement: If the dataset requires a data user agreement, link to the relevant information.
    • [ ] Contact person: Indicate the name and contact details (email and ORCID) of the person responsible for additional information.
    • [ ] Practical information to access the data: If there is any special information related to access rights or how to download the data, make sure to include it. For example, if the dataset was curated using datalad, make sure to include the relevant section from the datalad handbook: http://handbook.datalad.org/en/latest/basics/101-180-FAQ.html#how-can-i-help-others-get-started-with-a-shared-dataset

    ## Overview

    • [ ] Project name (if relevant)
    • [ ] Year(s) that the project ran: If no scans.tsv is included, this could at least cover when the data acquisition started and ended. Local time of day is particularly relevant to subject state.
    • [ ] Brief overview of the tasks in the experiment: A paragraph giving an overview of the experiment. This should include the goals or purpose and a discussion about how the experiment tries to achieve these goals.
    • [ ] Description of the contents of the dataset: An easy thing to add is the output of the bids-validator that describes what type of data and the number of subjects one can expect to find in the dataset.
    • [ ] Independent variables: A brief discussion of condition variables (sometimes called contrasts or independent variables) that were varied across the experiment.
    • [ ] Dependent variables: A brief discussion of the response variables (sometimes called the dependent variables) that were measured and/or calculated to assess the effects of varying the condition variables. This might also include questionnaires administered to assess behavioral aspects of the experiment.
    • [ ] Control variables: A brief discussion of the control variables, that is, what aspects were explicitly controlled in this experiment. The control variables might include subject pool, environmental conditions, set up, or other things that were explicitly controlled.
    • [ ] Quality assessment of the data: Provide a short summary of the quality of the data, ideally with descriptive statistics if relevant, and with a link to a more comprehensive description (like with MRIQC) if possible.

    ## Methods

    ### Subjects

    A brief sentence about the subject pool in this experiment. Remember that Control or Patient status should be defined in the participants.tsv using a group column.

    • [ ] Information about the recruitment procedure
    • [ ] Subject inclusion criteria (if relevant)
    • [ ] Subject exclusion criteria (if relevant)

    ### Apparatus

    A summary of the equipment and environment setup for the experiment. For example, was the experiment performed in a shielded room with the subject seated in a fixed position?

    ### Initial setup

    A summary of what setup was performed when a subject arrived.

    ### Task organization

    How the tasks were organized for a session. This is particularly important because BIDS datasets usually have task data separated into different files.

    • [ ] Was task order counter-balanced?
    • [ ] What other activities were interspersed between tasks?
    • [ ] In what order were the tasks and other activities performed?

    ### Task details

    As much detail as possible about the task and the events that were recorded.

    ### Additional data acquired

    A brief indication of data other than the imaging data that was acquired as part of this experiment. In addition to data from other modalities and behavioral data, this might include questionnaires and surveys, swabs, and clinical information. Indicate the availability of this data. This is especially relevant if the data are not included in a phenotype folder. https://bids-specification.readthedocs.io/en/stable/03-modality-agnostic-files.html#phenotypic-and-assessment-data

    ### Experimental location

    This should include any additional information regarding the geographical location and facility that cannot be included in the relevant json files.

    ### Missing data

    Mention something if some participants are missing some aspects of the data. This can take the form of a processing log and/or abnormalities about the dataset. Some examples:

    • A brain lesion or defect only present in one participant
    • Some experimental conditions missing on a given run for a participant because of some technical issue.
    • Any noticeable feature of the data for certain participants
    • Differences (even slight) in protocol for certain participants.

    ### Notes

    Any additional information or pointers to information that might be helpful to users of the dataset. Include qualitative information related to how the data acquisition went.
  10. Structural and functional connectomes and region-average fMRI from 50 healthy participants, age range 18-80 years

    • search.kg.ebrains.eu
    Updated Mar 17, 2025
    + more versions
    Cite
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter (2025). Structural and functional connectomes and region-average fMRI from 50 healthy participants, age range 18-80 years [Dataset]. http://doi.org/10.25493/6CKF-MJS
    Dataset updated
    Mar 17, 2025
    Authors
    Jil M. Meier; Paul Triebkorn; Michael Schirner; Petra Ritter
    Description

    We present processed multimodal empirical data and a study with The Virtual Brain (TVB) based on these data. Structural and functional data have been prepared in accordance with Brain Imaging Data Structure (BIDS) standards and annotated according to the openMINDS metadata framework. Simultaneous electroencephalography (EEG) - functional magnetic resonance imaging (fMRI) resting-state data, diffusion-weighted MRI (dwMRI), and structural MRI were acquired from 50 healthy adult subjects (18-80 years of age, mean 41.24±18.33; 31 females, 19 males) at the Berlin Center for Advanced Imaging, Charité University Medicine, Berlin, Germany. In a previous study (Triebkorn et al. 2024), we constructed personalized models from these multimodal data with TVB. We present this large, comprehensive processed dataset in an annotated and structured format following the BIDS standard for MRI derivatives and the BIDS Extension Proposal for computational modeling data, and we describe how we processed and converted the diverse data sources to make them reusable. In its current form, this dataset can be reused for further research and provides ready-to-use data at various levels of processing for a large set of healthy subjects with a wide age range.

  11. NOD-EEG

    • openneuro.org
    Updated Jan 12, 2025
    Cite
    Guohao Zhang; Yaoze Liu; Zeng Li; Zonglei Zhen (2025). NOD-EEG [Dataset]. http://doi.org/10.18112/openneuro.ds005811.v1.0.0
    Explore at:
    Dataset updated
    Jan 12, 2025
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    Guohao Zhang; Yaoze Liu; Zeng Li; Zonglei Zhen
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    References

    Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Höchenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

    Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

  12. MIND DATA

    • openneuro.org
    Updated Apr 22, 2022
    Cite
    M.P. Weisend; F.M. Hanlon; R. Montano; S.P. Ahlfors; A.C. Leuthold; D. Pantazis; J.C. Mosher; A.P. Georgopoulos; M.S. Hamalainen; C.J. Aine (2022). MIND DATA [Dataset]. http://doi.org/10.18112/openneuro.ds004107.v1.0.0
    Explore at:
    Dataset updated
    Apr 22, 2022
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    M.P. Weisend; F.M. Hanlon; R. Montano; S.P. Ahlfors; A.C. Leuthold; D. Pantazis; J.C. Mosher; A.P. Georgopoulos; M.S. Hamalainen; C.J. Aine
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    This data was part of the study of:

    M.P. Weisend, F.M. Hanlon, R. Montaño, S.P. Ahlfors, A.C. Leuthold, D. Pantazis, J.C. Mosher, A.P. Georgopoulos, M.S. Hämäläinen, C.J. Aine (2007). Paving the way for cross-site pooling of magnetoencephalography (MEG) data. International Congress Series, Volume 1300, Pages 615-618.

    It was converted to BIDS with MNE-BIDS:

    Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Höchenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

    Following the MEG-BIDS format:

    Niso, G., Gorgolewski, K. J., Bock, E., Brooks, T. L., Flandin, G., Gramfort, A., Henson, R. N., Jas, M., Litvak, V., Moreau, J., Oostenveld, R., Schoffelen, J., Tadel, F., Wexler, J., Baillet, S. (2018). MEG-BIDS, the brain imaging data structure extended to magnetoencephalography. Scientific Data, 5, 180110. https://doi.org/10.1038/sdata.2018.110

  13. Data from: Visuo-haptic prediction errors: a multimodal dataset (EEG, motion) in BIDS format indexing mismatches in haptic interaction

    • resodate.org
    Updated Jul 3, 2024
    Cite
    Lukas Gehrke; Leonie Terfurth; Sezen Akman; Klaus Gramann (2024). Visuo-haptic prediction errors: a multimodal dataset (EEG, motion) in BIDS format indexing mismatches in haptic interaction [Dataset]. http://doi.org/10.14279/depositonce-20829
    Explore at:
    Dataset updated
    Jul 3, 2024
    Dataset provided by
    Technische Universität Berlin
    DepositOnce
    Authors
    Lukas Gehrke; Leonie Terfurth; Sezen Akman; Klaus Gramann
    Description

    One of the key challenges in the design of immersive virtual reality (VR) is to create an experience that mimics the natural, real world as closely as possible. The overarching goal is that users “treat what they perceive as real” and consequently feel present in the virtual world (Slater, 2009). To feel present in an environment, users need to establish a dynamic and precise interaction with their surroundings. This allows users to infer the causal structures in the (virtual) world they find themselves in and develop strategies to deal with uncertainties (Knill and Pouget, 2004). Here, we present a data set that indexes interaction realism in VR. By violating users' predictions about the VR's interaction behavior in an “oddball-like” manner (Sutton et al., 1965), labels with high temporal resolution were obtained (that describe the interaction); see our previous publications (Gehrke et al., 2019, 2022).

  14. UNITE example dataset

    • scidb.cn
    Updated Jul 8, 2025
    Cite
    Wei Zhang; Shenshen Li; Ning An; Jianting Huang; Danhong Wang; Jianxun Ren; Hesheng Liu (2025). UNITE example dataset [Dataset]. http://doi.org/10.57760/sciencedb.27339
    Explore at:
    Croissant is a format for machine-learning datasets. Learn more about this at mlcommons.org/croissant.
    Dataset updated
    Jul 8, 2025
    Dataset provided by
    Science Data Bank
    Authors
    Wei Zhang; Shenshen Li; Ning An; Jianting Huang; Danhong Wang; Jianxun Ren; Hesheng Liu
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    This dataset includes neuroimaging and clinical data from three patients with stroke, collected as part of the UNITE study. All data are organized according to the Brain Imaging Data Structure (BIDS) format to facilitate reproducibility and standardized processing. The dataset contains T1-weighted structural MRI and resting-state fMRI for each subject.

  15. Preprocessed IXI dataset with FS8

    • kaggle.com
    zip
    Updated Sep 10, 2025
    Cite
    KingPowa (2025). Preprocessed IXI dataset with FS8 [Dataset]. https://www.kaggle.com/datasets/kingpowa/preprocessed-ixi-dataset-with-fs8/code
    Explore at:
    zip (933198881 bytes). Available download formats
    Dataset updated
    Sep 10, 2025
    Authors
    KingPowa
    License

    Apache License, v2.0 (https://www.apache.org/licenses/LICENSE-2.0)
    License information was derived automatically

    Description

    This is a collection of structural T1w MRI scans from the well-known IXI dataset. The data was preprocessed using FreeSurfer 8. It does not include all the accessory data from the preprocessing pipeline, but it does include the skull-stripped structural brain (brainmask.mgz) and the aseg mask. Additional preprocessing steps were applied, which are briefly described below.

    The main folder (IXI) is organised in a semi-BIDS format. Each subject has a specific folder (sub-IXI[digits]) with additional subfolders:

    _IXI
    |_sub-IXI001 <- subject folder
    | |_ses-1 <- session folder
    | | |_run-1 <- run folder
    | | | |_anat <- anatomical folder
    | | | | |_sub-IXI001_acq-GE-1.5T_mni_registered_T1w.nii <- structural T1w scan
    | | | | |_sub-IXI001_acq-GE-1.5T_segmask_mni_registered_T1w.nii <- segmentation mask

    The structural T1w scan is obtained from the brainmask.mgz output of FS8 via a preprocessing script that performs the following operations: mri_convert on the brainmask; fslreorient2std; and a flirt affine registration to the MNI152 standard template, whose transformation matrix is also applied to the segmentation mask.

    Additionally, two CSV files are provided:

    • subjects.csv contains demographic data of the subjects, linking to the relative paths of the subject folders
    • thickness.csv contains derived cortical thickness measures obtained via the DKTatlas parcellation annotation (aparc.DKTatlas.annot). Both mean_thickness_weighted and mean_thickness_simple are provided: the former is an average weighted by vertex area, while the latter is a simple mean. Information about the computation can be found in the related GitHub repo (TODO).
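    The folder layout described above can be expressed programmatically. Here is a minimal sketch; the ixi_paths helper is hypothetical (not part of the dataset) and simply builds the file paths a consumer of this layout would expect:

    ```python
    import os

    # Hypothetical helper: builds the expected T1w and segmentation-mask paths
    # for one subject under the semi-BIDS layout described above.
    def ixi_paths(root, sub_num, acq="GE-1.5T", ses=1, run=1):
        sub = f"sub-IXI{sub_num:03d}"
        anat = os.path.join(root, sub, f"ses-{ses}", f"run-{run}", "anat")
        t1w = os.path.join(anat, f"{sub}_acq-{acq}_mni_registered_T1w.nii")
        seg = os.path.join(anat, f"{sub}_acq-{acq}_segmask_mni_registered_T1w.nii")
        return t1w, seg

    t1w, seg = ixi_paths("IXI", 1)
    print(t1w)
    print(seg)
    ```

    A loader script could iterate over the subject IDs listed in subjects.csv and call this helper to locate each scan/mask pair.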

  16. Data

    • uvaauas.figshare.com
    bin
    Updated Dec 6, 2019
    Cite
    S.J.S. Isherwood (2019). Data [Dataset]. http://doi.org/10.21942/uva.8241596.v1
    Explore at:
    bin. Available download formats
    Dataset updated
    Dec 6, 2019
    Dataset provided by
    University of Amsterdam / Amsterdam University of Applied Sciences
    Authors
    S.J.S. Isherwood
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Files for subjects 0218, 0296, 0413, 0435, 0457, 0471, 0520, 0582, 0621, 0672, 0681, 0814, 0827, 0869, 0884 containing AHEAD structural MRI data in BIDS format.

  17. Open Bid Opportunities

    • catalog.data.gov
    Updated Jan 4, 2026
    Cite
    data.sfgov.org (2026). Open Bid Opportunities [Dataset]. https://catalog.data.gov/dataset/open-bid-opportunities
    Explore at:
    Dataset updated
    Jan 4, 2026
    Dataset provided by
    data.sfgov.org
    Description

    The San Francisco Controller's Office maintains a database of open bid opportunities sourced from its citywide financial system. The data is available in this dataset in CSV format and is updated on a weekly basis.

  18. Data acquired to demonstrate model-based Bayesian inference of brain oxygenation using quantitative BOLD

    • ora.ox.ac.uk
    zip
    Updated Jan 1, 2018
    Cite
    Cherukara, M; Chappell, M; Stone, A; Blockley, N (2018). Data acquired to demonstrate model-based Bayesian inference of brain oxygenation using quantitative BOLD [Dataset]. http://doi.org/10.5287/bodleian:6R5px9K0X
    Explore at:
    zip (144565658). Available download formats
    Dataset updated
    Jan 1, 2018
    Dataset provided by
    University of Oxford
    Authors
    Cherukara, M; Chappell, M; Stone, A; Blockley, N
    License

    Open Data Commons Attribution License (ODC-By) v1.0 (https://www.opendatacommons.org/licenses/by/1.0/)
    License information was derived automatically

    Description

    This dataset will form the basis of a forthcoming publication regarding a model-based Bayesian analysis of streamlined quantitative BOLD data to measure brain oxygenation. In the absence of a reference to this publication, the methods used are outlined here. Please reference this dataset if you use it in your work.

    Cherukara MT, Stone AJ, Chappell MA, Blockley NP. Data acquired to demonstrate model-based Bayesian inference of brain oxygenation using quantitative BOLD, Oxford University Research Archive 2018. doi: 10.5287/bodleian:6R5px9K0X

    *Summary*

    Data for Study 1 (7 subjects) of this dataset were acquired as part of a previous study [1] and are also available via the Oxford University Research Archive [2]. Data for Study 2 (5 subjects) were acquired to demonstrate and validate a model-based Bayesian analysis method for similar data. The aim in this part was to see whether a model-based correction for CSF signal, using an independent CSF partial volume estimate, could be used in place of a FLAIR preparation [3] in order to improve image SNR and reduce total scan time. Images were acquired using the streamlined qBOLD protocol [1], both with and without FLAIR preparation, using an Asymmetric Spin Echo (ASE) pulse sequence [4] with Gradient Echo Slice Excitation Profile Imaging (GESEPI) incorporated to minimise the effect of through-slice magnetic field gradients [5].

    *MRI data*

    Images were acquired using a Siemens Magnetom Verio scanner at 3T. The body coil was used for transmission and the manufacturer's 32-channel head coil for reception. For Study 1, GESEPI ASE (GASE) data were acquired with a field of view of 240x240 mm2, a 64x64 matrix, ten 5mm slices, TR/TE=3s/74ms, and an EPI bandwidth of 2004Hz/px. ASE images are acquired with varying amounts of R2′ weighting determined by the spin echo displacement time, tau. Twenty-four values of tau were acquired for each GASE scan: -28, -24, -20, -16, -12, -8, -4, 0, 4, 8, 12, 16, 20, 24, 28, 32, 36, 40, 44, 48, 52, 56, 60, and 64ms. For Study 2, GASE data were acquired with a 220x220 mm2 field of view, a 96x96 matrix, eight 5mm slices, TR/TE=3s/82ms, an EPI bandwidth of 2004Hz/px, and eleven tau values: -16, -8, 0, 8, 16, 24, 32, 40, 48, 56, and 64ms. For all subjects, the GESEPI magnetic field gradient correction technique required each 5mm slice to be encoded into multiple thin partitions, each 1.25mm thick. Furthermore, partitions were oversampled by 100%, leading to the acquisition of 8 partitions per slice. Oversampled slices were discarded during reconstruction, resulting in 40 slices being acquired for each tau value. To regain signal-to-noise ratio, summing the slices in blocks of four is suggested; this results in the original number of prescribed slices. A FLAIR preparation was used to null the signal from CSF, with an inversion time of 1.21s based on literature values for T1 and T2 of CSF [3]. In Study 2, GASE data were also acquired with the same protocol but without the FLAIR preparation, along with a single GASE volume with the same parameters except for a TE of 250ms and a tau of 0ms, and a set of eight 2D spin echo volumes with the same dimensions as the GASE data, acquired with TE values uniformly spaced from 66 to 248ms. These were used to generate T2-weighted estimates of CSF partial volume.
High resolution T1 weighted anatomical images were acquired for registration and the generation of tissue masks and T1 weighted CSF partial volume estimates. Anatomicals were “defaced” using the shell script in the code directory [6].
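    The suggested GESEPI recombination (for Study 1, summing the 40 reconstructed thin partitions per tau value in blocks of four to recover the ten prescribed 5mm slices) can be sketched as follows; the numbers stand in for per-partition image data and are purely illustrative:

    ```python
    def sum_in_blocks(partitions, block=4):
        """Sum consecutive partitions in non-overlapping blocks of `block`."""
        assert len(partitions) % block == 0
        return [sum(partitions[i:i + block]) for i in range(0, len(partitions), block)]

    partitions = list(range(40))        # 40 thin partitions per tau value
    slices = sum_in_blocks(partitions)  # 10 recombined slices, as prescribed
    print(len(slices))
    ```

    In practice the same block sum would be applied voxel-wise to the image volumes rather than to scalars.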

    *Data curation*

    The structure in which this data has been placed is based on the Brain Imaging Data Structure (BIDS) format [7]. However, this format (BIDS version 1.0.0-rc2) does not support ASE data, so we have followed the guiding principles of the specification.

    *References*

    1. Stone AJ, Blockley NP. A streamlined acquisition for mapping baseline brain oxygenation using quantitative BOLD. NeuroImage 2017:147:79-88.

    2. Stone AJ, Blockley NP. Data acquired to demonstrate a streamlined approach to mapping and quantifying brain oxygenation using quantitative BOLD. Oxford University Research Archive 2016. doi: 10.5287/Bodleian:E24JbXQwO.

    3. Hajnal JV, Bryant DJ, Kasuboski L, Pattany PM, De Coene B, Lewis PD, Pennock JM, Oatridge A, Young IR, Bydder GM. Use of fluid attenuated inversion recovery (FLAIR) pulse sequences in MRI of the brain. J Comput Assist Tomogr 1992;16:841–844.

    4. Wismer GL, Buxton RB, Rosen BR, Fisel CR, Oot RF, Brady TJ, Davis KR. Susceptibility induced MR line broadening: applications to brain iron mapping. J Comput Assist Tomogr 1988;12:259–265.

    5. Blockley NP, Stone AJ. Improving the specificity of R2′ to the deoxyhaemoglobin content of brain tissue: Prospective correction of macroscopic magnetic field gradients. Neuroimage 2016, in press. doi: 10.1016/j.neuroimage.2016.04.013

    6. https://github.com/hanke/gumpdata/blob/master/scripts/conversion/convert_dicoms_anatomy

    7. http://bids.neuroimaging.io

  19. Decoding natural sounds in early 'visual' cortex of congenitally blind individuals

    • search.kg.ebrains.eu
    Updated Jun 10, 2020
    Cite
    Petra Vetter; Lukasz Bola; Lior Reich; Matthew Bennett; Lars Muckli; Amir Amedi (2020). Decoding natural sounds in early 'visual' cortex of congenitally blind individuals [Dataset]. http://doi.org/10.18112/openneuro.ds002715.v1.0.0
    Explore at:
    Dataset updated
    Jun 10, 2020
    Authors
    Petra Vetter; Lukasz Bola; Lior Reich; Matthew Bennett; Lars Muckli; Amir Amedi
    Description

    The dataset contains raw functional MRI data from 8 congenitally blind individuals who listened to natural sounds while being scanned, as well as corresponding T1-weighted anatomical images and events files with trial order, onsets, and durations. The dataset is provided in BIDS format.

  20. HUP iEEG Epilepsy Dataset

    • openneuro.org
    Updated Sep 21, 2022
    Cite
    John M. Bernabei; Adam Li; Andrew Y. Revell; Rachel J. Smith; Kristin M. Gunnarsdottir; Ian Z. Ong; Kathryn A. Davis; Nishant Sinha; Sridevi Sarma; Brian Litt (2022). HUP iEEG Epilepsy Dataset [Dataset]. http://doi.org/10.18112/openneuro.ds004100.v1.1.0
    Explore at:
    Dataset updated
    Sep 21, 2022
    Dataset provided by
    OpenNeuro (https://openneuro.org/)
    Authors
    John M. Bernabei; Adam Li; Andrew Y. Revell; Rachel J. Smith; Kristin M. Gunnarsdottir; Ian Z. Ong; Kathryn A. Davis; Nishant Sinha; Sridevi Sarma; Brian Litt
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    HUP iEEG dataset

    This dataset was prepared for release as part of a manuscript by Bernabei & Li et al. (in preparation). A subset of the data has been featured in Kini & Bernabei et al., Brain (2019) [1], and Bernabei & Sinha et al., Brain (2022) [2].

    Dataset description

    These files contain de-identified patient data collected as part of surgical treatment for drug resistant epilepsy at the Hospital of the University of Pennsylvania. Each of the 58 subjects underwent intracranial EEG with subdural grid, strip, and depth electrodes (ECoG) or purely stereotactically-placed depth electrodes (SEEG). Each patient also underwent subsequent treatment with surgical resection or laser ablation. Electrophysiologic data for both interictal and ictal periods is available, as are electrode localizations in ICBM152 MNI space. Furthermore, clinically-determined seizure onset channels are provided, as are channels which overlap with the resection/ablation zone, which was rigorously determined by segmenting the resection cavity.

    BIDS Conversion

    MNE-BIDS was used to convert the dataset into BIDS format.

    References

    [1] Kini L.*, Bernabei J.M.*, Mikhail F., Hadar P., Shah P., Khambhati A., Oechsel K., Archer R., Boccanfuso J.A., Conrad E., Stein J., Das S., Kheder A., Lucas T.H., Davis K.A., Bassett D.S., Litt B., Virtual resection predicts surgical outcome for drug resistant epilepsy. Brain, 2019.

    [2] Bernabei J.M.*, Sinha N.*, Arnold T.C., Conrad E., Ong I., Pattnaik A.R., Stein J.M., Shinohara R.T., Lucas T.H., Bassett D.S., Davis K.A., Litt B., Normative intracranial EEG maps epileptogenic tissues in focal epilepsy. Brain, 2022.

    [3] Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Höchenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A. and Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

    [4] Holdgraf, C., Appelhoff, S., Bickel, S., Bouchard, K., D'Ambrosio, S., David, O., … Hermes, D. (2019). iEEG-BIDS, extending the Brain Imaging Data Structure specification to human intracranial electrophysiology. Scientific Data, 6, 102. https://doi.org/10.1038/s41597-019-0105-7
