20 datasets found
  1. EEG_Auditory_Oddball_Preprocessed_Data

    • figshare.com
    bin
    Updated Jan 31, 2019
    + more versions
    Cite
    Clare D Harris; Elise G Rowe; Roshini Randeniya; Marta I Garrido (2019). EEG_Auditory_Oddball_Preprocessed_Data [Dataset]. http://doi.org/10.6084/m9.figshare.5812764.v1
    Available download formats: bin
    Dataset updated
    Jan 31, 2019
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Clare D Harris; Elise G Rowe; Roshini Randeniya; Marta I Garrido
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset was obtained at the Queensland Brain Institute, Australia, using a 64-channel BioSemi EEG system. 21 healthy participants completed an auditory oddball paradigm, as described in Garrido et al. (2017): Garrido, M.I., Rowe, E.G., Halasz, V., & Mattingley, J. (2017). Bayesian mapping reveals that attention boosts neural responses to predicted and unpredicted stimuli. Cerebral Cortex, 1-12. DOI: 10.1093/cercor/bhx087.

    If you use this dataset, please cite its DOI, as well as the associated methods paper: Harris, C.D., Rowe, E.G., Randeniya, R., and Garrido, M.I. (2018). Bayesian Model Selection Maps for group studies using M/EEG data.

    For scripts to analyse the data, please see: https://github.com/ClareDiane/BMS4EEG

  2. Data set for a comprehensive tutorial on the SOM-RPM toolbox for MATLAB

    • opal.latrobe.edu.au
    • researchdata.edu.au
    hdf
    Updated Aug 22, 2024
    Cite
    Sarah Bamford; Wil Gardner; Paul Pigram; Ben Muir; David Winkler; Davide Ballabio (2024). Data set for a comprehensive tutorial on the SOM-RPM toolbox for MATLAB [Dataset]. http://doi.org/10.26181/25648905.v2
    Available download formats: hdf
    Dataset updated
    Aug 22, 2024
    Dataset provided by
    La Trobe
    Authors
    Sarah Bamford; Wil Gardner; Paul Pigram; Ben Muir; David Winkler; Davide Ballabio
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    This data set is uploaded as supporting information for the publication entitled "A Comprehensive Tutorial on the SOM-RPM Toolbox for MATLAB". The attached file 'case_study' includes the following:

    • X: Data from a ToF-SIMS hyperspectral image; a stage raster containing 960 x 800 pixels with 963 associated m/z peaks.
    • pk_lbls: The m/z label for each of the 963 m/z peaks.
    • mdl and mdl_masked: SOM-RPM models created using the SOM-RPM tutorial provided within the cited article.

    Additional details about the datasets can be found in the published article. V2 contains modified peak lists that show intensity-weighted m/z rather than the peak midpoint. If you use this data set in your work, please cite our work as follows: [LINK TO BE ADDED TO PAPER ONCE DOI RECEIVED]
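    As a rough illustration of the layout described above, the sketch below builds a synthetic pixels-by-peaks matrix and reshapes it into an image cube. This is a hypothetical NumPy stand-in (the released file is MATLAB/HDF, and the variable names here are not the toolbox's own); the raster is scaled down from the stated 960 x 800 to keep it light.

```python
import numpy as np

# Toy stand-in for the stated layout: a stage raster of pixels, each with
# 963 associated m/z peak intensities. Dimensions are scaled down from the
# full 960 x 800 raster; the values are synthetic, not the released data.
n_x, n_y, n_peaks = 96, 80, 963

# Hyperspectral data are commonly handled as a pixels-by-peaks matrix for
# modelling (e.g. SOM training) and reshaped to a cube for ion images.
X = np.zeros((n_y * n_x, n_peaks), dtype=np.float32)
cube = X.reshape(n_y, n_x, n_peaks)
ion_image = cube[:, :, 0]  # intensity map for the first m/z peak

print(X.shape, cube.shape, ion_image.shape)
```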

  3. How to set the input parameters: an example.

    • plos.figshare.com
    xls
    Updated Jun 1, 2023
    Cite
    Alessandro Montalto; Luca Faes; Daniele Marinazzo (2023). How to set the input parameters: an example. [Dataset]. http://doi.org/10.1371/journal.pone.0109462.t001
    Available download formats: xls
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Alessandro Montalto; Luca Faes; Daniele Marinazzo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    How to set the input parameters: an example.

  4. Example of the parameters required to define the methods for an experiment on 5 variables.

    • plos.figshare.com
    xls
    Updated May 30, 2023
    Cite
    Alessandro Montalto; Luca Faes; Daniele Marinazzo (2023). Example of the parameters required to define the methods for an experiment on 5 variables. [Dataset]. http://doi.org/10.1371/journal.pone.0109462.t002
    Available download formats: xls
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Alessandro Montalto; Luca Faes; Daniele Marinazzo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Example of the parameters required to define the methods for an experiment on 5 variables. In the second column the instantaneous effects are neglected both for targets and conditioning. In the third column we set instantaneous effects for some drivers and the respective targets. For example, when the target is 1, instantaneous effects are taken into account for driver 2 (first two rows, right column, parameter idDrivers) and conditioning variable 3 (first row, right column, parameter idOtherLagZero).

  5. Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024

    • catalog.data.gov
    Updated Nov 5, 2024
    Cite
    U.S. Geological Survey (2024). Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024 [Dataset]. https://catalog.data.gov/dataset/remotely-sensed-data-field-measurements-and-matlab-code-used-to-produce-image-derived-v-19
    Dataset updated
    Nov 5, 2024
    Dataset provided by
    U.S. Geological Survey
    Area covered
    Sacramento River, Glenn
    Description

    This data release provides remotely sensed data, field measurements, and MATLAB code associated with an effort to produce image-derived velocity maps for a reach of the Sacramento River in California's Central Valley. Data collection occurred from September 16-19, 2024, and involved cooperators from the Intelligent Robotics Group of the National Aeronautics and Space Administration (NASA) Ames Research Center and the National Oceanic and Atmospheric Administration (NOAA) Southwest Fisheries Science Center. The remotely sensed data were obtained from an Uncrewed Aircraft System (UAS) and are stored in Robot Operating System (ROS) .bag files. Within these files, the various data types are organized into ROS topics, including: images from a thermal camera, measurements of the distance from the UAS down to the water surface made with a laser range finder, and position and orientation data recorded by a Global Navigation Satellite System (GNSS) receiver and Inertial Measurement Unit (IMU) during the UAS flights. This instrument suite is part of an experimental payload called the River Observing System (RiOS), designed for measuring streamflow; further detail is provided in the metadata file associated with this data release.

    For the September 2024 test flights, the RiOS payload was deployed from a DJI Matrice M600 Pro hexacopter hovering approximately 270 m above the river. At this altitude, the thermal images have a pixel size of approximately 0.38 m but are not geo-referenced. Two types of ROS .bag files are provided in separate zip folders. The first, Baguettes.zip, contains "baguettes" that include 15-second subsets of data with a reduced sampling rate for the GNSS and IMU. The second, FullBags.zip, contains the full set of ROS topics recorded by RiOS but has been subset to include only the time ranges during which the UAS was hovering in place over one of 11 cross sections along the reach. The start times are included in the .bag file names as portable operating system interface (POSIX) time stamps. To view the data within ROS .bag files, the Foxglove Studio program linked below is freely available and provides a convenient interface. Note that to view the thermal images, the contrast will need to be adjusted to minimum and maximum values around 12,000 to 15,000, though some further refinement of these values might be necessary to enhance the display.

    To enable geo-referencing of the thermal images in a post-processing mode, another M600 hexacopter equipped with a standard visible camera was deployed along the river to acquire images from which an orthophoto was produced: 20240916_SacramentoRiver_Ortho_5cm.tif. This orthophoto has a spatial resolution of 0.05 m and is in the Universal Transverse Mercator (UTM) coordinate system, Zone 10. To assess the accuracy of the orthophoto, 21 circular aluminum ground control targets visible in both thermal and RGB (red, green, blue) images were placed in the field and their locations surveyed with a Real-Time Kinematic (RTK) GNSS receiver. The coordinates of these control points are provided in the file SacGCPs20240916.csv. Please see the metadata for additional information on the camera, the orthophoto production process, and the RTK GNSS survey.

    The thermal images were used as input to Particle Image Velocimetry (PIV) algorithms to infer surface flow velocities throughout the reach. To assess the accuracy of the resulting image-derived velocity estimates, field measurements of flow velocity were obtained using a SonTek M9 acoustic Doppler current profiler (ADCP). These data were acquired along a series of 11 cross sections oriented perpendicular to the primary downstream flow direction and spaced approximately 150 m apart. At each cross section, the boat from which the ADCP was deployed made four passes across the channel, and the resulting data were then aggregated into mean cross sections using the Velocity Mapping Toolbox (VMT) referenced below (Parsons et al., 2013). The VMT output was further processed as described in the metadata and ultimately led to a single comma-delimited text file, SacAdcp20240918.csv, with cross section numbers, spatial coordinates (UTM Zone 10N), cross-stream distances, velocity vector components, and water depths.

    To assess the sensitivity of thermal image velocimetry to environmental conditions, air and water temperatures were recorded using a pair of Onset HOBO U20 pressure transducer data loggers set to record pressure and temperature. Deploying one data logger in the air and one in the water also provided information on variations in water level during the test flights. The resulting temperature and water level time series are provided in the file HoboDataSummary.csv with a one-minute sampling interval.

    These data sets were used to develop and test a new framework for mapping flow velocities in river channels in approximately real time using images from a UAS as they are acquired. Prototype code for implementing this approach was developed in MATLAB and is also included in the data release as a zip folder called VelocityMappingCode.zip. Further information on the individual functions (*.m files) included within this folder is available in the metadata file associated with this data release. The code is provided as is and is intended for research purposes only. Users are advised to thoroughly read the metadata file associated with this data release to understand the appropriate use and limitations of the data and code provided herein.
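    Since the .bag start times are encoded in the file names as POSIX time stamps, they can be decoded with a few lines of Python. The stamp value below is a hypothetical example chosen to fall within the collection window, not one taken from the actual file names.

```python
from datetime import datetime, timezone

def bag_start_time(posix_stamp: float) -> datetime:
    """Decode a POSIX time stamp embedded in a .bag file name to UTC."""
    return datetime.fromtimestamp(posix_stamp, tz=timezone.utc)

# Hypothetical stamp within the September 16-19, 2024 collection window.
print(bag_start_time(1726600000.0).isoformat())  # 2024-09-17T19:06:40+00:00
```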

  6. Maps of water depth derived from satellite images of selected reaches of the American, Colorado, and Potomac Rivers acquired in 2020 and 2021 (ver. 2.0, September 2024)

    • res1catalogd-o-tdatad-o-tgov.vcapture.xyz
    • data.usgs.gov
    • +1more
    Updated Sep 12, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Maps of water depth derived from satellite images of selected reaches of the American, Colorado, and Potomac Rivers acquired in 2020 and 2021 (ver. 2.0, September 2024) [Dataset]. https://res1catalogd-o-tdatad-o-tgov.vcapture.xyz/dataset/maps-of-water-depth-derived-from-satellite-images-of-selected-reaches-of-the-american-colo
    Dataset updated
    Sep 12, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Colorado, United States
    Description

    Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques.

    The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage, and the neural network-based approach takes advantage of this high-density time series of information by estimating depth via one of four NNDR methods described in the manuscript:

    1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR.
    2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map.
    3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map.
    4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map.

    MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m, and the figure included on this landing page provides a flow chart illustrating the four different neural network-based depth retrieval methods. As examples of the resulting models, MATLAB *.mat data files containing the best-performing neural network model for each site are provided below, along with a file that lists the PlanetScope image identifiers for the images that were used for each site.

    To develop and test this new NNDR approach, the method was applied to satellite images from three rivers across the U.S.: the American, Colorado, and Potomac. For each site, field measurements of water depth available through other data releases were used for training and validation. The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: X_mean-spec.tif, X_mean-depth.tif, X_NN-depth.tif, and X-single-image.tif, where X denotes the site name. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.
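    The difference between the Mean-spec and Mean-depth strategies can be sketched in a few lines of Python. This is only a conceptual illustration: the depth-retrieval network is stubbed out with a placeholder function and the image array is synthetic (the actual trained models live in NN_depth_ensembling.m and are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a time series of 10 co-registered SuperDove scenes,
# each 50 x 50 pixels with 4 spectral bands (random values, not real data).
images = rng.random((10, 50, 50, 4))

def nndr(image):
    """Placeholder for a trained depth-retrieval network: bands -> depth (m)."""
    return image.sum(axis=-1)  # linear stand-in, not the real NNDR

# Mean-spec: average the images over time, then retrieve depth once.
depth_mean_spec = nndr(images.mean(axis=0))

# Mean-depth: retrieve depth from every image, then average the depth maps.
depth_mean_depth = np.mean([nndr(img) for img in images], axis=0)

# With a linear stand-in the two coincide; with a real (nonlinear) network
# the order of averaging and retrieval matters, which is exactly what the
# ensembling comparison in the manuscript explores.
print(np.allclose(depth_mean_spec, depth_mean_depth))  # True
```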

  7. Maps of water depth derived from satellite images of the Colorado River acquired in March and April of 2021

    • catalog.data.gov
    Updated Sep 12, 2024
    Cite
    U.S. Geological Survey (2024). Maps of water depth derived from satellite images of the Colorado River acquired in March and April of 2021 [Dataset]. https://catalog.data.gov/dataset/maps-of-water-depth-derived-from-satellite-images-of-the-colorado-river-acquired-in-march-
    Dataset updated
    Sep 12, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Colorado River
    Description

    Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques.

    The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage, and the neural network-based approach takes advantage of this high-density time series of information by estimating depth via one of four NNDR methods described in the manuscript:

    1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR.
    2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map.
    3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map.
    4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map.

    MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m, available on the main landing page for the data release of which this is a child item, along with a flow chart illustrating the four different neural network-based depth retrieval methods.

    To develop and test this new NNDR approach, the method was applied to satellite images from the Colorado River near Lees Ferry, AZ, acquired in March and April of 2021. Field measurements of water depth available through another data release (Legleiter, C.J., Debenedetto, G.P., and Forbes, B.T., 2022, Field measurements of water depth from the Colorado River near Lees Ferry, AZ, March 16-18, 2021: U.S. Geological Survey data release, https://doi.org/10.5066/P9HZL7BZ) were used for training and validation. The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: Colorado_mean-spec.tif, Colorado_mean-depth.tif, Colorado_NN-depth.tif, and Colorado-single-image.tif.

    In addition, to assess the robustness of the Mean-spec and NN-depth methods to the introduction of a large pulse of sediment by a flood event that occurred partway through the image time series, depth maps from before and after the flood are provided in the files Colorado_Mean-spec_after_flood.tif, Colorado_Mean-spec_before_flood.tif, Colorado_NN-depth_after_flood.tif, and Colorado_NN-depth_before_flood.tif. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.

  8. SIMToolbox: A MATLAB toolbox for structured illumination fluorescence...

    • zenodo.org
    tiff
    Updated Jul 19, 2024
    Cite
    Pavel Křížek; Tomáš Lukeš; Martin Ovesný; Karel Fliegel; Guy M. Hagen (2024). SIMToolbox: A MATLAB toolbox for structured illumination fluorescence microscopy [Dataset]. http://doi.org/10.1093/bioinformatics/btv576
    Available download formats: tiff
    Dataset updated
    Jul 19, 2024
    Dataset provided by
    Zenodohttp://zenodo.org/
    Authors
    Pavel Křížek; Tomáš Lukeš; Martin Ovesný; Karel Fliegel; Guy M. Hagen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    SIMToolbox is an open-source, modular set of functions for MATLAB equipped with a user-friendly graphical interface and designed for processing two-dimensional and three-dimensional data acquired by structured illumination microscopy (SIM). Both optical sectioning and super-resolution applications are supported. The software is also capable of maximum a posteriori probability image estimation (MAP-SIM), an alternative method for reconstruction of structured illumination images. MAP-SIM can potentially reduce reconstruction artifacts, which commonly occur due to refractive index mismatch within the sample and to imperfections in the illumination.

  9. Datasets and Supporting Materials for the IPIN 2021 Competition Track 3 (Smartphone-based, off-site)

    • zenodo.org
    • recerca.uoc.edu
    • +1more
    zip
    Updated Jun 14, 2022
    Cite
    Joaquin Torres-Sospedra; Fernando Aranda Polo; Felipe Parralejo; Vladimir Bellavista Parent; Fernando Alvarez; Antoni Pérez-Navarro; Antonio R. Jimenez; Fernando Seco (2022). Datasets and Supporting Materials for the IPIN 2021 Competition Track 3 (Smartphone-based, off-site) [Dataset]. http://doi.org/10.5281/zenodo.5948678
    Available download formats: zip
    Dataset updated
    Jun 14, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Joaquin Torres-Sospedra; Fernando Aranda Polo; Felipe Parralejo; Vladimir Bellavista Parent; Fernando Alvarez; Antoni Pérez-Navarro; Antonio R. Jimenez; Fernando Seco
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This package contains the datasets and supplementary materials used in the IPIN 2021 Competition.

    Contents:

    • IPIN2021_Track03_TechnicalAnnex_V1-02.pdf: Technical annex describing the competition
    • 01-Logfiles: This folder contains a subfolder with the 105 training logfiles (80 of them single floor indoors, 10 in outdoor areas, 10 in the indoor auditorium with floor transitions, and 5 in floor-transition zones), a subfolder with the 20 validation logfiles, and a subfolder with the 3 blind evaluation logfiles as provided to competitors.
    • 02-Supplementary_Materials: This folder contains the MATLAB/Octave parser, the raster maps, and the files for the MATLAB tools and the trajectory visualization.
    • 03-Evaluation: This folder contains the scripts used to calculate the competition metric, the 75th percentile of the positioning error over the 82 evaluation points; it requires the MATLAB Mapping Toolbox. The ground truth is also provided as 3 CSV files. Since results must be provided at a 2 Hz frequency starting from apptimestamp 0, the GT files include the closest timestamp matching the timing provided by competitors for the 3 evaluation logfiles. The folder also contains samples of reported estimations and the corresponding results.
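    The 75th-percentile error metric can be sketched in a few lines of Python with made-up planar coordinates. The actual evaluation scripts use the MATLAB Mapping Toolbox and geodetic positions, so this is only an illustration of the statistic, not the competition code.

```python
import numpy as np

def percentile75_error(estimates, ground_truth):
    """75th percentile of the Euclidean positioning error, in metres."""
    errors = np.linalg.norm(estimates - ground_truth, axis=1)
    return float(np.percentile(errors, 75))

gt = np.zeros((82, 2))               # 82 evaluation points at the origin
est = np.tile([3.0, 4.0], (82, 1))   # every estimate displaced by 5 m
print(percentile75_error(est, gt))   # 5.0
```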

    Please cite the following works when using the datasets included in this package:

  10. Maps of water depth derived from satellite images of the American River acquired in October 2020

    • s.cnmilf.com
    • catalog.data.gov
    Updated Sep 12, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Maps of water depth derived from satellite images of the American River acquired in October 2020 [Dataset]. https://s.cnmilf.com/user74170196/https/catalog.data.gov/dataset/maps-of-water-depth-derived-from-satellite-images-of-the-american-river-acquired-in-octobe
    Dataset updated
    Sep 12, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    American River, United States
    Description

    Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques.

    The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage, and the neural network-based approach takes advantage of this high-density time series of information by estimating depth via one of four NNDR methods described in the manuscript:

    1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR.
    2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map.
    3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map.
    4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map.

    MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m, available on the main landing page for the data release of which this is a child item, along with a flow chart illustrating the four different neural network-based depth retrieval methods.

    To develop and test this new NNDR approach, the method was applied to satellite images from the American River near Fair Oaks, CA, acquired in October 2020. Field measurements of water depth available through another data release (Legleiter, C.J., and Harrison, L.R., 2022, Field measurements of water depth from the American River near Fair Oaks, CA, October 19-21, 2020: U.S. Geological Survey data release, https://doi.org/10.5066/P92PNWE5) were used for training and validation. The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: American_mean-spec.tif, American_mean-depth.tif, American_NN-depth.tif, and American-single-image.tif. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.

  11. Datasets and Supporting Materials for the IPIN 2020 Competition Track 3 (Smartphone-based, off-site)

    • zenodo.org
    • data.europa.eu
    zip
    Updated Jun 15, 2021
    Cite
    Joaquín Torres-Sospedra; Darwin Quezada Gaibor; Antonio R. Jiménez; Antoni Pérez-Navarro; Fernando Seco (2021). Datasets and Supporting Materials for the IPIN 2020 Competition Track 3 (Smartphone-based, off-site) [Dataset]. http://doi.org/10.5281/zenodo.4314992
    Available download formats: zip
    Dataset updated
    Jun 15, 2021
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Joaquín Torres-Sospedra; Darwin Quezada Gaibor; Antonio R. Jiménez; Antoni Pérez-Navarro; Fernando Seco
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This package contains the datasets and supplementary materials used in the IPIN 2020 Competition.

    Contents:

    • IPIN2020_Track03_TechnicalAnnex_V1-01.pdf: Technical annex describing the competition
    • 01-Logfiles: This folder contains a subfolder with the 78 training logfiles (72 of them single floor, 4 in bookshelves areas, and 2 in floor-transition zones), a subfolder with the 13 validation logfiles, and a subfolder with the 1 blind evaluation logfile as provided to competitors.
    • 02-Supplementary_Materials: This folder contains the matlab/octave parser, the raster maps, the files for the matlab tools and the trajectory visualization.
    • 03-Evaluation: This folder contains the scripts used to calculate the competition metric, the 75th percentile of the positioning error over the 82 evaluation points. It requires the Matlab Mapping Toolbox. The ground truth is also provided as a CSV file. Since the results must be provided at a 2 Hz frequency starting from apptimestamp 0, the GT includes the closest timestamp matching the timing provided by competitors. It also contains a sample of reported estimations and the corresponding results. Additionally, we provide a second script that produces a more detailed report on the results file (requires the export_fig folder to run).
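The headline metric, the 75th percentile of the positioning error at the evaluation points, is straightforward to reproduce. The official scripts are in MATLAB and also handle details such as timestamp matching and floor level; this hypothetical Python sketch covers only the horizontal error:

```python
import numpy as np

def third_quartile_error(estimated_xy, ground_truth_xy):
    """IPIN-style score: 75th percentile of the Euclidean positioning
    errors at the evaluation points (lower is better)."""
    est = np.asarray(estimated_xy, dtype=float)
    gt = np.asarray(ground_truth_xy, dtype=float)
    errors = np.linalg.norm(est - gt, axis=1)
    return float(np.percentile(errors, 75))
```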

    Please, cite the following works when using the datasets included in this package:

    • Torres-Sospedra, J.; Quezada-Gaibor, D.; Jimenez, A.R.; Perez-Navarro, A.; Seco, F.; Datasets and Supporting Materials for the IPIN 2020 Competition Track 3 (Smartphone-based, off-site). http://dx.doi.org/10.5281/zenodo.4314992
    • Potortì, F.; Torres-Sospedra, J.; Quezada-Gaibor, D.; Jiménez, A.R.; Seco, F.; Pérez-Navarro, A.; Ortiz, M.; Zhu, N.; Renaudin, V.; Ichikari, R.; Shimomura, R.; Ohta, N.; Nagae, S.; Kurata, T.; Wei, D.; Ji, X.; Zhang, W.; Kram, S.; Stahlke, M.; Mutschler, C.; Crivello, A.; Barsocchi, P.; Girolami, M.; Palumbo, F.; Chen, R.; Wu, Y.; Li, W.; Yu, Y.; Xu, S.; Huang, L.; Liu, T.; Kuang, J.; Niu, X.; Yoshida, T.; Nagata, Y.; Fukushima, Y.; Fukatani, N.; Hayashida, N.; Asai, Y.; Urano, K.; Ge, W.; Lee, N.T.; Fang, S.H.; Jie, Y.C.; Young, S.R.; Chien, Y.R.; Yu, C.C.; Ma, C.; Wu, B.; Zhang, W.; Wang, Y.; Fan, Y.; Poslad, S.; Selviah, D.R.; Wang, W.; Yuan, H.; Yonamoto, Y.; Yamaguchi, M.; Kaichi, T.; Zhou, B.; Liu, X.; Gu, Z.; Yang, C.; Wu, Z.; Xie, D.; Huang, C.; Zheng, L.; Peng, A.; Jin, G.; Wang, Q.; Luo, H.; Xiong, H.; Bao, L.; Zhang, P.; Zhao, F.; Yu, C.A.; Hung, C.H.; Antsfeld, L.; Chidlovskii, B.; Jiang, H.; Xia, M.; Yan, D.; Li, Y.; Dong, Y.; Silva, I.; Pendão, C.; Meneses, F.; Nicolau, M.J.; Costa, A.; Moreira, A.; Cock, C.D.; Plets, D.; Opiela, M.; Džama, J.; Zhang, L.; Li, H.; Chen, B.; Liu, Y.; Yean, S.; Lim, B.Z.; Teo, W.J.; Lee, B.S.; Oh, H.L. Off-line Evaluation of Indoor Positioning Systems in Different Scenarios: The Experiences from IPIN 2020 Competition IEEE Sensors Journal, Early Access (in press), 2021. https://doi.org/10.1109/JSEN.2021.3083149
  12. Digital Forestry Toolbox - Sample data

    • zenodo.org
    bin
    Updated Jan 24, 2020
    Matthew Parkan (2020). Digital Forestry Toolbox - Sample data [Dataset]. http://doi.org/10.5281/zenodo.1998192
    Explore at:
    bin (available download formats)
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Matthew Parkan
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This repository contains airborne laser scanning data samples acquired by the states of Geneva, Solothurn and Zurich (in Switzerland). They are used in the tutorials of the Digital Forestry Toolbox for Matlab/Octave.

    The full datasets (covering the complete state extents) are available from here:

    Sources and usage conditions:

  13. Datasets and Supporting Materials for the IPIN 2024 Competition Track 3...

    • zenodo.org
    • recerca.uoc.edu
    • +1more
    zip
    Updated Feb 10, 2025
    Joaquin Torres Sospedra; Antonino Crivello; Maximilian Stahlke; Francesco Potortì; Miguel Ortiz; Ziyou Li; Antoni Perez-Navarro; Antonio Ramon Jimenez Ruiz (2025). Datasets and Supporting Materials for the IPIN 2024 Competition Track 3 (Smartphone-based, off-site) [Dataset]. http://doi.org/10.5281/zenodo.13931119
    Explore at:
    zip (available download formats)
    Dataset updated
    Feb 10, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Joaquin Torres Sospedra; Antonino Crivello; Maximilian Stahlke; Francesco Potortì; Miguel Ortiz; Ziyou Li; Antoni Perez-Navarro; Antonio Ramon Jimenez Ruiz
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This package contains the datasets and supplementary materials used in the IPIN 2024 Competition.

    Contents

    • Track-3_TA-2024.pdf: Technical annex describing the competition (Version 1)
    • 01 Logfiles: This folder contains a subfolder with the 54 training trials, a subfolder with the 4 testing trials (validation), and a subfolder with the 2 blind scoring trials (test) as provided to competitors.
    • 02 Supplementary_Materials: This folder contains the Matlab/octave parser, the raster maps, the files for the Matlab tools and the trajectory visualization.
    • 03 Evaluation: This folder contains the scripts we used to calculate the competition metric, the 75th percentile on the 69 evaluation points. It requires the Matlab Mapping Toolbox. We also provide the ground truth as 2 CSV files. It contains samples of reported estimations and the corresponding results.

    We provide additional information on the competition at: https://competition.ipin-conference.org/2024/call-for-competition

    Citation Policy

    Please cite the following works when using the datasets included in this package:

    Torres-Sospedra, J.; et al. Datasets and Supporting Materials for the IPIN 2024
    Competition Track 3 (Smartphone-based, off-site), Zenodo 2024
    http://dx.doi.org/10.5281/zenodo.13931119

    Check the updated citation policy at: http://dx.doi.org/10.5281/zenodo.13931119

    Contact

    For any further questions about the database and this competition track, please contact:

    Joaquín Torres-Sospedra
    Departament d'Informatica, Universitat de València, 46100 Burjassot, Spain
    ValgrAI - Valencian Graduate School and Research Network of Artificial Intelligence, Camí de Vera s/n, 46022 Valencia, Spain
    Joaquin.Torres@uv.es - info@jtorr.es

    Antonio R. Jiménez
    Centre of Automation and Robotics (CAR)-CSIC/UPM, Spain
    antonio.jimenez@csic.es

    Antoni Pérez-Navarro
    Faculty of Computer Sciences, Multimedia and Telecommunication, Universitat Oberta de Catalunya, Barcelona, Spain
    aperezn@uoc.edu

    Acknowledgements

    We thank Maximilian Stahlke and Christopher Mutschler at Fraunhofer IIS, as well as Miguel Ortiz and Ziyou Li at Université Gustave Eiffel, for their invaluable support in collecting the datasets. And last but certainly not least, Antonino Crivello and Francesco Potortì for their huge effort in georeferencing the competition venue and evaluation points.

    We extend our appreciation to the staff at the Museum for Industrial Culture (Museum Industriekultur) for their unwavering patience and invaluable support throughout our collection days.

    We are also grateful to Francesco Potortì, the ISTI-CNR team (Paolo, Michele & Filippo), and the Fraunhofer IIS team (Chris, Tobi, Max, ...) for their invaluable commitment to organizing and promoting the IPIN competition.

    This work and competition are part of the IPIN 2023 Conference in Nuremberg (Germany) and the IPIN 2024 Conference in Hong Kong.

    Parts of this work received financial support from the following projects and grants:

    • POSITIONATE (CIDEXG/2023/17, Conselleria d’Educació, Universitats i Ocupació, Generalitat Valenciana)
    • ORIENTATE (H2020-MSCA-IF-2020, Grant Agreement 101023072)
    • GeoLibero (from CYTED)
    • INDRI (MICINN, ref. PID2021-122642OB-C42, PID2021-122642OB-C43, PID2021-122642OB-C44, MCIU/AEI/FEDER UE)
    • MICROCEBUS (MICINN, ref. RTI2018-095168-B-C55, MCIU/AEI/FEDER UE)
    • TARSIUS (TIN2015-71564-C4-2-R, MINECO/FEDER)
    • SmartLoc(CSIC-PIE Ref.201450E011)
    • LORIS (TIN2012-38080-C04-04)
  14. Datasets and Supporting Materials for the IPIN 2022 Competition Track 3...

    • recerca.uoc.edu
    Updated 2023
    Torres-Sospedra, Joaquín; Silva, Ivo; Pendao, Cristiano; Moreira, Adriano; Meneses, Filipe; Costa, Antonio; Nicolau, Maria João; Gonzalez-Perez, Alberto; Jiménez, Antonio Ramón; Pérez-Navarro, Antoni (2023). Datasets and Supporting Materials for the IPIN 2022 Competition Track 3 (Smartphone-based, off-site) [Dataset]. https://recerca.uoc.edu/documentos/668fc41fb9e7c03b01bd475d
    Explore at:
    Dataset updated
    2023
    Authors
    Torres-Sospedra, Joaquín; Silva, Ivo; Pendao, Cristiano; Moreira, Adriano; Meneses, Filipe; Costa, Antonio; Nicolau, Maria João; Gonzalez-Perez, Alberto; Jiménez, Antonio Ramón; Pérez-Navarro, Antoni
    Description

    This package contains the datasets and supplementary materials used in the IPIN 2022 Competition.

    Contents:

    • Track-3_TA-2022.pdf: Technical annex describing the competition (Version 2)
    • 01 Logfiles: This folder contains a subfolder with the 89 training trials, a subfolder with the 24 testing trials (validation), and a subfolder with the 3 blind scoring trials (test) as provided to competitors.
    • 02 Supplementary_Materials: This folder contains the matlab/octave parser, the raster maps, the files for the matlab tools and the trajectory visualization.
    • 03 Evaluation: This folder contains the scripts used to calculate the competition metric, the 75th percentile on the 31|61|61 evaluation points. It requires the Matlab Mapping Toolbox. The ground truth is also provided as 3 CSV files. Since the results must be provided at a 2 Hz frequency starting from apptimestamp 0, the GT files include the closest timestamp matching the timing provided by competitors for the 3 evaluation logfiles. It contains samples of reported estimations and the corresponding results.

    Please cite the following works when using the datasets included in this package:

    • Torres-Sospedra, J.; et al. Datasets and Supporting Materials for the IPIN 2022 Competition Track 3 (Smartphone-based, off-site), Zenodo 2022. http://dx.doi.org/10.5281/zenodo.7612915

  15. Data from: Datasets and Supporting Materials for the IPIN 2023 Competition...

    • zenodo.org
    • producciocientifica.uv.es
    • +1more
    zip
    Updated Jul 9, 2024
    Joaquín Torres-Sospedra; Antonino Crivello; Maximilian Stahlke; Francesco Potortì; Miguel Ortiz; Ziyou Li; Antoni Pérez-Navarro; Antonio R. Jiménez (2024). Datasets and Supporting Materials for the IPIN 2023 Competition Track 3 (Smartphone-based, off-site) [Dataset]. http://doi.org/10.5281/zenodo.8362205
    Explore at:
    zip (available download formats)
    Dataset updated
    Jul 9, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Joaquín Torres-Sospedra; Antonino Crivello; Maximilian Stahlke; Francesco Potortì; Miguel Ortiz; Ziyou Li; Antoni Pérez-Navarro; Antonio R. Jiménez
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Sep 28, 2023
    Description

    This package contains the datasets and supplementary materials used in the IPIN 2023 Competition.

    Contents

    • Track-3_TA-2023.pdf: Technical annexe describing the competition (Version 2)
    • 01 Logfiles: This folder contains a subfolder with the 54 training trials, a subfolder with the 4 testing trials (validation), and a subfolder with the 2 blind scoring trials (test) as provided to competitors.
    • 02 Supplementary_Materials: This folder contains the Matlab/octave parser, the raster maps, the files for the Matlab tools and the trajectory visualization.
    • 03 Evaluation: This folder contains the scripts we used to calculate the competition metric, the 75th percentile on the 69 evaluation points. It requires the Matlab Mapping Toolbox. We also provide the ground truth as 2 CSV files. It contains samples of reported estimations and the corresponding results.

    We provide additional information on the competition at: https://evaal.aaloa.org/2023/call-for-competition

    Citation Policy

    Please cite the following works when using the datasets included in this package:

    Torres-Sospedra, J.; et al. Datasets and Supporting Materials for the IPIN 2023
    Competition Track 3 (Smartphone-based, off-site), Zenodo 2023
    http://dx.doi.org/10.5281/zenodo.8362205

    Check the updated citation policy at: http://dx.doi.org/10.5281/zenodo.8362205

    Contact

    For any further questions about the database and this competition track, please contact:

    Joaquín Torres-Sospedra
    Centro ALGORITMI,
    Universidade do Minho, Portugal
    info@jtorr.es - jtorres@algoritmi.uminho.pt

    Antonio R. Jiménez
    Centre of Automation and Robotics (CAR)-CSIC/UPM, Spain
    antonio.jimenez@csic.es

    Antoni Pérez-Navarro
    Faculty of Computer Sciences, Multimedia and Telecommunication, Universitat Oberta de Catalunya, Barcelona, Spain
    aperezn@uoc.edu

    Acknowledgements

    We thank Maximilian Stahlke and Christopher Mutschler at Fraunhofer IIS, as well as Miguel Ortiz and Ziyou Li at Université Gustave Eiffel, for their invaluable support in collecting the datasets. And last but certainly not least, Antonino Crivello and Francesco Potortì for their huge effort in georeferencing the competition venue and evaluation points.

    We extend our appreciation to the staff at the Museum for Industrial Culture (Museum Industriekultur) for their unwavering patience and invaluable support throughout our collection days.

    We are also grateful to Francesco Potortì, the ISTI-CNR team (Paolo, Michele & Filippo), and the Fraunhofer IIS team (Chris, Tobi, Max, ...) for their invaluable commitment to organizing and promoting the IPIN competition.

    This work and competition are part of the IPIN 2023 Conference in Nuremberg (Germany).

    Parts of this work received financial support from the following projects and grants:

    • ORIENTATE (H2020-MSCA-IF-2020, Grant Agreement 101023072)
    • GeoLibero (from CYTED)
    • INDRI (MICINN, ref. PID2021-122642OB-C42, PID2021-122642OB-C43, PID2021-122642OB-C44, MCIU/AEI/FEDER UE)
    • MICROCEBUS (MICINN, ref. RTI2018-095168-B-C55, MCIU/AEI/FEDER UE)
    • TARSIUS (TIN2015-71564-C4-2-R, MINECO/FEDER)
    • SmartLoc(CSIC-PIE Ref.201450E011)
    • LORIS (TIN2012-38080-C04-04)

  16. Data from: Phorex & qy photoreactor - monte carlo ray tracing in matlab® &...

    • service.tib.eu
    Updated Aug 4, 2023
    + more versions
    (2023). Phorex & qy photoreactor - monte carlo ray tracing in matlab® & quantum yield measurements [Dataset]. https://service.tib.eu/ldmservice/dataset/rdr-doi-10-35097-1439
    Explore at:
    Dataset updated
    Aug 4, 2023
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    Abstract: Solar driven photocatalysis represents one way to address challenges arising from the need to substitute fossil energy carriers in the world's economy. The developments in the field of photocatalysis heavily depend on new process designs, whose development requires methods for the determination of quantum yields and photoreaction kinetics as well as the ability to map radiation transport in complex photoreactors. However, photoreaction engineering is an underdeveloped field lacking reliable and open-access tools for radiation transport simulations and standardized photoreactors for quantum yield and photoreaction kinetic measurements. This data set provides both a set of CAD files for the facile fabrication, via additive manufacturing, of an isophotonic photoreactor for the determination of quantum yields in gas, liquid, and multi-phase photoreactions, and a comprehensive MATLAB®-based toolbox for radiation transport simulations in photoreactors. The proposed photoreactor is designed so that its reaction volume is isophotonic, which means that the reaction volume shows low gradients in the local volumetric rate of photon absorption. The toolbox allows the determination of radiation transport efficiencies within the isophotonic photoreactor and therewith provides the basis for meaningful data evaluation of experiments conducted with it. Beyond this concrete use case, the toolbox can also be employed for radiation transport simulations in many other use cases.
    Those range from the evaluation of experiments aiming at the determination of optical properties, over light source design, to the optimization of photoreactors. The data set can therewith not only support the further development of high quantum yield materials by material scientists in the field of photocatalysis, but can also be used by chemical engineers working on new, high-efficiency photoreactors or sophisticated light sources designed for specific photocatalysts and/or use cases.

    Technical remarks: The data set comprises (a) all CAD files needed to print an isophotonic photoreactor for the precise determination of quantum yields in gas, liquid, and multi-phase photoreactions, and (b) a MATLAB® toolbox named phoRex that allows the determination of spectral radiation transport efficiencies (i.e., transport efficiencies from the light source of the isophotonic photoreactor into the reaction volume) as well as other radiation transport related performance metrics via a Monte Carlo ray tracing approach. For details on the reactor design and the simulation environment, please refer to the corresponding publication (DOI: 10.1016/j.cej.2022.139204). The toolbox requires a working MATLAB® installation (2018 or later) including the MATLAB Parallel Computing Toolbox. Installation of the toolbox follows the standard MATLAB® procedure for installing new toolboxes. After installation, phoRex provides an environment for Monte Carlo ray tracing simulations mapping radiation transport in 3D in channel-like geometries, for instance photoreactors. For the simulation of the isophotonic photoreactor, a comprehensive live-script example is given in the file example.mlx included in the toolbox. The example guides the user through the provided code in the context of quantum yield determination using the proposed isophotonic photoreactor.

    Further, the included pre-processing script preProcessQY.m outlines how simulations are set up in the provided Monte Carlo ray tracing environment. This code example can be used to understand how to set up simulation cases for uses other than the simulation of the isophotonic photoreactor proposed for the accurate determination of quantum yields. For detailed information on the code structure of the Monte Carlo ray tracing approach itself, the author refers to the extensively commented source code given with the class definitions of the toolbox.
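phoRex itself is MATLAB code, but the core Monte Carlo idea, sampling photon free paths against the Beer-Lambert law to estimate how much light is absorbed inside a reaction volume, can be sketched in a few lines. This is a hypothetical 1D illustration, not part of the toolbox:

```python
import math
import random

def absorbed_fraction(n_photons, mu_a, path_length, seed=0):
    """Monte Carlo estimate of the fraction of photons absorbed within an
    absorbing medium of the given optical path length. Free paths are
    sampled from the exponential distribution implied by the Beer-Lambert
    law: s = -ln(U) / mu_a."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_photons):
        u = 1.0 - rng.random()  # in (0, 1], avoids log(0)
        if -math.log(u) / mu_a <= path_length:
            absorbed += 1
    return absorbed / n_photons
```

The estimate converges to 1 - exp(-mu_a * L); a full ray tracer such as phoRex adds 3D geometry, reflection and refraction at interfaces, and scattering events.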

  17. Replication Data for: Magnetic Dipole Imaging of Brain Tissue

    • zenodo.org
    txt
    Updated Mar 7, 2025
    Leon Kaub; Stuart A. Gilder; Roger R. Fu; Barbara A. Maher; Gabriel Maxemin; Aaron T. Kuan; Andreas Büttner; Stefan Milz; Christoph Schmitz (2025). Replication Data for: Magnetic Dipole Imaging of Brain Tissue [Dataset]. http://doi.org/10.5281/zenodo.14958851
    Explore at:
    txt (available download formats)
    Dataset updated
    Mar 7, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Leon Kaub; Stuart A. Gilder; Roger R. Fu; Barbara A. Maher; Gabriel Maxemin; Aaron T. Kuan; Andreas Büttner; Stefan Milz; Christoph Schmitz
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains magnetic field maps of brain tissue collected with a Quantum Diamond Microscope (QDM). Relevant details and the results based on this data are presented and discussed in a journal paper titled "Magnetic Dipole Imaging of Brain Tissue".

    There are three sets of QDM data in this repository, one for each set of samples: human brain samples, rat brain samples, and a magnetotactic bacteria sample.

    Each set contains folders for each sample and field of view (FOV). Rat brain data is further subdivided into polishes (see paper for details).

    For human and rat brain data, each folder contains two subfolders for the repeated map pairs (image 1 and 2).

    In each final subfolder, the common files are:

    • B111dataToPlot.mat: Nearly raw QDM data containing remanence and induced magnetic field maps in the 111 diamond crystallographic direction. Resolution is 1.175 µm (no binning).
    • Bz_uc0.mat: Bz data (vertical magnetic field component) computed from B111 data.
    • Bz_uc0_sat.png: PNG image of Bz map with saturated color range (typically 2e-7 T). Provided for convenience. Can be replotted from Bz_uc0.mat.
    • laser.jpg: optical light image of the same FOV as the QDM magnetic field map, illuminated by the QDM laser.
    • ledImg.png: optical light image of the same FOV as the QDM magnetic field map, illuminated by the LED.

    QDM data was processed using QDMlab: https://github.com/HarvardPaleomag/QDMlab [Volk, M. W., Fu, R. R., Trubko, R., Kehayias, P., Glenn, D. R., & Lima, E. A. (2022). QDMlab: A MATLAB toolbox for analyzing quantum diamond microscope (QDM) magnetic field maps. Computers & Geosciences, 167, 105198.].
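Working with the Bz_uc0.mat maps typically means loading the field map and clipping it to a symmetric color range, as was done for the provided quick-look PNGs. A small sketch (the variable name inside the .mat file is an assumption; inspect it with scipy.io.whosmat first):

```python
import numpy as np
# from scipy.io import loadmat  # for reading Bz_uc0.mat

def saturate_bz(bz, sat=2e-7):
    """Clip a Bz map (tesla) to +/- sat, mirroring the saturated color
    range (typically 2e-7 T) used for the Bz_uc0_sat.png images."""
    return np.clip(np.asarray(bz, dtype=float), -sat, sat)

# Hypothetical usage (variable name "Bz" is an assumption):
# bz = loadmat("Bz_uc0.mat")["Bz"]
# plt.imshow(saturate_bz(bz), cmap="RdBu_r")
```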

  18. AMS data and EBSD maps of small scale marble shear zone - Estremoz, Portugal...

    • b2find.eudat.eu
    Updated Mar 19, 2019
    + more versions
    (2019). AMS data and EBSD maps of small scale marble shear zone - Estremoz, Portugal - Dataset - B2FIND [Dataset]. https://b2find.eudat.eu/dataset/25f9641d-b319-547d-a12f-f885498ca16a
    Explore at:
    Dataset updated
    Mar 19, 2019
    Area covered
    Estremoz, Estremoz, Portugal
    Description

    We studied two structural profiles across a thin subvertical shear zone in marble from a quarry in Estremoz (Portugal) to clarify the relationship between AMS and strain in natural rocks. The mesoscopic fabric changes from the subhorizontal coarse-grained foliation towards the ∼2 cm-wide shear zone center with subvertical fine-grained foliation. In microstructure, the shear zone records dynamic recrystallization of the calcite aggregate, which resulted in the development of a porphyroclastic microstructure with an increasing proportion of fine-grained recrystallized matrix towards the shear zone center. Two distinct crystallographic preferred orientations of calcite were recorded: one related to the porphyroclasts, characterized by subvertical orientation of calcite axes, and another associated with the recrystallized matrix, showing subhorizontal calcite axes orientation. The majority of the rock mass is diamagnetic, corresponding well with the thermomagnetic curves, with local paramagnetic accumulations in the form of thin bands.

    Update 2019-07-10: columns 'Specimen orientation', 'F-statistics' and 'Fitting error' were added; the AMS ascii data zip folder was added to the dataset.

    Project: Czech Science Foundation project no. 16-25486Y: How does AMS reflect the microstructure? Natural and experimental shear zones. Czech Academy of Sciences institutional support to the Institute of Geophysics of the CAS, v.v.i. (RVO 67985530).

    The low field anisotropy of magnetic susceptibility (AMS) was measured with a MFK1-FA Kappabridge (Jelínek and Pokorný, 1997) in a field of 423 A/m and at a frequency of 976 Hz (Institute of Geophysics of the Czech Academy of Sciences).

    File columns: sample label, rock type, x and y coordinates, magnetic field in A/m (Field), operating frequency (Freq.), measured bulk magnetic susceptibility (Km), anisotropy factors L, F, P, Pj, T, U (for explanation please see Jelínek, 1981), orientation of principal magnetic susceptibilities (orientation of the principal axes of the magnetic susceptibility tensor, K1>K2>K3): dec - declination (0-360°), inc - inclination (0-90°) in geographic coordinates (North = 0, horizontal = 0), and K11-K13 - description of the symmetrical AMS tensor.

    The small-scale shear zone in marble has been studied by a combination of detailed microstructural and rock magnetic analysis to explore the relationship between anisotropy of magnetic susceptibility and the strain produced by a single deformation event. The microstructural and rock magnetic data were interpreted and discussed based on numerical modeling of the magnetic fabric. The structural record in the area of Estremoz is dominated by pervasive isoclinal folding resulting in a steep SW-dipping axial plane cleavage. Based on the observed structural superposition, the studied shear zone belongs to the family of subhorizontal, mostly brittle to ductile shear zones and shear fractures crosscutting the SW-dipping axial plane cleavage. The studied marble is characterized by alternating bands of white, almost pure calcite up to 30 cm wide and thinner gray bands parallel to the coarse-grained foliation. In the shear zone, foliation is continuously curved towards the center, where most of the strain is localized in a fine-grained shear plane. The studied shear zone belongs to the class of ductile shear zones in which no macroscopic fracture is involved, i.e. deformation preserves the continuity of preexisting markers. This implies that ductile shear zones show a continuous displacement gradient across the zone. The shear zone (SZ) is a product of simple shear deformation, as markers of non-simple shear are missing (non-planar shear zone walls, non-parallel shear-strain contours, porphyroclasts showing conflicting senses of rotation, and sets of differently oriented shear bands showing opposite senses of shear). Displacement (~41.5 cm) on the SZ is approximately 3 times the width of the SZ (~13 cm). Two structural profiles were studied across the SZ: profile A is located in the pure calcite rocks, and profile B in the grey zone enriched in accessory minerals. The maximal strain in the SZ core is achieved in the B10 sample (γaverage ~3, γlocal ~14).

    Crystallographic preferred orientation maps of the marble (ShearZoneEBSDdata.zip) are provided as Channel text files (*.ctf) obtained using a NORDLYS II (HKL Technology) EBSD system mounted on a TESCAN Vega scanning electron microscope (Institute of Petrology and Structural Geology of the Faculty of Science at Charles University in Prague) in manual mode (B5-clasts, B8-clasts, B10-clasts) and in mapping mode (B8-map with 5 µm step size; B10-map1, B10-map4, B10-map5 with 3 µm step size; B10-map2, B10-map3 with 2.5 µm step size). The .ctf file is a text file containing a header with information about the mineral phases measured; the data consist of the phase orientation at each measured point. The .ctf files can be read with the commercial HKL Technology (Oxford Instruments) software or, better, using the MATLAB® Toolbox for Quantitative Texture Analysis (MTEX) (Bachmann et al., 2010; 2011; https://code.google.com/p/mtex/) licensed under the GNU General Public License v2.0.
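The anisotropy factors listed in the file columns follow standard definitions (Jelínek, 1981). As a convenience, here is a small sketch computing them from the principal susceptibilities K1 >= K2 >= K3 (function and variable names are ours, not part of the dataset):

```python
import math

def ams_factors(k1, k2, k3):
    """Jelinek (1981) anisotropy factors from the principal
    susceptibilities k1 >= k2 >= k3 (all positive, k1 > k3)."""
    n1, n2, n3 = math.log(k1), math.log(k2), math.log(k3)
    nm = (n1 + n2 + n3) / 3.0
    return {
        "L": k1 / k2,   # magnetic lineation
        "F": k2 / k3,   # magnetic foliation
        "P": k1 / k3,   # degree of anisotropy
        # corrected degree of anisotropy
        "Pj": math.exp(math.sqrt(2.0 * ((n1 - nm) ** 2
                                        + (n2 - nm) ** 2
                                        + (n3 - nm) ** 2))),
        "T": (2.0 * n2 - n1 - n3) / (n1 - n3),  # shape: -1 prolate .. +1 oblate
        "U": (2.0 * k2 - k1 - k3) / (k1 - k3),
    }
```

Note that L * F = P by construction, which is a quick sanity check on any computed table of factors.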

  19. AMS and EBSD maps of rock salt - lamprophyre dyke contact zone, Loule...

    • b2find.eudat.eu
    Updated Oct 15, 2013
    + more versions
    (2013). AMS and EBSD maps of rock salt - lamprophyre dyke contact zone, Loule diapir, Algarve basin, Portugal - Dataset - B2FIND [Dataset]. https://b2find.eudat.eu/dataset/f18e5d44-39e3-55e3-8cb9-b82ba90ff8dd
    Explore at:
    Dataset updated
    Oct 15, 2013
    Area covered
    Loulé, Faro District, Portugal
    Description

    A rock salt-lamprophyre dyke contact zone (sub-vertical, NE-SW strike) was investigated for its petrographic, mechanical and physical properties by means of anisotropy of magnetic susceptibility (AMS) and rock magnetic properties, coupled with quantitative microstructural analysis and thermal mathematical modelling. The quantitative microstructural analysis of halite texture and solid inclusions revealed good spatial correlation with AMS and halite fabrics. The fabrics of both lamprophyre and rock salt record the magmatic intrusion, "plastic" flow and regional deformation (characterized by a NW-SE trending steep foliation). AMS and microstructural analysis revealed two deformation fabrics in the rock salt: (1) the deformation fabrics in rock salt on the NW side of the dyke are associated with high temperature and high fluid activity attributed to the dyke emplacement; (2) on the opposite side of the dyke, the emplacement-related fabric is reworked by localized tectonic deformation. The paleomagnetic results suggest significant rotation of the whole dyke, probably during the diapir ascent and/or the regional Tertiary to Quaternary deformation. The low field anisotropy of magnetic susceptibility (AMS) was measured with a MFK1-FA Kappabridge (Jelínek and Pokorný, 1997) in a field of 200 A/m at a frequency of 976 Hz (Institute Dom Luiz - Univ. Lisbon).
File columns: sample name, rock type, distance from dyke centre, magnetic field in A/m (Field), operating frequency (Freq.), measured bulk magnetic susceptibility (Km), anisotropy factors L, F, P, Pj, T, U (for explanation please see Jelínek, 1981), orientation of the principal magnetic susceptibilities (principal axes of the magnetic susceptibility tensor, K1 > K2 > K3) given as dec - declination (0-360°) and inc - inclination (0-90°) in geographic coordinates (North = 0, horizontal = 0), and K11-K13 - description of the symmetrical AMS tensor.
The Loulé salt diapir in the Mesozoic-Cenozoic Algarve basin was chosen as a case study to show the relationship between mafic dyke emplacement and host-rock salt deformation. The studied lamprophyre dyke is 3 m thick, strikes N310 and dips 80° to the NW. In both margins of the dyke, the texture is aphanitic with devitrified volcanic glass and abundant euhedral as well as skeletal xenocrysts of olivine and olivine xenoliths, with serpentinized olivine xenocrysts at the SE margin. The dyke centre shows a subophitic texture of plagioclase laths, needle crystals of amphibole and large anhedral crystals of biotite, with a high amount of volcanic glass. The olivine xenocrysts are significantly less abundant in the dyke core. Only brittle structures have been observed in the dyke itself; fractures within the dyke are filled with halite. Within the host rocks close to the SE dyke margin, angular fragments of lamprophyre of various sizes can be observed. Samples were collected in rock salt and within and around a lamprophyre dyke in the Loulé Mine. A profile across the host-rock salt and dyke was sampled in detail. The rock salt was sampled on the NW and SE sides of the dyke, at distances of 5 to 220 cm and 10 to 75 cm from the dyke margins, respectively. The dyke was also sampled along a complete cross-section.
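The anisotropy factors in the column list above follow standard definitions; as a minimal sketch, they can be recomputed from the principal susceptibilities K1 ≥ K2 ≥ K3 using the Jelínek (1981) formulas (the function name and return structure are illustrative):

```python
import math

def jelinek_params(k1, k2, k3):
    """AMS anisotropy factors from principal susceptibilities k1 >= k2 >= k3,
    using the standard Jelinek (1981) definitions."""
    km = (k1 + k2 + k3) / 3.0                     # bulk (mean) susceptibility Km
    L = k1 / k2                                   # magnetic lineation
    F = k2 / k3                                   # magnetic foliation
    P = k1 / k3                                   # degree of anisotropy
    n1, n2, n3 = math.log(k1), math.log(k2), math.log(k3)
    nm = (n1 + n2 + n3) / 3.0
    # corrected degree of anisotropy
    Pj = math.exp(math.sqrt(2.0 * ((n1 - nm) ** 2 + (n2 - nm) ** 2 + (n3 - nm) ** 2)))
    T = (2.0 * n2 - n1 - n3) / (n1 - n3)          # shape: -1 prolate, +1 oblate
    U = (2.0 * k2 - k1 - k3) / (k1 - k3)          # linear shape parameter
    return {"Km": km, "L": L, "F": F, "P": P, "Pj": Pj, "T": T, "U": U}
```

For example, normalized susceptibilities (1.2, 1.0, 0.8) give P = 1.5 with a weakly oblate shape, T ≈ 0.10.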
Crystallographic preferred orientation maps of rock salt (SaltEBSDmaps.zip) are provided as Channel text files (*.ctf) obtained using a NORDLYS II (HKL Technology) EBSD system mounted on a TESCAN Vega scanning electron microscope (Institute of Petrology and Structural Geology, Faculty of Science, Charles University in Prague) in mapping mode with a 10 µm step size. A .ctf file is a text file containing a header with information about the measured mineral phases, followed by data giving the phase and orientation at each measured point. The .ctf files can be read with the commercial HKL Technology (Oxford Instruments) software or, better, with the MATLAB® Toolbox for Quantitative Texture Analysis (MTEX) (Bachmann et al., 2010; 2011; https://code.google.com/p/mtex/), licensed under the GNU GPL v2.0.

  20. Data and code for "Change in grounding line location on the Antarctic...

    • zenodo.org
    zip
    Updated Jul 29, 2024
    + more versions
    Cite
    Benjamin J. Wallis; Anna E. Hogg; Yikai Zhu; Andrew Hooper (2024). Data and code for "Change in grounding line location on the Antarctic Peninsula measured using a tidal motion offset correlation method" by Wallis et al. 2024 [Dataset]. http://doi.org/10.5281/zenodo.13120995
    Explore at:
    zipAvailable download formats
    Dataset updated
    Jul 29, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Benjamin J. Wallis; Anna E. Hogg; Yikai Zhu; Andrew Hooper
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Antarctica, Antarctic Peninsula
    Description

    These data and code are made available to support the article "Change in grounding line location on the Antarctic Peninsula measured using a tidal motion offset correlation method" by Wallis et al. (2024).

    Includes: TMOC-method output tide correlation, Antarctic Peninsula grounding line, DInSAR data, and TMOC code.

    For the data:

    These data are made available to accompany the article "Change in grounding line location on the Antarctic Peninsula measured using a tidal motion offset correlation method" by Wallis et al. (2024).

    This dataset contains:

    AP_TMOC_tide_correlation_2019_2020.tif - Significance adjusted tide correlation values for the TMOC method for 2019-2020 for the Antarctic Peninsula.

    AP_GL_TMOC_2019_2020.shp - A continuous grounding line made from TMOC data and British Antarctic Survey Coastline Data. Intended for use by others.

    AP_GL_TMOC_2019_2020_source.shp - A discontinuous grounding line made from TMOC data and British Antarctic Survey Coastline Data including the source of each line segment.

    The folder 'Interferograms' contains the DInSAR products used in the manuscript, sorted by Sentinel-1 frame.

    For the code:

    This code is made available to support the article "Change in grounding line location on the Antarctic Peninsula measured using a tidal motion offset correlation method" by Wallis et al.

    The authors take no responsibility for the quality of results derived using this code.

    This code is licensed under a Creative Commons Attribution 4.0 International Licence: http://creativecommons.org/licenses/by/4.0/

    external functions required:
    geoimread - https://uk.mathworks.com/matlabcentral/fileexchange/46904-geoimread
    polarstereo_inv - https://uk.mathworks.com/matlabcentral/fileexchange/32907-polar-stereographic-coordinate-transformation-map-to-lat-lon
    CATS2008 tide model and TMD 2.5 MATLAB toolbox - https://www.esr.org/research/polar-tide-models/tmd-software/

    The function TMOC_GL_v8 implements the TMOC method described in Wallis et al. 2024. This is a 'bring your own data' version.

    The script pp_folder post-processes the outputs using the function LPfilt_cc.
