http://www.nationalarchives.gov.uk/doc/non-commercial-government-licence/non-commercial-government-licence.htm
The Matlab scripts will compute parametric maps from Bruker MR images as described in the JoVE paper published in 2017
This data release provides remotely sensed data, field measurements, and MATLAB code associated with an effort to produce image-derived velocity maps for a reach of the Sacramento River in California's Central Valley. Data collection occurred from September 16-19, 2024, and involved cooperators from the Intelligent Robotics Group at the National Aeronautics and Space Administration (NASA) Ames Research Center and the National Oceanic and Atmospheric Administration (NOAA) Southwest Fisheries Science Center. The remotely sensed data were obtained from an Uncrewed Aircraft System (UAS) and are stored in Robot Operating System (ROS) .bag files. Within these files, the various data types are organized into ROS topics, including: images from a thermal camera, measurements of the distance from the UAS down to the water surface made with a laser range finder, and position and orientation data recorded by a Global Navigation Satellite System (GNSS) receiver and Inertial Measurement Unit (IMU) during the UAS flights. This instrument suite is part of an experimental payload called the River Observing System (RiOS) designed for measuring streamflow; further detail is provided in the metadata file associated with this data release. For the September 2024 test flights, the RiOS payload was deployed from a DJI Matrice M600 Pro hexacopter hovering approximately 270 m above the river. At this altitude, the thermal images have a pixel size of approximately 0.38 m but are not geo-referenced. Two types of ROS .bag files are provided in separate zip folders. The first, Baguettes.zip, contains "baguettes" that include 15-second subsets of data with a reduced sampling rate for the GNSS and IMU. The second, FullBags.zip, contains the full set of ROS topics recorded by RiOS but has been subset to include only the time ranges during which the UAS was hovering in place over one of 11 cross sections along the reach. The start times are included in the .bag file names as Portable Operating System Interface (POSIX) time stamps. To view the data within ROS .bag files, the Foxglove Studio program linked below is freely available and provides a convenient interface. Note that to view the thermal images, the contrast will need to be adjusted to minimum and maximum values around 12,000 to 15,000, though some further refinement of these values might be necessary to enhance the display. To enable geo-referencing of the thermal images in a post-processing mode, another M600 hexacopter equipped with a standard visible camera was deployed along the river to acquire images from which an orthophoto was produced: 20240916_SacramentoRiver_Ortho_5cm.tif. This orthophoto has a spatial resolution of 0.05 m and is in the Universal Transverse Mercator (UTM) coordinate system, Zone 10. To assess the accuracy of the orthophoto, 21 circular aluminum ground control targets visible in both thermal and RGB (red, green, blue) images were placed in the field and their locations surveyed with a Real-Time Kinematic (RTK) GNSS receiver. The coordinates of these control points are provided in the file SacGCPs20240916.csv. Please see the metadata for additional information on the camera, the orthophoto production process, and the RTK GNSS survey. The thermal images were used as input to Particle Image Velocimetry (PIV) algorithms to infer surface flow velocities throughout the reach.
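As an aside on the display window mentioned above, the same stretch can be applied in MATLAB once a frame has been exported from a .bag file; the file name below is hypothetical and the example assumes the Image Processing Toolbox.
frame = imread('thermal_frame.png');   % hypothetical 16-bit export of one thermal frame
imshow(frame, [12000 15000])           % apply the suggested minimum/maximum display values
% With imagesc, the window can be refined interactively via clim([12000 15000]) (caxis in older releases).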
To assess the accuracy of the resulting image-derived velocity estimates, field measurements of flow velocity were obtained using a SonTek M9 acoustic Doppler current profiler (ADCP). These data were acquired along a series of 11 cross sections oriented perpendicular to the primary downstream flow direction and spaced approximately 150 m apart. At each cross section, the boat from which the ADCP was deployed made four passes across the channel and the resulting data were then aggregated into mean cross sections using the Velocity Mapping Toolbox (VMT) referenced below (Parsons et al., 2013). The VMT output was further processed as described in the metadata and ultimately led to a single comma-delimited text file, SacAdcp20240918.csv, with cross section numbers, spatial coordinates (UTM Zone 10N), cross-stream distances, velocity vector components, and water depths. To assess the sensitivity of thermal image velocimetry to environmental conditions, air and water temperatures were recorded using a pair of Onset HOBO U20 pressure transducer data loggers set to record pressure and temperature. Deploying one data logger in the air and one in the water also provided information on variations in water level during the test flights. The resulting temperature and water level time series are provided in the file HoboDataSummary.csv with a one-minute sampling interval. These data sets were used to develop and test a new framework for mapping flow velocities in river channels in approximately real time using images from a UAS as they are acquired. Prototype code for implementing this approach was developed in MATLAB and is also included in the data release as a zip folder called VelocityMappingCode.zip. Further information on the individual functions (*.m files) included within this folder is available in the metadata file associated with this data release. The code is provided as is and is intended for research purposes only. Users are advised to thoroughly read the metadata file associated with this data release to understand the appropriate use and limitations of the data and code provided herein.
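For orientation, the two comma-delimited files described above can be inspected in MATLAB as sketched below; the column names are not listed here, so the tables are simply summarized rather than addressed by name.
adcp = readtable('SacAdcp20240918.csv');   % mean cross-section ADCP summary
hobo = readtable('HoboDataSummary.csv');   % air/water temperature and water level, one-minute interval
summary(adcp)                              % inspect variable names and value ranges
head(hobo)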
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This MATLAB package provides a FAPUX implementation for fast, accurate, and robust phase unwrapping, designed to handle challenging phase maps with discontinuities and intricate spatial structures (e.g., spiral shear phase maps). While it excels in complex scenarios where conventional algorithms often fail, it is equally applicable to smooth, noise-free phase data, making it a versatile tool for a wide range of applications. It is particularly useful in optical metrology, interferometry, SAR/InSAR imaging, and medical diagnostics, where reliable phase recovery is essential. The scripts support the paper D. Khodadad, "Fast and accurate phase unwrapping for complex phase maps," Applied Optics, Vol. 64, No. 24, 2025, DOI: https://doi.org/10.1364/AO.567228. If you use this code, cite:
D. Khodadad, "Fast and accurate phase unwrapping for complex phase maps," Applied Optics, Vol. 64, No. 24, 10 August 2025. https://doi.org/10.1364/AO.567228
D. Khodadad, "MATLAB code and example data for fast and accurate phase unwrapping of complex phase maps," figshare (2025). https://doi.org/10.6084/m9.figshare.29571782
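As a neutral illustration of the kind of input such a package operates on, the snippet below builds a wrapped test phase in MATLAB; the FAPUX entry-point function is not named here, so the final unwrapping call is left as a placeholder comment.
[X, Y] = meshgrid(linspace(-1, 1, 256));   % unit grid
truePhase = 20*pi*(X.^2 + Y.^2);           % smooth quadratic phase spanning several wraps
wrapped = angle(exp(1i*truePhase));        % wrap to the interval (-pi, pi]
% unwrapped = <FAPUX unwrapping function>(wrapped);   % see the package documentation for the actual call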
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This database is the supplementary material of Tasseron et al. (2022): 'Towards Robust River Plastic Detection: Combining Lab and Field-based Hyperspectral Imagery' [submitted and currently under review]; a preprint is available online at https://doi.org/10.31223/X5RW7V. The dataset contains raw images, MATLAB scripts used for training classifier algorithms, trained pipelines, required toolboxes, and labelled training datasets used in subsequent analyses.
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
This dataset contains MATLAB code for the creation of regime maps for gravure-printed patterns from the HYPA-p dataset. The regime maps show the location of the three fluid-splitting regimes, namely point splitting, lamella splitting, and the transition regime, in a map of the tonal value of the printing form versus printing velocity. The input to the code is the inference results of the trained convolutional neural networks (CNNs), as provided here. In this context, inference means the automated classification of unlabeled data. Further information can be found in the dissertation of Pauline Rothmann-Brumm (2023) and in the provided README file.
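A minimal sketch of what such a regime map might look like is given below; the sample values, class labels, and axis units are illustrative assumptions, not the HYPA-p inference format (gscatter requires the Statistics and Machine Learning Toolbox).
tonalValue = [10 10 40 40 70 70];         % tonal value of the printing form (%)
velocity   = [0.5 1.5 0.5 1.5 0.5 1.5];   % printing velocity (m/s)
regime     = categorical(["point" "transition" "transition" "lamella" "lamella" "lamella"]);
gscatter(velocity, tonalValue, regime)
xlabel('Printing velocity (m/s)'); ylabel('Tonal value (%)'); title('Fluid splitting regime map')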
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset was obtained at the Queensland Brain Institute, Australia, using a 64-channel Biosemi EEG system. 21 healthy participants completed an auditory oddball paradigm (as described in Garrido et al., 2017). For a description of the oddball paradigm, please see: Garrido, M.I., Rowe, E.G., Halasz, V., & Mattingley, J. (2017). Bayesian mapping reveals that attention boosts neural responses to predicted and unpredicted stimuli. Cerebral Cortex, 1-12. DOI: 10.1093/cercor/bhx087. If you use this dataset, please cite its DOI, as well as the associated methods paper: Harris, C.D., Rowe, E.G., Randeniya, R. and Garrido, M.I. (2018). Bayesian Model Selection Maps for group studies using M/EEG data. For scripts to analyse the data, please see: https://github.com/ClareDiane/BMS4EEG
Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
This data set is uploaded as supporting information for the publication entitled 'A Comprehensive Tutorial on the SOM-RPM Toolbox for MATLAB'. The attached file 'case_study' includes the following:
X: Data from a ToF-SIMS hyperspectral image; a stage raster containing 960 x 800 pixels with 963 associated m/z peaks.
pk_lbls: The m/z label for each of the 963 m/z peaks.
mdl and mdl_masked: SOM-RPM models created using the SOM-RPM tutorial provided within the cited article.
Additional details about the datasets can be found in the published article. V2 contains modified peak lists to show intensity-weighted m/z rather than peak midpoint. If you use this data set in your work, please cite our work as follows: [LINK TO BE ADDED TO PAPER ONCE DOI RECEIVED]
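Assuming the attachment is distributed as a MAT-file (the exact file name and format are not stated here), its contents can be inspected in MATLAB as follows.
whos('-file', 'case_study.mat')   % list X, pk_lbls, mdl, mdl_masked and their sizes
S = load('case_study.mat');       % load everything into a struct for further use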
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
% This function evaluates the coordinates and radii of the pore bodies
% and throats and returns a label image of the segmented pore system.
% The algorithm computes the distance map of a binary image and then uses
% its curvatures to locate the saddle points.
% The pore-body centers are located at the local maxima of the Hessian-matrix
% determinant map, and the saddle points at its local minima.
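% A minimal sketch of this approach follows, assuming a 2-D binary image BW of the
% pore space (true = pore) and the Image Processing Toolbox; it illustrates the idea
% and is not the released function itself.
D = bwdist(~BW);                          % distance map of the binary image
[gx,  gy ] = gradient(D);                 % first derivatives of the distance map
[gxx, gxy] = gradient(gx);                % second derivatives
[~,   gyy] = gradient(gy);
detH = gxx.*gyy - gxy.^2;                 % determinant of the Hessian of the distance map
bodySeeds   = imregionalmax(detH) & BW;   % pore-body centers: local maxima of det(H)
saddleSeeds = imregionalmin(detH) & BW;   % throat (saddle) points: local minima of det(H)
L = watershed(-D);                        % label image of the segmented pore system
L(~BW) = 0;                               % restrict labels to the pore space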
Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques. The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage and the neural network-based approach takes advantage of this high density time series of information by estimating depth via one of four NNDR methods described in the manuscript: 1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR. 2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map. 3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map. 4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map. MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m and the figure included on this landing page provides a flow chart illustrating the four different neural network-based depth retrieval methods. As examples of the resulting models, MATLAB *.mat data files containing the best-performing neural network model for each site are provided below, along with a file that lists the PlanetScope image identifiers for the images that were used for each site. To develop and test this new NNDR approach, the method was applied to satellite images from three rivers across the U.S.: the American, Colorado, and Potomac. For each site, field measurements of water depth available through other data releases were used for training and validation. 
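For readers who find a concrete sketch helpful, the difference between the first two ensembling strategies can be written as below; imgStack and nndr() are placeholders (a reflectance time series and a trained depth-retrieval network), not objects from this data release.
[nRows, nCols, ~, nImages] = size(imgStack);          % nRows x nCols x nBands x nImages
depthMeanSpec = nndr(mean(imgStack, 4, 'omitnan'));   % 1. Mean-spec: average the images, retrieve depth once
depthSeries = zeros(nRows, nCols, nImages);
for k = 1:nImages
    depthSeries(:,:,k) = nndr(imgStack(:,:,:,k));     % per-image depth retrieval
end
depthMeanDepth = mean(depthSeries, 3, 'omitnan');     % 2. Mean-depth: average the per-image depth maps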
The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: X_mean-spec.tif, X_mean-depth.tif, X_NN-depth.tif, and X-single-image.tif, where X denotes the site name. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.
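A quick way to load and display one of these products in MATLAB is sketched below; it assumes the Mapping Toolbox, uses one file name from the naming scheme above as an example, and treats non-positive pixel values as no-data, which is an assumption rather than a documented convention.
[depth, R] = readgeoraster('American_NN-depth.tif');   % depth estimates in meters, 3 m pixels
depth(depth <= 0) = NaN;                               % assumed no-data handling
mapshow(depth, R, 'DisplayType', 'surface'); colorbar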
This dataset provides the Matlab sediment transport and land accretion model at Wax Lake Delta (WLD), Atchafalaya Basin, in coastal Louisiana. The data include the Matlab scripts that solve the advection and Exner equations to simulate suspended sediment transport and accretion at WLD. The model requires modeled flow information from a separate ANUGA hydrodynamic model as input. For this study, ANUGA-modeled flow information from the Delta-X Spring and Fall 2021 campaigns was used as input. The ANUGA output files are converted to variables used by this Matlab model using pre-processing tools. The main code calculates suspended sediment fluxes and accretion rates of mud and sand as a function of space and time. The cumulative sediment accretion from each campaign was then used to estimate an annualized land accretion map using a weighted-average formula. The final product, the one-year upscaled land accretion map, is archived as a separate dataset.
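As a point of reference for the governing equation, a minimal 1-D explicit Exner-type bed update is sketched below with illustrative values; it is not the released WLD model, which couples ANUGA flow fields with advection of suspended sediment.
dx = 10; dt = 3600; lambda_p = 0.4;                 % grid spacing (m), time step (s), bed porosity
x   = (0:dx:1000)';                                 % 1-D transect
eta = zeros(size(x));                               % initial bed elevation (m)
qs  = 1e-4 * exp(-((x - 300)/100).^2);              % assumed sediment flux per unit width (m^2/s)
eta = eta - dt/(1 - lambda_p) * gradient(qs, dx);   % Exner: d(eta)/dt = -(1/(1-lambda_p)) * d(qs)/dx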
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Excel file containing the topography of the CLV area written in ASCII format. (XLSX 1035 kb)
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains Hadamard pattern illumination and time-domain diffuse optical tomography (TD-DOT) measurements, including both simulated time-domain data and frequency-domain converted data. MATLAB scripts are provided for visualizing the Hadamard patterns and the data, and for reconstructing 3D maps of optical properties.
The dataset supports the forthcoming publication “Spatial and temporal modulation in time-domain diffuse optical tomography".
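For readers unfamiliar with Hadamard illumination, the snippet below generates and displays a small set of patterns in base MATLAB; the 8 x 8 layout is purely illustrative and is not the pattern scheme used in the dataset.
H = hadamard(64);                          % 64 x 64 Hadamard matrix with entries +/-1
patterns = reshape((H + 1)/2, 8, 8, []);   % binarize to 0/1 and fold each column into an 8 x 8 pattern
imagesc(patterns(:,:,2)); axis image; colormap gray   % display the second pattern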
Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques. The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage and the neural network-based approach takes advantage of this high density time series of information by estimating depth via one of four NNDR methods described in the manuscript: 1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR. 2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map. 3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map. 4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map. MATLAB (Version 24.1, including the Deep Learning Toolbox) for performing this analysis is provided in the function NN_depth_ensembling.m available on the main landing page for the data release of which this is a child item, along with a flow chart illustrating the four different neural network-based depth retrieval methods. To develop and test this new NNDR approach, the method was applied to satellite images from the American River near Fair Oaks, CA, acquired in October 2020. Field measurements of water depth available through another data release (Legleiter, C.J., and Harrison, L.R., 2022, Field measurements of water depth from the American River near Fair Oaks, CA, October 19-21, 2020: U.S. Geological Survey data release, https://doi.org/10.5066/P92PNWE5) were used for training and validation. 
The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: American_mean-spec.tif, American_mean-depth.tif, American_NN-depth.tif, and American-single-image.tif. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This document describes how to identify and extract ALM neurons from the MouseLight database of individual reconstructed neurons (https://ml-neuronbrowser.janelia.org/) to define ALM projection zones (relevant to Chen, Liu et al., Cell, 2023). All scripts are in Matlab R2022b.
/MouseLight_figshare/MouseLightComplete contains all reconstructed single neurons from the MouseLight data set, in .json and .swc formats.
Use 'ExtractMouseLightNeuronsFromJsonFiles.m' to extract the MouseLight neuron ID, soma coordinates and annotation, and axon coordinates from the above directory.
Use 'ExtractMouseLightALMneurons.m' to identify and extract ALM neurons from the MouseLight data set. ALM neurons are defined based on functional maps of ALM (photoinhibition) in the CCF coordinate system, contained in 'ALM_functionalData.nii' (from Li, Daie, et al., Nature, 2016).
Use 'ALMprojDensity.m' to compute and generate an ALM projection map based on axonal density. The map is saved in 'ALM_mask_150um3Dgauss_Bilateral.mat' as smoothed (3D Gaussian, sigma = 150 um) axonal density in a 3D matrix, F_smooth. First axis: dorsal-ventral; second axis: medial-lateral; third axis: anterior-posterior.
Use 'medial_lateral_ALMprojDensities.m' to compute and generate medial and lateral ALM projection maps separately. Medial ALM soma locations are < 1.5 mm from the midline; lateral ALM soma locations are > 1.5 mm from the midline. The maps are saved in 'medialALM_mask_150um3Dgauss_Bilateral.mat' and 'lateralALM_mask_150um3Dgauss_Bilateral.mat' as smoothed (3D Gaussian, sigma = 150 um) axonal density in 3D matrices, respectively.
Use 'PlotALMinCCF.m' to plot voxels of ALM in CCF, defined by the functional maps in 'ALM_functionalData.nii'.
Use 'PlotMouseLightALMneurons.m' to plot ALM neurons (all, medial, or lateral) in CCF; figures are saved in .tiff format.
Other functions: 'loadTifFast.m' is called to load the CCF .tif file (Annotation_new_10_ds222_16bit.tif); 'plotCCFbrain.m' is called to plot an isosurface of the CCF brain (Annotation_new_10_ds222_16bit.tif).
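Since the reconstructions are also provided in the standard .swc format (columns: sample id, structure type, x, y, z, radius, parent), a single neuron can be read and plotted as sketched below; the file name is hypothetical, and NumHeaderLines should match the number of '#' comment lines at the top of the file.
swc = readmatrix('exampleNeuron.swc', 'FileType', 'text', 'NumHeaderLines', 0);
isAxon = swc(:,2) == 2;                  % SWC structure type 2 = axon
plot3(swc(isAxon,3), swc(isAxon,4), swc(isAxon,5), '.', 'MarkerSize', 2)
axis equal; xlabel('x'); ylabel('y'); zlabel('z')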
This page provides the source code and results underlying the manuscript: Andrzejak RG, Ruzzene G, Schöll E, Omelchenko I (2020) Two populations of coupled quadratic maps exhibit a plentitude of symmetric and symmetry broken dynamics. Chaos, 30, 033125. If you use any of these resources, please make sure that you cite this reference. For more detailed information, please refer to https://www.upf.edu/web/ntsa/downloads MATLAB source code (.m) and MATLAB data (.mat)
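A minimal, self-contained sketch of two mean-field-coupled populations of quadratic maps is given below; the parameter values and coupling form are illustrative assumptions, and the cited source code should be consulted for the exact model of the paper.
N = 50; T = 1000; a = 1.4;   % maps per population, iterations, map parameter
sigma = 0.10; mu = 0.03;     % assumed intra- and inter-population coupling strengths
x = 2*rand(2, N) - 1;        % initial states of populations 1 and 2
for t = 1:T
    f = a - x.^2;            % local quadratic map f(x) = a - x^2
    m = mean(f, 2);          % mean field of each population
    x(1,:) = f(1,:) + sigma*(m(1) - f(1,:)) + mu*(m(2) - f(1,:));
    x(2,:) = f(2,:) + sigma*(m(2) - f(2,:)) + mu*(m(1) - f(2,:));
end
plot(x(1,:), x(2,:), '.')    % snapshot of final states, population 1 vs population 2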
http://researchdatafinder.qut.edu.au/display/n4066
Matlab code to access results exported from Python. Users will need to alter the code to point to the correct file path (line 5). The folder containing this and the other Matlab code needs to contain a... QUT Research Data Repository dataset resource available for download.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
Data for "TrueEBSD: correcting spatial distortions in electron backscatter diffraction maps" published in Ultramicroscopy.
Journal DOI: https://doi.org/10.1016/j.ultramic.2020.113130;
Preprint: https://arxiv.org/abs/1909.00347.
The zipped folder contains:
Each data subfolder contains:
The 'MATLAB scripts' subfolder contains TrueEBSD source code:
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Data set for: Le Merre P, Esmaeili V, Charrière E, Galan K, Salin P-A, Petersen CCH, Crochet S (2018) Reward-based learning drives rapid sensory signals in medial prefrontal cortex and dorsal hippocampus necessary for goal-directed behavior. Neuron, https://doi.org/10.1016/j.neuron.2017.11.031
There are 44 files in this data upload:
1. '2018_LeMerre_Neuron.pdf' - a pdf version of the online publication.
2. 'Chronic_LFP_data.mat' - a Matlab data structure containing all the chronic LFP data for the publication.
3. 'Silicon_Probe_data.mat' - a Matlab data structure containing all the mPFC silicon probe recording data for the publication.
4. 'Opto_Inactivation_data.mat' - a Matlab data structure containing all the optogenetic inactivation data for the publication.
5. 'Mus_Inactivation_data.mat' - a Matlab data structure containing all the pharmacological (Muscimol) inactivation data for the publication.
6. 'Learning_Days_Mtrx.mat' - a Matlab data file containing the selected training days analyzed for the Trained condition in the Detection Task.
7. 'Exposed_Days_Mtrx.mat' - a Matlab data file containing the selected days analyzed for the Exposed condition in the Neutral Exposure.
8. 'p_value_colormap.mat' - the color map used to display the p value in 'plot_fig2A_SEP_D1_vs_Trained.m', 'plot_fig2B_Amplitude_D1_vs_Trained.m', 'plot_fig3A_SEP_D1_vs_Exposed.m', and 'plot_fig4A_SEP_H_vs_M.m'.
9. 'p_value_colormap2.mat' - the color map used to display the p value in 'plot_figS3B_Stim_vs_Catch_for_significantly_inc_dec_units.m' and 'plot_figS4A_H_vs_M_for_inc_dec_units_and_zscored_PSTH.m'.
10. 'scatterplot_colormap.mat' - the color map used in 'plot_fig2C_Scatterplot_Amplitude_vs_dprime.m'.
11. 'SEP_colormtrx.mat' - the color map used in 'plot_fig1B_Sensory_Evoked_Potentials.m' and 'plot_figS3A_SEP_EMG_amplitude_ReactionTime.m'.
12. 'zscore_colormap.mat' - the color map used in 'plot_fig3D_mPFC_PSTH_and_zscore_DT_vs_NE.m' and 'plot_figS4A_H_vs_M_for_inc_dec_units_and_zscored_PSTH.m'.
13. 'Chronic_LFP_dataViewer.fig' - the GUI layout for 'Chronic_LFP_dataViewer.m'.
14. 'Chronic_LFP_dataViewer.m' - Matlab code that displays the data contained in 'Chronic_LFP_data.mat'.
15. 'Silicon_Probe_dataViewer.fig' - the GUI layout for 'Silicon_Probe_dataViewer.m'.
16. 'Silicon_Probe_dataViewer.m' - Matlab code that displays the data contained in 'Silicon_Probe_data.mat'.
17. 'plot_fig1B_Sensory_Evoked_Potentials.m' - analyses 'Chronic_LFP_data.mat' and displays the results published in figure 1, panel B (Le Merre et al., 2018).
18. 'plot_fig1C_Silicon_Probe_Hit_trials.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure 1, panel C.
19. 'plot_fig2A_SEP_D1_vs_Trained.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 2, panel A.
20. 'plot_fig2B_Amplitude_D1_vs_Trained.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 2, panel B.
21. 'plot_fig2C_Scatterplot_Amplitude_vs_dprime.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 2, panel C.
22. 'plot_fig3A_SEP_D1_vs_Exposed.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 3, panel A.
23. 'plot_fig3B_Amplitude_D1_vs_Exposed.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 3, panel B.
24. 'plot_fig3C_ROC_Trained_vs_Exposed.m' - analyses 'Chronic_LFP_data.mat' and displays the ROCs of figure 3, panel C.
25. 'plot_fig3C_ROC_Randomization.m' - analyses 'Chronic_LFP_data.mat' and displays the label-shuffled ROCs of figure 3, panel C.
26. 'plot_fig3D_mPFC_PSTH_and_zscore_DT_vs_NE.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure 3, panel D.
27. 'plot_fig4A_SEP_H_vs_M.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 4, panel A.
28. 'plot_fig4B_Amplitude_H_vs_M.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure 4, panel B.
29. 'plot_fig4C_mPFC_PSTH_Hit_vs_Miss.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure 4, panel C, left panel.
30. 'plot_fig4C_Scatterplot_modulation_Hit_vs_Miss.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure 4, panel C, right panel.
31. 'plot_fig4D_Photoinhibitions.m' - analyses 'Opto_Inactivation_data.mat' and displays the results published in figure 4, panel D.
32. 'plot_figS2D_Performance_DetectionTask_NeutralExposition.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure S2, panel D.
33. 'plot_figS3A_SEP_EMG_amplitude_ReactionTime.m' - analyses 'Chronic_LFP_data.mat' and reproduces figure S3, panel A.
34. 'plot_figS3B_Stim_vs_Catch_for_significantly_inc_dec_units.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure S3, panel B.
35. 'plot_figS4A_H_vs_M_for_inc_dec_units_and_zscored_PSTH.m' - analyses 'Silicon_Probe_data.mat' and reproduces figure S4, panel A.
36. 'plot_figS4B_Pharmacological_Inactivations.m' - analyses 'Mus_Inactivation_data.mat' and displays the results published in figure S4.
37. 'Load_LFP_Multisite_database.m' - called by the codes that analyze the data in 'Chronic_LFP_data.mat'.
38. 'Load_Silicon_Probe_database.m' - called by the codes that analyze the data in 'Silicon_Probe_data.mat'.
39. 'Load_Optogenetic_Inactivation_database.m' - called by the code that analyzes the data in 'Opto_Inactivation_data.mat'.
40. 'Load_Pharmacological_Inactivation_database.m' - called by the code that analyzes the data in 'Mus_Inactivation_data.mat'.
41. 'bonf_holm.m' - developed by D. M. Groppe; called in 'plot_figS4B_Pharmacological_Inactivations.m': https://ch.mathworks.com/matlabcentral/fileexchange/28303-bonferroni-holm-correction-for-multiple-comparisons
42. 'boundedline.m' - developed by K. Kearney; called in 'plot_fig1C_Silicon_Probe_Hit_trials.m', 'plot_fig2A_SEP_D1_vs_Trained.m', 'plot_fig3A_SEP_D1_vs_Exposed.m', 'plot_fig3C_ROC_Trained_vs_Exposed.m', 'plot_fig3D_mPFC_PSTH_and_zscore_DT_vs_NE.m', 'plot_fig4A_SEP_H_vs_M.m', 'plot_fig4C_mPFC_PSTH_Hit_vs_Miss.m', 'plot_figS3B_Stim_vs_Catch_for_significantly_inc_dec_units.m', and 'plot_figS4A_H_vs_M_for_inc_dec_units_and_zscored_PSTH.m': https://ch.mathworks.com/matlabcentral/fileexchange/27485-boundedline-m
43. 'inpaint_nans.m' - called in 'boundedline.m'.
44. 'PSTH_Simple.m' - developed by V. Esmaeili; called in 'plot_fig1C_Silicon_Probe_Hit_trials.m', 'plot_fig3D_mPFC_PSTH_and_zscore_DT_vs_NE.m', 'plot_fig4C_mPFC_PSTH_Hit_vs_Miss.m', 'plot_figS3B_Stim_vs_Catch_for_significantly_inc_dec_units.m', and 'plot_figS4A_H_vs_M_for_inc_dec_units_and_zscored_PSTH.m'.
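A minimal way to inspect one of these data structures in MATLAB, without assuming its internal field names, is:
whos('-file', 'Chronic_LFP_data.mat')   % list the variables stored in the file
S = load('Chronic_LFP_data.mat');       % load into a struct
fieldnames(S)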
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This package contains the datasets and supplementary materials used in the IPIN 2021 Competition.
Contents:
IPIN2021_Track03_TechnicalAnnex_V1-02.pdf: Technical annex describing the competition
01-Logfiles: This folder contains a subfolder with the 105 training logfiles (80 single-floor indoor, 10 in outdoor areas, 10 in the indoor auditorium with floor transitions, and 5 in floor-transition zones), a subfolder with the 20 validation logfiles, and a subfolder with the 3 blind evaluation logfiles as provided to competitors.
02-Supplementary_Materials: This folder contains the matlab/octave parser, the raster maps, the files for the matlab tools and the trajectory visualization.
03-Evaluation: This folder contains the scripts used to calculate the competition metric, the 75th percentile of the error over the 82 evaluation points (see the sketch after this list). It requires the Matlab Mapping Toolbox. The ground truth is also provided as 3 csv files. Since the results must be provided at a 2 Hz frequency starting from apptimestamp 0, the GT files include the closest timestamp matching the timing provided by competitors for the 3 evaluation logfiles. It also contains samples of reported estimations and the corresponding results.
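A minimal sketch of this headline metric follows; it assumes a vector err of point-wise positioning errors (in meters, including any penalties defined in the technical annex) at the 82 evaluation points, and prctile requires the Statistics and Machine Learning Toolbox.
metric = prctile(err, 75);   % 75th percentile of the point-wise error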
Please, cite the following works when using the datasets included in this package:
Torres-Sospedra, J., et al. Datasets and Supporting Materials for the IPIN 2021 Competition Track 3 (Smartphone-based, off-site). http://dx.doi.org/10.5281/zenodo.5948678
Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques. The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage and the neural network-based approach takes advantage of this high density time series of information by estimating depth via one of four NNDR methods described in the manuscript: 1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR. 2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map. 3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map. 4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map. MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m available on the main landing page for the data release of which this is a child item, along with a flow chart illustrating the four different neural network-based depth retrieval methods. To develop and test this new NNDR approach, the method was applied to satellite images from the Colorado River near Lees Ferry, AZ, acquired in March and April of 2021. Field measurements of water depth available through another data release (Legleiter, C.J., Debenedetto, G.P., and Forbes, B.T., 2022, Field measurements of water depth from the Colorado River near Lees Ferry, AZ, March 16-18, 2021: U.S. Geological Survey data release, https://doi.org/10.5066/P9HZL7BZ) were used for training and validation. 
The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: Colorado_mean-spec.tif, Colorado_mean-depth.tif, Colorado_NN-depth.tif, and Colorado-single-image.tif. In addition, to assess the robustness of the Mean-spec and NN-depth methods to the introduction of a large pulse of sediment by a flood event that occurred partway through the image time series, depth maps from before and after the flood are provided in the files Colorado_Mean-spec_after_flood.tif, Colorado_Mean-spec_before_flood.tif, Colorado_NN-depth_after_flood.tif, and Colorado_NN-depth_before_flood.tif. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.