95 datasets found
  1. Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024

    • catalog.data.gov
    Updated Nov 5, 2024
    Cite
    U.S. Geological Survey (2024). Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024 [Dataset]. https://catalog.data.gov/dataset/remotely-sensed-data-field-measurements-and-matlab-code-used-to-produce-image-derived-v-19
    Explore at:
    Dataset updated
    Nov 5, 2024
    Dataset provided by
    U.S. Geological Survey
    Area covered
    Glenn, Sacramento River
    Description

    This data release provides remotely sensed data, field measurements, and MATLAB code associated with an effort to produce image-derived velocity maps for a reach of the Sacramento River in California's Central Valley. Data collection occurred from September 16-19, 2024, and involved cooperators from the Intelligent Robotics Group at the National Aeronautics and Space Administration (NASA) Ames Research Center and the National Oceanic and Atmospheric Administration (NOAA) Southwest Fisheries Science Center. The remotely sensed data were obtained from an Uncrewed Aircraft System (UAS) and are stored in Robot Operating System (ROS) .bag files. Within these files, the various data types are organized into ROS topics including: images from a thermal camera, measurements of the distance from the UAS down to the water surface made with a laser range finder, and position and orientation data recorded by a Global Navigation Satellite System (GNSS) receiver and Inertial Measurement Unit (IMU) during the UAS flights. This instrument suite is part of an experimental payload called the River Observing System (RiOS) designed for measuring streamflow; further detail is provided in the metadata file associated with this data release. For the September 2024 test flights, the RiOS payload was deployed from a DJI Matrice M600 Pro hexacopter hovering approximately 270 m above the river. At this altitude, the thermal images have a pixel size of approximately 0.38 m but are not geo-referenced.

    Two types of ROS .bag files are provided in separate zip folders. The first, Baguettes.zip, contains "baguettes" that include 15-second subsets of data with a reduced sampling rate for the GNSS and IMU. The second, FullBags.zip, contains the full set of ROS topics recorded by RiOS but has been subset to include only the time ranges during which the UAS was hovering in place over one of 11 cross sections along the reach. The start times are included in the .bag file names as Portable Operating System Interface (POSIX) time stamps. To view the data within ROS .bag files, the Foxglove Studio program linked below is freely available and provides a convenient interface. Note that to view the thermal images, the contrast will need to be adjusted to minimum and maximum values around 12,000 to 15,000, though some further refinement of these values might be necessary to enhance the display.

    To enable geo-referencing of the thermal images in a post-processing mode, another M600 hexacopter equipped with a standard visible camera was deployed along the river to acquire images from which an orthophoto was produced: 20240916_SacramentoRiver_Ortho_5cm.tif. This orthophoto has a spatial resolution of 0.05 m and is in the Universal Transverse Mercator (UTM) coordinate system, Zone 10. To assess the accuracy of the orthophoto, 21 circular aluminum ground control targets visible in both thermal and RGB (red, green, blue) images were placed in the field and their locations surveyed with a Real-Time Kinematic (RTK) GNSS receiver. The coordinates of these control points are provided in the file SacGCPs20240916.csv. Please see the metadata for additional information on the camera, the orthophoto production process, and the RTK GNSS survey.

    The thermal images were used as input to Particle Image Velocimetry (PIV) algorithms to infer surface flow velocities throughout the reach. To assess the accuracy of the resulting image-derived velocity estimates, field measurements of flow velocity were obtained using a SonTek M9 acoustic Doppler current profiler (ADCP). These data were acquired along a series of 11 cross sections oriented perpendicular to the primary downstream flow direction and spaced approximately 150 m apart. At each cross section, the boat from which the ADCP was deployed made four passes across the channel and the resulting data were then aggregated into mean cross sections using the Velocity Mapping Toolbox (VMT) referenced below (Parsons et al., 2013). The VMT output was further processed as described in the metadata and ultimately led to a single comma-delimited text file, SacAdcp20240918.csv, with cross section numbers, spatial coordinates (UTM Zone 10N), cross-stream distances, velocity vector components, and water depths.

    To assess the sensitivity of thermal image velocimetry to environmental conditions, air and water temperatures were recorded using a pair of Onset HOBO U20 pressure transducer data loggers set to record pressure and temperature. Deploying one data logger in the air and one in the water also provided information on variations in water level during the test flights. The resulting temperature and water level time series are provided in the file HoboDataSummary.csv with a one-minute sampling interval. These data sets were used to develop and test a new framework for mapping flow velocities in river channels in approximately real time using images from a UAS as they are acquired. Prototype code for implementing this approach was developed in MATLAB and is included in the data release as a zip folder called VelocityMappingCode.zip. Further information on the individual functions (*.m files) included within this folder is available in the metadata file associated with this data release. The code is provided as is and is intended for research purposes only. Users are advised to thoroughly read the metadata file associated with this data release to understand the appropriate use and limitations of the data and code provided herein.
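    The .bag file naming convention described above embeds the hover start time as a POSIX time stamp, which can be decoded in a few lines of code. A minimal sketch in Python, assuming a hypothetical file name pattern in which the time stamp is the last underscore-separated token of the name:

```python
from datetime import datetime, timezone
from pathlib import Path

def bag_start_time(bag_path):
    """Decode the POSIX time stamp embedded in a .bag file name.

    Assumes (hypothetically) that the time stamp is the last
    underscore-separated token of the stem, e.g. 'xs03_1726685100.bag'.
    """
    token = Path(bag_path).stem.split("_")[-1]
    return datetime.fromtimestamp(float(token), tz=timezone.utc)

# 1726685100 seconds after the Unix epoch falls in mid-September 2024 (UTC)
start = bag_start_time("xs03_1726685100.bag")
```

    The actual file names in Baguettes.zip and FullBags.zip should be checked against this assumed pattern before reuse.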

  2. EEG_Auditory_Oddball_Preprocessed_Data

    • figshare.com
    bin
    Updated Jan 31, 2019
    + more versions
    Cite
    Clare D Harris; Elise G Rowe; Roshini Randeniya; Marta I Garrido (2019). EEG_Auditory_Oddball_Preprocessed_Data [Dataset]. http://doi.org/10.6084/m9.figshare.5812764.v1
    Explore at:
    Available download formats: bin
    Dataset updated
    Jan 31, 2019
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Clare D Harris; Elise G Rowe; Roshini Randeniya; Marta I Garrido
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset was obtained at the Queensland Brain Institute, Australia, using a 64-channel BioSemi EEG system. 21 healthy participants completed an auditory oddball paradigm, as described in Garrido et al. (2017):

    Garrido, M.I., Rowe, E.G., Halasz, V., & Mattingley, J. (2017). Bayesian mapping reveals that attention boosts neural responses to predicted and unpredicted stimuli. Cerebral Cortex, 1-12. DOI: 10.1093/cercor/bhx087

    If you use this dataset, please cite its DOI as well as the associated methods paper: Harris, C.D., Rowe, E.G., Randeniya, R. and Garrido, M.I. (2018). Bayesian Model Selection Maps for group studies using M/EEG data.

    For scripts to analyse the data, please see: https://github.com/ClareDiane/BMS4EEG

  3. Data set for a comprehensive tutorial on the SOM-RPM toolbox for MATLAB

    • opal.latrobe.edu.au
    • researchdata.edu.au
    hdf
    Updated Aug 22, 2024
    Cite
    Sarah Bamford; Wil Gardner; Paul Pigram; Ben Muir; David Winkler; Davide Ballabio (2024). Data set for a comprehensive tutorial on the SOM-RPM toolbox for MATLAB [Dataset]. http://doi.org/10.26181/25648905.v2
    Explore at:
    Available download formats: hdf
    Dataset updated
    Aug 22, 2024
    Dataset provided by
    La Trobe
    Authors
    Sarah Bamford; Wil Gardner; Paul Pigram; Ben Muir; David Winkler; Davide Ballabio
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    This data set is uploaded as supporting information for the publication entitled: A Comprehensive Tutorial on the SOM-RPM Toolbox for MATLAB. The attached file 'case_study' includes the following:

    X: Data from a ToF-SIMS hyperspectral image. A stage raster containing 960 x 800 pixels with 963 associated m/z peaks.
    pk_lbls: The m/z label for each of the 963 m/z peaks.
    mdl and mdl_masked: SOM-RPM models created using the SOM-RPM tutorial provided within the cited article.

    Additional details about the datasets can be found in the published article. V2 contains modified peak lists to show intensity-weighted m/z rather than peak midpoint. If you use this data set in your work, please cite our work as follows: [LINK TO BE ADDED TO PAPER ONCE DOI RECEIVED]
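    As a rough illustration of the data shapes described above (not the toolbox's actual API), a stage raster of pixels with associated m/z peaks is typically flattened to a pixels-by-peaks matrix before SOM training and reshaped back afterwards. A Python/NumPy sketch with a zero-filled placeholder array, scaled down to one-tenth of the stated 960 x 800 raster per spatial axis:

```python
import numpy as np

# The stated raster is 960 x 800 pixels with 963 m/z peaks; a toy array
# one-tenth that size per spatial axis keeps the illustration lightweight.
rows, cols, peaks = 96, 80, 963
X = np.zeros((rows, cols, peaks), dtype=np.float32)

# Flatten to a (pixels x peaks) matrix, the usual layout for SOM training,
# then restore the image shape to map trained units back onto pixels.
X2d = X.reshape(-1, peaks)
X_img = X2d.reshape(rows, cols, peaks)
```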

  4. JoVE article Matlab software

    • data.wu.ac.at
    txt
    Updated Nov 28, 2017
    + more versions
    Cite
    Science (2017). JoVE article Matlab software [Dataset]. https://data.wu.ac.at/schema/data_bris_ac_uk_data_/M2IzMjcyMWItNjI1Yi00MDI3LWFmYjktNjQzOWFiYWY5ZTY3
    Explore at:
    Available download formats: txt(72686.0), txt(295986.0), txt(2681.0), txt(1324.0), txt(6089.0), txt(4666.0), txt(38233.0), txt(18038.0), txt(3418.0), txt(11908.0), txt(3259.0), txt(4862.0), txt(455.0), txt(8291.0), txt(4936.0), txt(147.0)
    Dataset updated
    Nov 28, 2017
    Dataset provided by
    Science
    License

    Non-Commercial Government Licence: http://www.nationalarchives.gov.uk/doc/non-commercial-government-licence/non-commercial-government-licence.htm

    Description

    The Matlab scripts compute parametric maps from Bruker MR images as described in the JoVE paper published in 2017.

  5. SamSrf v5.84 (pRF mapping toolbox) - OUT OF DATE!

    • figshare.com
    zip
    Updated Sep 25, 2024
    Cite
    Data from SamPenDu (2024). SamSrf v5.84 (pRF mapping toolbox) - OUT OF DATE! [Dataset]. http://doi.org/10.6084/m9.figshare.1344765.v25
    Explore at:
    Available download formats: zip
    Dataset updated
    Sep 25, 2024
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Data from SamPenDu
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is an out-of-date version of our pRF mapping toolbox and is no longer supported! While this version should be stable, USE IT AT YOUR OWN RISK! We instead recommend our new version, SamSrf X, which will continue to be updated with bugfixes and new features. You can also use the new version for further analysis of maps from the old version. SamSrf X is available for download at: http://osf.io/2rgsm

    Version: 5.84 (18-09-2017)

    Our Matlab toolbox for pRF mapping analysis. Uses SPM8 or SPM12 and FreeSurfer functionality for preprocessing. Also requires the Statistics Toolbox, Optimization Toolbox, and Curve Fitting Toolbox (not strictly necessary) for Matlab. An extensive documentation "cookbook" is included. Please contact Sam (sampendu.wordpress.com) with any questions, but please be advised that we are not able to provide tech support for people we don't collaborate with. As of version 5.63, we included a new tutorial explaining how to delineate visual areas using the DelineationTool in MatLab and giving advice on what to do with tricky retinotopic maps.

  6. Hyperspectral dataset and associated MATLAB scripts supplementary to the paper 'Towards Robust River Plastic Detection: Combining Lab and Field-based Hyperspectral Imagery'

    • data.4tu.nl
    zip
    Updated Jul 22, 2022
    Cite
    Paolo Tasseron; Louise Schreyers; Tim van Emmerik; Joseph Peller; Lauren Biermann (2022). Hyperspectral dataset and associated MATLAB scripts supplementary to the paper 'Towards Robust River Plastic Detection: Combining Lab and Field-based Hyperspectral Imagery' [Dataset]. http://doi.org/10.4121/20343012.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 22, 2022
    Dataset provided by
    4TU.ResearchData
    Authors
    Paolo Tasseron; Louise Schreyers; Tim van Emmerik; Joseph Peller; Lauren Biermann
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This database is the supplementary material of Tasseron et al. (2022): 'Towards Robust River Plastic Detection: Combining Lab and Field-based Hyperspectral Imagery' [submitted and currently under review]; a preprint is available online at https://doi.org/10.31223/X5RW7V. The dataset contains raw images, MATLAB scripts used for training classifier algorithms, trained pipelines, required toolboxes, and labelled training datasets used in subsequent analyses.

  7. Matlab scripts

    • auckland.figshare.com
    rtf
    Updated Apr 28, 2023
    + more versions
    Cite
    Louise Wilson (2023). Matlab scripts [Dataset]. http://doi.org/10.17608/k6.auckland.22508050.v3
    Explore at:
    Available download formats: rtf
    Dataset updated
    Apr 28, 2023
    Dataset provided by
    The University of Auckland
    Authors
    Louise Wilson
    License

    Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0): https://creativecommons.org/licenses/by-nc-nd/4.0/
    License information was derived automatically

    Description

    Dataset and Matlab code to accompany the following manuscript: Wilson, L., Constantine, R., Pine, M.K., Farcas, A., Radford, C.A. 2022. Small boat sound diminishes the listening spaces of fishes and crustaceans.

    Please note that the functions find_closest4_fast.m, linterp.m, linterp2d.m, and extract_rec_value_update.m were provided by Charlotte Findlay and Adrian Farcas. Charlotte Findlay can be contacted at charlotte_findlay@hotmail.co.uk or charlotte.findlay@bio.au.dk.

    The following required functions are available from the Matlab file exchange: Jonathan Sullivan (2023). Automatic Map Scale Generation (https://www.mathworks.com/matlabcentral/fileexchange/33545-automatic-map-scale-generation), MATLAB Central File Exchange. Retrieved April 28, 2023.

    Rafael Palacios (2023). deg2utm (https://www.mathworks.com/matlabcentral/fileexchange/10915-deg2utm), MATLAB Central File Exchange. Retrieved April 28, 2023.

    Rafael Palacios (2023). utm2deg (https://www.mathworks.com/matlabcentral/fileexchange/10914-utm2deg), MATLAB Central File Exchange. Retrieved April 28, 2023.

    The function PG_DFT.m accompanies the following manuscript: Merchant, N. D., Fristrup, K. M., Johnson, M. P., Tyack, P. L., Witt, M. J., Blondel, P., & Parks, S. E. (2015). Measuring acoustic habitats. Methods in Ecology and Evolution, 6, 257–265. https://doi.org/10.1111/2041-210X.12330

  8. How to set the input parameters: an example.

    • plos.figshare.com
    xls
    Updated Jun 1, 2023
    Cite
    Alessandro Montalto; Luca Faes; Daniele Marinazzo (2023). How to set the input parameters: an example. [Dataset]. http://doi.org/10.1371/journal.pone.0109462.t001
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Alessandro Montalto; Luca Faes; Daniele Marinazzo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    How to set the input parameters: an example.

  9. Maps of water depth derived from satellite images of selected reaches of the American, Colorado, and Potomac Rivers acquired in 2020 and 2021 (ver. 2.0, September 2024)

    • catalog.data.gov
    • data.usgs.gov
    Updated Sep 12, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Maps of water depth derived from satellite images of selected reaches of the American, Colorado, and Potomac Rivers acquired in 2020 and 2021 (ver. 2.0, September 2024) [Dataset]. https://catalog.data.gov/dataset/maps-of-water-depth-derived-from-satellite-images-of-selected-reaches-of-the-american-colo
    Explore at:
    Dataset updated
    Sep 12, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    United States, Colorado
    Description

    Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques. The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage, and the neural network-based approach takes advantage of this high-density time series of information by estimating depth via one of four NNDR methods described in the manuscript:

    1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR.
    2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map.
    3. NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map.
    4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map.

    MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m, and the figure included on this landing page provides a flow chart illustrating the four different neural network-based depth retrieval methods. As examples of the resulting models, MATLAB *.mat data files containing the best-performing neural network model for each site are provided below, along with a file that lists the PlanetScope image identifiers for the images that were used for each site. To develop and test this new NNDR approach, the method was applied to satellite images from three rivers across the U.S.: the American, Colorado, and Potomac. For each site, field measurements of water depth available through other data releases were used for training and validation. The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: X_mean-spec.tif, X_mean-depth.tif, X_NN-depth.tif, and X-single-image.tif, where X denotes the site name. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.
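    The difference between the Mean-spec and Mean-depth strategies is simply where the temporal averaging happens: before or after depth retrieval. The Python/NumPy sketch below illustrates this with synthetic data and a stand-in linear "retrieval" function; because the actual NNDR is a nonlinear neural network, the two orderings would generally not coincide the way they do for this linear stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stack: 8 images over time x 500 pixels x 4 spectral bands
stack = rng.uniform(0.01, 0.2, size=(8, 500, 4))

def retrieve_depth(image):
    """Stand-in for a trained per-image depth-retrieval network:
    a fixed linear map from bands to depth (illustration only)."""
    w = np.array([2.0, -1.0, 0.5, 1.5])
    return image @ w

# Mean-spec: average the spectra over time, then retrieve depth once.
depth_mean_spec = retrieve_depth(stack.mean(axis=0))

# Mean-depth: retrieve depth from each image, then average the estimates.
depth_mean_depth = np.stack([retrieve_depth(img) for img in stack]).mean(axis=0)

# A linear retrieval commutes with averaging, so the two maps agree here;
# a nonlinear network breaks this equivalence, which is why the manuscript
# compares the strategies empirically.
agree = np.allclose(depth_mean_spec, depth_mean_depth)
```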

  10. Replication Data for: Automated analysis of cardiovascular magnetic resonance myocardial native T1 mapping images using fully convolutional neural networks

    • dataverse.harvard.edu
    Updated Jan 24, 2021
    + more versions
    Cite
    Ahmed Fahmy (2021). Replication Data for: Automated analysis of cardiovascular magnetic resonance myocardial native T1 mapping images using fully convolutional neural networks [Dataset]. http://doi.org/10.7910/DVN/N1R1Q4
    Explore at:
    Available download formats: Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Jan 24, 2021
    Dataset provided by
    Harvard Dataverse
    Authors
    Ahmed Fahmy
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    1) Image Data (Matlab format):
    > T1-weighted images (5 anatomical slices, 11 images/slice) for 210 patients.
    > Manual annotation (segmentation) of each image is included as a binary mask.
    > Timing of each image is included (i.e., inversion delay time).
    > Label indicating whether the dataset was used for training ('trn') or testing ('tst').

    2) Reference T1 mapping images (Matlab format): 1 map/slice, 5 slices/patient

  11. Data set for article: Effect of data preprocessing and machine learning hyperparameters on mass spectrometry imaging models

    • opal.latrobe.edu.au
    • researchdata.edu.au
    hdf
    Updated Mar 7, 2024
    Cite
    Wil Gardner (2024). Data set for article: Effect of data preprocessing and machine learning hyperparameters on mass spectrometry imaging models [Dataset]. http://doi.org/10.26181/22671022.v1
    Explore at:
    Available download formats: hdf
    Dataset updated
    Mar 7, 2024
    Dataset provided by
    La Trobe
    Authors
    Wil Gardner
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    This data set is uploaded as supporting information for the publication entitled: Effect of data preprocessing and machine learning hyperparameters on mass spectrometry imaging models. Files are as follows:

    polymer_microarray_data.mat - MATLAB workspace file containing peak-picked ToF-SIMS data (hyperspectral array) for the polymer microarray sample.
    nylon_data.mat - MATLAB workspace file containing m/z binned ToF-SIMS data (hyperspectral array) for the semi-synthetic nylon data set, generated from 7 nylon samples.

    Additional details about the datasets can be found in the published article. If you use this data set in your work, please cite our work as follows: Gardner et al., J. Vac. Sci. Technol. A 41, 000000 (2023); doi: 10.1116/6.0002788

  12. Spatial and temporal modulation in time-domain diffuse optical tomography: Dataset and MATLAB Code for Visualization

    • zenodo.org
    bin, txt
    Updated May 20, 2025
    Cite
    Jarjish Rahaman; Jarjish Rahaman (2025). Spatial and temporal modulation in time-domain diffuse optical tomography: Dataset and MATLAB Code for Visualization [Dataset]. http://doi.org/10.5281/zenodo.15462556
    Explore at:
    Available download formats: bin, txt
    Dataset updated
    May 20, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jarjish Rahaman; Jarjish Rahaman
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains Hadamard pattern illumination and time-domain diffuse optical tomography (TD-DOT) measurements, including both simulated time-domain data and frequency-domain converted data. MATLAB scripts are provided for visualizing the Hadamard patterns and data, and for reconstructing 3D maps of optical properties.

    The dataset supports the forthcoming publication “Spatial and temporal modulation in time-domain diffuse optical tomography".
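    For readers unfamiliar with Hadamard-pattern illumination: the patterns are rows of a Hadamard matrix mapped to on/off states, and their mutual orthogonality is what makes the multiplexed measurements invertible. The sizes actually used are given in the dataset files; the sketch below (Sylvester construction in Python/NumPy, purely illustrative) builds an 8 x 8 example and checks the defining orthogonality property:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
patterns = (H + 1) // 2  # map +1/-1 entries to binary on/off illumination

# Defining property: rows are mutually orthogonal, so H @ H.T = n * I.
orthogonal = np.array_equal(H @ H.T, 8 * np.eye(8, dtype=int))
```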

  13. Data for Measuring porous media velocity fields and grain bed architecture with a quantitative PLIF-based technique

    • search.dataone.org
    • hydroshare.org
    • +1more
    Updated Dec 30, 2023
    Cite
    Brandon Hilliard; Ralph Budwig; Richard S. Skifton; Vibhav Durgesh; William J. Reeder; Bishal Bhattarai; Benjamin T. Martin; Tao Xing; Daniele Tonina (2023). Data for Measuring porous media velocity fields and grain bed architecture with a quantitative PLIF-based technique [Dataset]. http://doi.org/10.4211/hs.a79d513a08064ecd85f781bb9dfb642d
    Explore at:
    Dataset updated
    Dec 30, 2023
    Dataset provided by
    Hydroshare
    Authors
    Brandon Hilliard; Ralph Budwig; Richard S. Skifton; Vibhav Durgesh; William J. Reeder; Bishal Bhattarai; Benjamin T. Martin; Tao Xing; Daniele Tonina
    Description

    Porous media flows are common in both natural and anthropogenic systems. Mapping these flows in a laboratory setting is challenging and often requires non-intrusive measurement techniques, such as particle image velocimetry (PIV) coupled with refractive index matching (RIM). RIM-coupled PIV allows the mapping of velocity fields around transparent solids by analyzing the movement of neutrally buoyant micron-sized seeding particles. The use of this technique in a porous medium can be problematic because seeding particles adhere to grains, which causes the grain bed to lose transparency and can obstruct pore flows. Another non-intrusive optical technique, planar laser-induced fluorescence (PLIF), can be paired with RIM and does not have this limitation because fluorescent dye is used instead of particles, but it has been chiefly used for qualitative flow visualization. Here, we propose a quantitative PLIF-based methodology to map both porous media flow fields and porous media architecture. Velocity fields are obtained by tracking the advection-dominated movement of the fluorescent dye plume front within a porous medium. We also propose an automatic tracking algorithm that quantifies 2D velocity components as the plume moves through space in both an Eulerian and a Lagrangian framework.

    We apply this algorithm to three data sets: a synthetic data set and two laboratory experiments. Performance of this algorithm is reported by the mean (bias error, B) and standard deviation (random error, SD) of the residuals between its results and the reference data. For the synthetic data, the algorithm produces maximum errors of B & SD = 32% & 23% in the Eulerian framework, respectively, and B & SD = −0.04% & 3.9% in the Lagrangian framework. The small-scale laboratory experimental data require the Eulerian framework and produce errors of B & SD = −0.5% & 33%. The Lagrangian framework is used on the large-scale laboratory experimental data and produces errors of B & SD = 5% & 44%. Mapping the porous media architecture shows negligible error for reconstructing calibration grains of known dimensions. Article DOI: 10.1088/1361-6501/acfb2b
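    The bias (B) and random error (SD) quoted above are the mean and standard deviation of the residuals between the algorithm's output and the reference data. A minimal Python/NumPy sketch of that bookkeeping, assuming (hypothetically) that the percentages are expressed relative to the mean reference value; the paper defines the exact normalization:

```python
import numpy as np

def bias_and_random_error(estimated, reference):
    """Mean (bias, B) and standard deviation (random error, SD) of the
    residuals, as percentages of the mean reference value (an assumed
    normalization; the cited article defines the exact convention)."""
    residuals = np.asarray(estimated) - np.asarray(reference)
    scale = np.mean(reference)
    return 100 * residuals.mean() / scale, 100 * residuals.std() / scale

est = np.array([1.02, 0.97, 1.05, 1.00])  # e.g. tracked velocities, m/s
ref = np.array([1.00, 1.00, 1.00, 1.00])  # reference velocities, m/s
B, SD = bias_and_random_error(est, ref)
```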

  14. Site visit cross section surveys and multispectral image data from gaging stations throughout the Willamette and Delaware River Basins from 2022 and code for Bathymetric Mapping using Gage Records and Image Databases (BaMGRID)

    • catalog.data.gov
    • data.usgs.gov
    Updated Jul 20, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Site visit cross section surveys and multispectral image data from gaging stations throughout the Willamette and Delaware River Basins from 2022 and code for Bathymetric Mapping using Gage Records and Image Databases (BaMGRID) [Dataset]. https://catalog.data.gov/dataset/site-visit-cross-section-surveys-and-multispectral-image-data-from-gaging-stations-through
    Explore at:
    Dataset updated
    Jul 20, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Delaware River
    Description

    This data release includes cross section survey data collected during site visits to USGS gaging stations located throughout the Willamette and Delaware River Basins and multispectral images of these locations acquired as close in time as possible to the date of each site visit. In addition, MATLAB source code developed for the Bathymetric Mapping using Gage Records and Image Databases (BaMGRID) framework is also provided. The site visit data were obtained from the Aquarius Time Series database, part of the USGS National Water Information System (NWIS), using the Publish Application Programming Interface (API). More specifically, a custom MATLAB function was used to query the FieldVisitDataByLocationServiceRequest endpoint of the Aquarius API by specifying the gaging station ID number and the date range of interest and then retrieve the QRev XML attachments associated with site visits meeting these criteria. These XML files were then parsed using another custom MATLAB function that served to extract the cross section survey data collected during the site visit. Note that because many of the site visits involved surveying cross sections using instrumentation that was not GPS-enabled, latitude and longitude coordinates were not available and no data values (NaN) are used in the site visit files provided in this data release. Remotely sensed data acquired as close as possible to the date of each site visit were also retrieved via APIs. Multispectral satellite images from the PlanetScope constellation were obtained using custom MATLAB functions developed to interact with the Planet Orders API, which provided tools for clipping the images to a specified area of interest focused on the gaging station and harmonizing the pixel values to be consistent across the different satellites within the PlanetScope constellation. The data product retrieved was the PlanetScope orthorectified 8-band surface reflectance bundle. 
PlanetScope images are acquired with high frequency, often multiple times per day at a given location, and so the search was restricted to a time window spanning from three days prior to three days after the site visit. All images meeting these criteria were downloaded and manually inspected; the highest quality image closest in time to the site visit date was retained for further analysis. For the gaging stations within the Willamette River Basin, digital aerial photography acquired through the National Agricultural Imagery Program (NAIP) in 2022 were obtained using a similar set of MATLAB functions developed to access the USGS EarthExplorer Machine-to-Machine (M2M) API. The NAIP quarter-quadrangle image encompassing each gaging station was downloaded and then clipped to a smaller area centered on the gaging station. Only one NAIP image at each gaging station was acquired in 2022, so differences in streamflow between the image acquisition date and the date of the site visit closest in time were accounted for by performing separate NWIS web queries to retrieve the stage and discharge recorded at the gaging station on the date the image was acquired and on the date of the site visit. These data sets were used as an example application of the framework for Bathymetric Mapping using Gage Records and Image Databases (BaMGRID) and this data release also provides MATLAB source code developed to implement this approach. 
The code is packaged in a zip archive that includes the following individual .m files: 1) getSiteVisit.m, for retrieving data collected during site visits to USGS gaging stations through the Aquarius API; 2) Qrev2depth.m, for parsing the XML file from the site visit and extracting depth measurements surveyed along a channel cross section during a direct discharge measurement; 3) orderPlanet.m, for searching for and ordering PlanetScope images via the Planet Orders API; 4) pollThenGrabPlanet.m, for querying the status of an order and then downloading PlanetScope images requested through the Planet Orders API; 5) organizePlanet.m, for file management and cleanup of the original PlanetScope image data obtained via the previous two functions; 6) ingestNaip.m, for searching for, ordering, and downloading NAIP data via the USGS Machine-to-Machine (M2M) API; 7) naipExtractClip.m, for clipping the downloaded NAIP images to the specified area of interest and performing file management and cleanup; and 8) crossValObra.m, for performing spectrally based depth retrieval via the Optimal Band Ratio Analysis (OBRA) algorithm using a k-fold cross-validation approach intended for small sample sizes. The files provided through this data release include: 1) a zipped shapefile with polygons delineating the Willamette and Delaware River basins; 2) .csv text files with information on site visits within each basin during 2022; 3) .csv text files with information on PlanetScope images of each gaging station close in time to the date of each site visit, which can be used to obtain the image data through the Planet Orders API or Planet Explorer web interface; 4) a .csv text file with information on NAIP images of each gaging station in the Willamette River Basin as close in time as possible to the date of each site visit, along with the stage and discharge recorded at the gaging station on the date of image acquisition and the date of the site visit; 5) a zip archive of the clipped NAIP images of each gaging station in the Willamette River Basin in GeoTIFF format; and 6) a zip archive with source code (MATLAB *.m files) developed to implement the Bathymetric Mapping using Gage Records and Image Databases (BaMGRID) framework.
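
The OBRA algorithm at the heart of crossValObra.m regresses field-measured depths against the natural log of band ratios and selects the band pair yielding the strongest linear relation. Below is a minimal Python sketch of that band-pair search on synthetic data (illustrative only; the release's crossValObra.m is a MATLAB implementation that additionally performs k-fold cross-validation):

```python
import numpy as np

def obra(reflectance, depth):
    """Optimal Band Ratio Analysis: find the band pair (i, j) whose
    log-ratio X = ln(R_i / R_j) is most linearly related to depth.
    Returns (i, j, r_squared, (intercept, slope))."""
    n_bands = reflectance.shape[1]
    best = (None, None, -np.inf, None)
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            x = np.log(reflectance[:, i] / reflectance[:, j])
            slope, intercept = np.polyfit(x, depth, 1)
            resid = depth - (intercept + slope * x)
            r2 = 1.0 - np.sum(resid**2) / np.sum((depth - depth.mean())**2)
            if r2 > best[2]:
                best = (i, j, r2, (intercept, slope))
    return best

# Synthetic check: embed the depth signal in the ratio of bands 0 and 1.
rng = np.random.default_rng(0)
depth = rng.uniform(0.5, 5.0, size=200)
refl = rng.uniform(0.1, 0.3, size=(200, 4))
refl[:, 0] = refl[:, 1] * np.exp(0.4 * depth)
i, j, r2, coeffs = obra(refl, depth)
print(i, j, round(r2, 3))
```

In this synthetic case the search recovers the 0/1 band pair with an R-squared near 1; real reflectance spectra are far noisier, which is why the release pairs OBRA with cross-validation for small samples.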

  15. Maps of water depth derived from satellite images of the Colorado River...

    • catalog.data.gov
    Updated Sep 12, 2024
    Cite
    U.S. Geological Survey (2024). Maps of water depth derived from satellite images of the Colorado River acquired in March and April of 2021 [Dataset]. https://catalog.data.gov/dataset/maps-of-water-depth-derived-from-satellite-images-of-the-colorado-river-acquired-in-march-
    Explore at:
    Dataset updated
    Sep 12, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Area covered
    Colorado River
    Description

    Information on water depth in river channels is important for a number of applications in water resource management but can be difficult to obtain via conventional field methods, particularly over large spatial extents and with the kind of frequency and regularity required to support monitoring programs. Remote sensing methods could provide a viable alternative means of mapping river bathymetry (i.e., water depth). The purpose of this study was to develop and test new, spectrally based techniques for estimating water depth from satellite image data. More specifically, a neural network-based temporal ensembling approach was evaluated in comparison to several other neural network depth retrieval (NNDR) algorithms. These methods are described in a manuscript titled "Neural Network-Based Temporal Ensembling of Water Depth Estimates Derived from SuperDove Images" and the purpose of this data release is to make available the depth maps produced using these techniques. The images used as input were acquired by the SuperDove cubesats comprising the PlanetScope constellation, but the original images cannot be redistributed due to licensing restrictions; the end products derived from these images are provided instead. The large number of cubesats in the PlanetScope constellation allows for frequent temporal coverage and the neural network-based approach takes advantage of this high density time series of information by estimating depth via one of four NNDR methods described in the manuscript: 1. Mean-spec: the images are averaged over time and the resulting mean image is used as input to the NNDR. 2. Mean-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is averaged to obtain the final depth map. 3. 
NN-depth: a separate NNDR is applied independently to each image in the time series and the resulting time series of depth estimates is then used as input to a second, ensembling neural network that essentially weights the depth estimates from the individual images so as to optimize the agreement between the image-derived depth estimates and field measurements of water depth used for training; the output from the ensembling neural network serves as the final depth map. 4. Optimal single image: a separate NNDR is applied independently to each image in the time series and only the image that yields the strongest agreement between the image-derived depth estimates and the field measurements of water depth used for training is used as the final depth map. MATLAB (Version 24.1, including the Deep Learning Toolbox) source code for performing this analysis is provided in the function NN_depth_ensembling.m available on the main landing page for the data release of which this is a child item, along with a flow chart illustrating the four different neural network-based depth retrieval methods. To develop and test this new NNDR approach, the method was applied to satellite images from the Colorado River near Lees Ferry, AZ, acquired in March and April of 2021. Field measurements of water depth available through another data release (Legleiter, C.J., Debenedetto, G.P., and Forbes, B.T., 2022, Field measurements of water depth from the Colorado River near Lees Ferry, AZ, March 16-18, 2021: U.S. Geological Survey data release, https://doi.org/10.5066/P9HZL7BZ) were used for training and validation. The depth maps produced via each of the four methods described above are provided as GeoTIFF files, with file name suffixes that indicate the method employed: Colorado_mean-spec.tif, Colorado_mean-depth.tif, Colorado_NN-depth.tif, and Colorado-single-image.tif. 
In addition, to assess the robustness of the Mean-spec and NN-depth methods to the introduction of a large pulse of sediment by a flood event that occurred partway through the image time series, depth maps from before and after the flood are provided in the files Colorado_Mean-spec_after_flood.tif, Colorado_Mean-spec_before_flood.tif, Colorado_NN-depth_after_flood.tif, and Colorado_NN-depth_before_flood.tif. The spatial resolution of the depth maps is 3 meters and the pixel values within each map are water depth estimates in units of meters.
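
The difference between the Mean-spec and Mean-depth strategies is simply where the time-averaging happens relative to the (nonlinear) depth retrieval. A toy Python sketch of the two orderings (the nndr stand-in below is an assumed placeholder, not the trained network from the study):

```python
import numpy as np

def mean_spec(images, nndr):
    """Mean-spec: average the image time series first, then apply the
    depth-retrieval network once to the mean image."""
    return nndr(np.mean(images, axis=0))

def mean_depth(images, nndr):
    """Mean-depth: apply the depth-retrieval network to each image
    independently, then average the resulting depth maps."""
    return np.mean([nndr(img) for img in images], axis=0)

# Toy stand-in for a trained NNDR: a fixed nonlinear map from a
# single-band "reflectance" image to depth (assumption, for illustration).
nndr = lambda img: 2.0 * np.log1p(img)

rng = np.random.default_rng(1)
images = rng.uniform(0.1, 0.5, size=(5, 10, 10))  # 5 scenes of 10x10 pixels
d1 = mean_spec(images, nndr)
d2 = mean_depth(images, nndr)
print(d1.shape, d2.shape)
```

Because the retrieval is nonlinear, the two orderings generally yield different maps; with a concave mapping like the one above, Mean-depth is everywhere less than or equal to Mean-spec by Jensen's inequality.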

  16. Matlab Inc Import Shipments, Overseas Suppliers

    • volza.com
    csv
    Updated May 30, 2025
    Cite
    Volza FZ LLC (2025). Matlab Inc Import Shipments, Overseas Suppliers [Dataset]. https://www.volza.com/us-importers/matlab-inc-1968842.aspx
    Explore at:
    csv (available download formats)
    Dataset updated
    May 30, 2025
    Dataset provided by
    Volza
    Authors
    Volza FZ LLC
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    2014 - Sep 30, 2021
    Variables measured
    Count of exporters, Count of importers, Sum of export value, Count of import shipments
    Description

    Import shipment records for Matlab Inc, including addresses, overseas suppliers, products, and shipment details.

  17. Conversion Script for Model Building from KEGG-Reactome Data to SBTOOLBOX2

    • fairdomhub.org
    application/matlab
    Updated Dec 11, 2012
    Cite
    Sebastian Curth (2012). Conversion Script for Model Building from KEGG-Reactome Data to SBTOOLBOX2 [Dataset]. https://fairdomhub.org/data_files/1034
    Explore at:
    application/matlab (4.18 KB), available download formats
    Dataset updated
    Dec 11, 2012
    Authors
    Sebastian Curth
    Description
    • automated integration of transcriptomic and reactome data into differential equations
    • structure of the paths is maintained
    • continuous fermentation model in standard format for data integration; two-component model (cell and fermenter)

    call >> Kegg2SBToolbox2('model_map.txt', 'reactions_compounds_final.csv', 'extracellular.txt', 'testmodel.txt') for an example

    where model_map.txt is the desired mapping of species, reactions_compounds_final.csv is the entire network, extracellular.txt is a manual mapping of which compounds are in extracellular space, and testmodel.txt is the output model

  18. Example of the parameters required to define the methods for an experiment...

    • plos.figshare.com
    xls
    Updated May 30, 2023
    Cite
    Alessandro Montalto; Luca Faes; Daniele Marinazzo (2023). Example of the parameters required to define the methods for an experiment on 5 variables. [Dataset]. http://doi.org/10.1371/journal.pone.0109462.t002
    Explore at:
    xls (available download formats)
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Alessandro Montalto; Luca Faes; Daniele Marinazzo
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In the second column the instantaneous effects are neglected for both targets and conditioning variables. In the third column we set instantaneous effects for some drivers and the respective targets. For example, when the target is 1, instantaneous effects are taken into account for driver 2 (first two rows, right column, parameter idDrivers) and conditioning variable 3 (first row, right column, parameter idOtherLagZero). Example of the parameters required to define the methods for an experiment on 5 variables.

  19. Data from: Delta-X: Matlab Model for Wax Lake Delta Land Accretion

    • catalog.data.gov
    • data.staging.idas-ds1.appdat.jsc.nasa.gov
    • +4more
    Updated Jul 4, 2025
    + more versions
    Cite
    ORNL_DAAC (2025). Delta-X: Matlab Model for Wax Lake Delta Land Accretion [Dataset]. https://catalog.data.gov/dataset/delta-x-matlab-model-for-wax-lake-delta-land-accretion-026b3
    Explore at:
    Dataset updated
    Jul 4, 2025
    Dataset provided by
    Oak Ridge National Laboratory Distributed Active Archive Center
    Area covered
    Wax Lake
    Description

    This dataset provides the Matlab sediment transport and land accretion model at Wax Lake Delta (WLD), Atchafalaya Basin, in coastal Louisiana. The data include the Matlab scripts that solve the advection and Exner equations to simulate the suspended sediment transport and accretion at WLD. The model requires modeled flow information from a separate ANUGA hydrodynamic model as input. For this study, ANUGA-modeled flow information from the Delta-X Spring and Fall 2021 campaigns was used. The ANUGA output files are converted to variables used by this Matlab model using pre-processing tools. The main code calculates suspended sediment fluxes and accretion rates of mud and sand as a function of space and time. The cumulative sediment accretion from each campaign was then used to estimate an annualized land accretion map using a weighted-average formula. The final product, the one-year upscaled land accretion map, is archived as a separate dataset.
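
The bed-elevation change in a model of this kind follows the Exner equation, d(eta)/dt = -1/(1 - p) * d(qs)/dx, which converts spatial gradients in sediment flux into accretion or erosion. A schematic one-dimensional explicit update in Python (grid spacing, flux profile, and porosity are illustrative values, not those of the WLD model):

```python
import numpy as np

def exner_step(eta, qs, dx, dt, porosity=0.4):
    """One explicit Exner update: d(eta)/dt = -1/(1 - p) * d(qs)/dx.

    eta : bed elevation (m), 1-D array
    qs  : volumetric sediment flux per unit width (m^2/s), same grid
    """
    dqs_dx = np.gradient(qs, dx)
    return eta - dt * dqs_dx / (1.0 - porosity)

# A flux that decays downstream implies deposition: the bed rises.
x = np.linspace(0.0, 1000.0, 101)          # 10 m spacing
eta = np.zeros_like(x)                     # initially flat bed
qs = 1e-4 * np.exp(-x / 500.0)             # decaying sediment flux
eta_new = exner_step(eta, qs, dx=x[1] - x[0], dt=3600.0)
print(float(eta_new.min()) > 0.0)
```

Because the flux decreases everywhere downstream, every node accretes over the step; in practice the time step is constrained by the stability of the coupled flow solution.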

  20. Datasets and Supporting Materials for the IPIN 2021 Competition Track 3...

    • data.niaid.nih.gov
    • recerca.uoc.edu
    • +1more
    Updated Jun 14, 2022
    + more versions
    Cite
    Joaquin Torres-Sospedra (2022). Datasets and Supporting Materials for the IPIN 2021 Competition Track 3 (Smartphone-based, off-site) [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_5948677
    Explore at:
    Dataset updated
    Jun 14, 2022
    Dataset provided by
    Fernando Aranda Polo
    Fernando Alvarez
    Antoni Pérez-Navarro
    Fernando Seco
    Antonio R. Jimenez
    Joaquin Torres-Sospedra
    Felipe Parralejo
    Vladimir Bellavista Parent
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This package contains the datasets and supplementary materials used in the IPIN 2021 Competition.

    Contents:

    IPIN2021_Track03_TechnicalAnnex_V1-02.pdf: Technical annex describing the competition

    01-Logfiles: This folder contains a subfolder with the 105 training logfiles (80 of them single-floor indoors, 10 in outdoor areas, 10 in the indoor auditorium with floor transitions, and 5 in floor-transition zones), a subfolder with the 20 validation logfiles, and a subfolder with the 3 blind evaluation logfiles as provided to competitors.

    02-Supplementary_Materials: This folder contains the matlab/octave parser, the raster maps, the files for the matlab tools and the trajectory visualization.

    03-Evaluation: This folder contains the scripts used to calculate the competition metric, the 75th percentile of positioning error over the 82 evaluation points; these scripts require the Matlab Mapping Toolbox. The ground truth is also provided as 3 csv files. Since results must be reported at a 2 Hz frequency starting from apptimestamp 0, the GT files include the closest timestamp matching the timing provided by competitors for the 3 evaluation logfiles. The folder also contains samples of reported estimations and the corresponding results.
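
The headline metric, the 75th percentile of positioning error over the evaluation points, can be sketched in a few lines of Python (horizontal error only; the official Matlab scripts also handle timestamp matching and floor detection):

```python
import numpy as np

def track3_score(estimates, ground_truth):
    """75th percentile of the point-wise positioning errors, the
    headline metric of the competition (horizontal error only here)."""
    errors = np.linalg.norm(estimates - ground_truth, axis=1)
    return float(np.percentile(errors, 75))

# Four evaluation points with errors of 1, 2, 3, and 4 metres.
truth = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])
est = truth + np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 0.0], [0.0, 4.0]])
print(track3_score(est, truth))  # 3.25 with NumPy's default interpolation
```

Using a high percentile rather than the mean rewards trajectories that are consistently accurate instead of merely accurate on average.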

    Please, cite the following works when using the datasets included in this package:

    Torres-Sospedra, J.; et al. Datasets and Supporting Materials for the IPIN 2021 Competition Track 3 (Smartphone-based, off-site). http://dx.doi.org/10.5281/zenodo.5948678

Cite
U.S. Geological Survey (2024). Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024 [Dataset]. https://catalog.data.gov/dataset/remotely-sensed-data-field-measurements-and-matlab-code-used-to-produce-image-derived-v-19

Remotely sensed data, field measurements, and MATLAB code used to produce image-derived velocity maps for a reach of the Sacramento River near Glenn, CA, September 16-19, 2024

Explore at:
Dataset updated
Nov 5, 2024
Dataset provided by
U.S. Geological Survey
Area covered
Glenn, Sacramento River
Description

This data release provides remotely sensed data, field measurements, and MATLAB code associated with an effort to produce image-derived velocity maps for a reach of the Sacramento River in California's Central Valley. Data collection occurred from September 16-19, 2024, and involved cooperators from the Intelligent Robotics Group from the National Aeronautics and Space Administration (NASA) Ames Research Center and the National Oceanographic and Atmospheric Administration (NOAA) Southwest Fisheries Science Center. The remotely sensed data were obtained from an Uncrewed Aircraft System (UAS) and are stored in Robot Operating System (ROS) .bag files. Within these files, the various data types are organized into ROS topics including: images from a thermal camera, measurements of the distance from the UAS down to the water surface made with a laser range finder, and position and orientation data recorded by a Global Navigation Satellite System (GNSS) receiver and Inertial Measurement Unit (IMU) during the UAS flights. This instrument suite is part of an experimental payload called the River Observing System (RiOS) designed for measuring streamflow and further detail is provided in the metadata file associated with this data release. For the September 2024 test flights, the RiOS payload was deployed from a DJI Matrice M600 Pro hexacopter hovering approximately 270 m above the river. At this altitude, the thermal images have a pixel size of approximately 0.38 m but are not geo-referenced. Two types of ROS .bag files are provided in separate zip folders. The first, Baguettes.zip, contains "baguettes" that include 15-second subsets of data with a reduced sampling rate for the GNSS and IMU. The second, FullBags.zip, contains the full set of ROS topics recorded by RiOS but have been subset to include only the time ranges during which the UAS was hovering in place over one of 11 cross sections along the reach. 
The start times are included in the .bag file names as Portable Operating System Interface (POSIX) time stamps. To view the data within ROS .bag files, the Foxglove Studio program linked below is freely available and provides a convenient interface. Note that to view the thermal images, the contrast will need to be adjusted to minimum and maximum values around 12,000 to 15,000, though some further refinement of these values might be necessary to enhance the display. To enable geo-referencing of the thermal images in a post-processing mode, another M600 hexacopter equipped with a standard visible camera was deployed along the river to acquire images from which an orthophoto was produced: 20240916_SacramentoRiver_Ortho_5cm.tif. This orthophoto has a spatial resolution of 0.05 m and is in the Universal Transverse Mercator (UTM) coordinate system, Zone 10. To assess the accuracy of the orthophoto, 21 circular aluminum ground control targets visible in both thermal and RGB (red, green, blue) images were placed in the field and their locations surveyed with a Real-Time Kinematic (RTK) GNSS receiver. The coordinates of these control points are provided in the file SacGCPs20240916.csv. Please see the metadata for additional information on the camera, the orthophoto production process, and the RTK GNSS survey. The thermal images were used as input to Particle Image Velocimetry (PIV) algorithms to infer surface flow velocities throughout the reach. To assess the accuracy of the resulting image-derived velocity estimates, field measurements of flow velocity were obtained using a SonTek M9 acoustic Doppler current profiler (ADCP). These data were acquired along a series of 11 cross sections oriented perpendicular to the primary downstream flow direction and spaced approximately 150 m apart. 
At each cross section, the boat from which the ADCP was deployed made four passes across the channel and the resulting data were then aggregated into mean cross sections using the Velocity Mapping Toolbox (VMT) referenced below (Parsons et al., 2013). The VMT output was further processed as described in the metadata and ultimately led to a single comma-delimited text file, SacAdcp20240918.csv, with cross section numbers, spatial coordinates (UTM Zone 10N), cross-stream distances, velocity vector components, and water depths. To assess the sensitivity of thermal image velocimetry to environmental conditions, air and water temperatures were recorded using a pair of Onset HOBO U20 pressure transducer data loggers set to record pressure and temperature. Deploying one data logger in the air and one in the water also provided information on variations in water level during the test flights. The resulting temperature and water level time series are provided in the file HoboDataSummary.csv with a one-minute sampling interval. These data sets were used to develop and test a new framework for mapping flow velocities in river channels in approximately real time using images from a UAS as they are acquired. Prototype code for implementing this approach was developed in MATLAB and is also included in the data release as a zip folder called VelocityMappingCode.zip. Further information on the individual functions (*.m files) included within this folder is available in the metadata file associated with this data release. The code is provided as is and is intended for research purposes only. Users are advised to thoroughly read the metadata file associated with this data release to understand the appropriate use and limitations of the data and code provided herein.
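
At its core, PIV estimates a displacement field by cross-correlating interrogation windows between successive frames; dividing each displacement by the frame interval and multiplying by the ground pixel size (roughly 0.38 m here) yields a surface velocity vector. A minimal Python sketch of that correlation step on a synthetic window (illustrative only; the MATLAB code in VelocityMappingCode.zip is far more elaborate):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel displacement between two interrogation windows via
    FFT-based cross-correlation, the core operation of PIV.
    Returns (dy, dx) such that win_b is win_a shifted by (dy, dx)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.real(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap circular-correlation indices to signed shifts.
    if dy > win_a.shape[0] // 2:
        dy -= win_a.shape[0]
    if dx > win_a.shape[1] // 2:
        dx -= win_a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(2)
frame = rng.normal(size=(64, 64))                     # synthetic window
shifted = np.roll(frame, shift=(3, 5), axis=(0, 1))   # known displacement
print(piv_displacement(frame, shifted))               # (3, 5)
```

Production PIV codes refine this with sub-pixel peak fitting (typically a Gaussian fit to the correlation peak), window overlap, and outlier filtering, all omitted here.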
