19 datasets found
  1. Model output and data used for analysis

    • catalog.data.gov
    Updated Nov 12, 2020
    Cite
    U.S. EPA Office of Research and Development (ORD) (2020). Model output and data used for analysis [Dataset]. https://catalog.data.gov/dataset/model-output-and-data-used-for-analysis
    Explore at:
    Dataset updated
    Nov 12, 2020
    Dataset provided by
    United States Environmental Protection Agency (http://www.epa.gov/)
    Description

    The modeled data in these archives are in the NetCDF format (https://www.unidata.ucar.edu/software/netcdf/). NetCDF (Network Common Data Form) is a set of software libraries and machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data, and a community standard for sharing scientific data. The Unidata Program Center supports and maintains netCDF programming interfaces for C, C++, Java, and Fortran; programming interfaces are also available for Python, IDL, MATLAB, R, Ruby, and Perl. Data in netCDF format is:
    • Self-describing: a netCDF file includes information about the data it contains.
    • Portable: a netCDF file can be accessed by computers with different ways of storing integers, characters, and floating-point numbers.
    • Scalable: small subsets of large datasets in various formats may be accessed efficiently through netCDF interfaces, even from remote servers.
    • Appendable: data may be appended to a properly structured netCDF file without copying the dataset or redefining its structure.
    • Sharable: one writer and multiple readers may simultaneously access the same netCDF file.
    • Archivable: access to all earlier forms of netCDF data will be supported by current and future versions of the software.

    Pub_figures.tar.zip contains the NCL scripts for figures 1-5 and the Chesapeake Bay Airshed shapefile. The directory structure of the archive is ./Pub_figures/Fig#_data, where # is the figure number (1-5).

    EMISS.data.tar.zip contains two NetCDF files with the emission totals for the 2011ec and 2040ei emission inventories. The file names contain the inventory year, and the file headers describe each variable and its units.

    EPIC.data.tar.zip contains the monthly mean EPIC data in NetCDF format for ammonium fertilizer application (files with ANH3 in the name) and soil ammonium concentration (files with NH3 in the name) for historical (Hist directory) and future (RCP-4.5 directory) simulations.

    WRF.data.tar.zip contains mean monthly and seasonal data from the 36 km downscaled WRF simulations in NetCDF format for the historical (Hist directory) and future (RCP-4.5 directory) simulations.

    CMAQ.data.tar.zip contains the mean monthly and seasonal data in NetCDF format from the 36 km CMAQ simulations for the historical (Hist directory), future (RCP-4.5 directory), and future with historical emissions (RCP-4.5-hist-emiss directory) simulations.

    This dataset is associated with the following publication: Campbell, P., J. Bash, C. Nolte, T. Spero, E. Cooter, K. Hinson, and L. Linker. Projections of Atmospheric Nitrogen Deposition to the Chesapeake Bay Watershed. Journal of Geophysical Research - Biogeosciences. American Geophysical Union, Washington, DC, USA, 12(11): 3307-3326, (2019).
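
    As a quick illustration of the format involved: a NetCDF file announces its variant in its first few bytes (classic files begin with the ASCII bytes "CDF" plus a version byte, while netCDF-4 files use the HDF5 container signature). The pure-Python sketch below (the function name sniff_netcdf is ours, not part of any archive here) distinguishes them without any NetCDF library installed:

```python
def sniff_netcdf(path):
    """Classify a file by its NetCDF magic bytes (illustrative sketch)."""
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"CDF\x01"):
        return "netCDF classic"
    if head.startswith(b"CDF\x02"):
        return "netCDF 64-bit offset"
    if head.startswith(b"\x89HDF\r\n\x1a\n"):
        return "netCDF-4 (HDF5 container)"
    return "unrecognized"
```

    For real work on these archives, the netCDF programming interfaces listed above (e.g. the Python bindings) expose the full self-describing metadata rather than just the signature.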

  2. Resolved Exiobase version 3 (REX3)

    • zenodo.org
    zip
    Updated Apr 18, 2025
    Cite
    Livia Cabernard; Stephan Pfister; Stefanie Hellweg (2025). Resolved Exiobase version 3 (REX3) [Dataset]. http://doi.org/10.5281/zenodo.10354283
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 18, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Livia Cabernard; Stephan Pfister; Stefanie Hellweg
    Time period covered
    Dec 23, 2023
    Description

    Description of the REX3 database

    This repository provides the Resolved EXIOBASE database version 3 (REX3) of the study "Biodiversity impacts of recent land-use change driven by increases in agri-food imports", published in Nature Sustainability. The REX3 database was also used in Chapter 3 of the Global Resource Outlook 2024 from the UNEP International Resource Panel (IRP), including a data visualizer that allows for downscaling.

    In REX3, Exiobase version 3.8 was merged with Eora26, production data from FAOSTAT, and bilateral trade data from the BACI database to create a highly-resolved MRIO database with comprehensive regionalized environmental impact assessment following the UNEP-SETAC guidelines and integrating land use data from the LUH2 database. REX3 distinguishes 189 countries, 163 sectors, time series from 1995 to 2022, and several environmental and socioeconomic extensions. The environmental impact assessment includes climate impacts, PM health impacts, water stress, and biodiversity impact from land occupation, land use change, and eutrophication.

    The folders "REX3_Year" provide the database for each year. Each folder contains the following files (*.mat-files):
    T_REX: the transaction matrix
    Y_REX: the final demand matrix
    Q_REX and Q_Y_REX: the satellite matrix of the economy and the final demand

    The folder "REX3_Labels" provides the labels of the matrices, countries, sectors and extensions.
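
    For readers new to MRIO analysis, the standard use of these matrices is the environmental footprint calculation f = s · (I − A)⁻¹ · y, where the technical-coefficient matrix A is derived from the transaction matrix (T_REX) and total output, and the direct intensity s comes from the satellite matrix (Q_REX). The following minimal pure-Python sketch uses toy two-sector numbers and is not the REX3 code itself:

```python
def solve_linear(M, b):
    """Solve M z = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    z = [0.0] * n
    for i in range(n - 1, -1, -1):
        z[i] = (M[i][n] - sum(M[i][j] * z[j] for j in range(i + 1, n))) / M[i][i]
    return z

def mrio_footprint(T, Y, Q):
    """Per-sector footprint of final demand Y, given transactions T and extension Q."""
    n = len(Y)
    x = [sum(T[i]) + Y[i] for i in range(n)]                     # total sector output
    A = [[T[i][j] / x[j] for j in range(n)] for i in range(n)]   # technical coefficients
    IminusA = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(n)] for i in range(n)]
    x_req = solve_linear(IminusA, Y)   # Leontief: output required to serve final demand Y
    s = [Q[j] / x[j] for j in range(n)]                          # direct intensity per unit output
    return [s[j] * x_req[j] for j in range(n)]
```

    At REX3's full 189-country × 163-sector resolution the same algebra applies, just with far larger matrices.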

    The database is also available as text files; contact livia.cabernard@tum.de.

    While Exiobase version 3.8.2 was used for the study "Biodiversity impacts of recent land-use change driven by increases in agri-food imports" and the Global Resource Outlook 2024, the REX3 database shared in this repository is based on Exiobase version 3.8, as this is the earliest Exiobase version that can still be shared under a Creative Commons Attribution 4.0 International License. However, the MATLAB code attached to this repository also allows compiling the REX3 database with other Exiobase versions (e.g., version 3.8.2), as described in the section below.

    Codes to compile REX3 and reproduce the results of the study Biodiversity impacts of recent land-use change driven by increases in agri-food imports

    The folder "matlab code to compile REX3" provides the code to compile the REX3 database. This can also be done using another Exiobase version (e.g., version 3.8.2). For this purpose, the data from EXIOBASE3 need to be saved in the subfolder Files/Exiobase/…, while the data from Eora26 need to be saved in the subfolder Files/Eora26/bp/…

    The folder "R code for regionalized BD impact assessment based on LUH2 data and maps (Figure 1)" contains the R code to weight the land use data from the LUH2 dataset with the species loss factors from UNEP-SETAC and to create the maps shown in Figure 1 of the paper. For this purpose, the data from the LUH2 dataset (transitions.nc) need to be stored in the subfolder "LUH2 data".

    The folder "matlab code to calculate MRIO results (Figure 2-5)" contains the matlab code to calculate the MRIO Results for Figure 2-5 of the study.

    The folder "R code to illustrate sankeys – Figure 3–5, S10" contains the R code to visualize the sankeys.

    Data visualizer to downscale the results of the IRP Global Resource Outlook 2024 based on REX3:

    A data visualizer based on REX3 that allows downscaling the results of the IRP Global Resource Outlook 2024 to the country level can be found here.

    Earlier versions of REX:

    An earlier version of this database (REX1) with time series from 1995–2015 is described in Cabernard & Pfister 2021.

    An earlier version including GTAP and mining-related biodiversity impacts for the year 2014 (REX2) is described in Cabernard & Pfister 2022.

    Download & conversion from .mat to .zarr files for efficient data handling:
    A package for downloading, extracting, and converting REX3 data from MATLAB (.mat) to .zarr format has been provided by Yanfei Shan here:
    https://github.com/FayeShan/REX3_handler. Once the files are converted to .zarr format, the data can be explored and processed flexibly. For example, you can use pandas to convert the data into CSV, or export it as Parquet, which is more efficient for handling large datasets. Please note that this package is still under development and that more functions for MRIO analysis will be added in the future.
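
    As a small illustration of the CSV route mentioned above: once arrays have been extracted (from .mat or .zarr) into plain Python sequences, the standard library alone suffices. The helper below and its column names are hypothetical, not part of the REX3_handler package:

```python
import csv

def columns_to_csv(columns, path):
    """Write a dict of equal-length columns to a CSV file using only the stdlib."""
    names = list(columns)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(names)                                   # header row
        writer.writerows(zip(*(columns[n] for n in names)))      # one row per record
```

    pandas (DataFrame.to_csv, DataFrame.to_parquet) offers the same conversions with richer typing when the datasets grow large.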

  3. Data from: Matlab Scripts and Sample Data Associated with Water Resources...

    • osti.gov
    • data.openei.org
    • +2 more
    Updated Jul 18, 2015
    Cite
    Becker, Matthew W (2015). Matlab Scripts and Sample Data Associated with Water Resources Research Article [Dataset]. https://www.osti.gov/dataexplorer/biblio/dataset/1638712
    Explore at:
    Dataset updated
    Jul 18, 2015
    Dataset provided by
    United States Department of Energy (http://energy.gov/)
    Office of Energy Efficiency and Renewable Energy (http://energy.gov/eere)
    Authors
    Becker, Matthew W
    Description

    Scripts and data acquired at the Mirror Lake Research Site, cited by the article submitted to Water Resources Research: Distributed Acoustic Sensing (DAS) as a Distributed Hydraulic Sensor in Fractured Bedrock M. W. Becker(1), T. I. Coleman(2), and C. C. Ciervo(1) 1 California State University, Long Beach, Geology Department, 1250 Bellflower Boulevard, Long Beach, California, 90840, USA. 2 Silixa LLC, 3102 W Broadway St, Suite A, Missoula MT 59808, USA. Corresponding author: Matthew W. Becker (matt.becker@csulb.edu).

  4. Data underpinning: BPM-Matlab - An open-source optical propagation...

    • datasetcatalog.nlm.nih.gov
    • data.dtu.dk
    Updated Apr 6, 2021
    Cite
    Andersen, Peter E.; Jensen, Stefan Mark; Marti, Dominik; Veettikazhy, Madhu; Hansen, Anders Kragh; Dholakia, Kishan; Andresen, Esben Ravn; Borre, Anja Lykke (2021). Data underpinning: BPM-Matlab - An open-source optical propagation simulation tool in MATLAB [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000799949
    Explore at:
    Dataset updated
    Apr 6, 2021
    Authors
    Andersen, Peter E.; Jensen, Stefan Mark; Marti, Dominik; Veettikazhy, Madhu; Hansen, Anders Kragh; Dholakia, Kishan; Andresen, Esben Ravn; Borre, Anja Lykke
    Description

    BPM-Matlab is an open-source optical propagation simulation tool we developed in the MATLAB environment for computationally efficient simulation of electric field propagation through a wide variety of optical fiber geometries, using the Douglas-Gunn Alternating Direction Implicit finite difference method. The validations of BPM-Matlab numerical results are provided in the article by comparing them against published data and results from state-of-the-art commercial software. The simulation tool is gratis, open-source, fast, user-friendly, and supports optional CUDA acceleration. It can be downloaded from https://gitlab.gbar.dtu.dk/biophotonics/BPM-Matlab. The software is published under the terms of the GPLv3 License. The data available here in DTU Data can be used to reproduce figures 1-5 in the Optics Express manuscript titled 'BPM-Matlab - An open-source optical propagation simulation tool in MATLAB'. These data were generated using the BPM-Matlab software, except for Data_Fig1_d.mat. We suggest using MATLAB 2018a or newer to open and read the data. The data set is published under the terms of the Creative Commons Attribution 4.0 License.

    Data_Fig1_a_b_c.mat: This file can be used to reproduce Fig. 1 (a-c) of the article, where BPM-Matlab is used to simulate beam propagation through a multimode fiber. The x and y axis values are available in the variables P.x and P.y. The E-field intensity at the proximal end in Fig. 1(a) can be calculated as abs(P.Einitial.').^2. The corresponding phase in Fig. 1(b) is available as angle(P.Einitial.'). The E-field intensity at the multimode fiber distal end in Fig. 1(c) can be calculated as abs(P.E.field.').^2.

    Data_Fig1_d.mat: The corresponding BeamLab simulation results of the same multimode fiber are available in this data file, which was generated using the BeamLab software. Use the variables bpmData.SlicesXZ.XData, bpmData.SlicesYZ.YData, and abs(bpmData.OutputField.E.x.').^2 to obtain x, y, and distal E-field intensity, respectively.

    Data_Fig_2.mat: The data from this file will generate intensity profiles of the five lowest-order fiber modes supported by a straight and a bent multimode fiber, corresponding to Figure 2 of the article. The variables P_noBend and P_bend are struct variables that hold information about the spatial dimensions as well as E-field profiles of the straight and bent modes. For the straight fiber case, the mode field profile is stored in P_noBend.modes(modeNumber).field, where x = dx*(-(Nx-1)/2:(Nx-1)/2) and y = dy*(-(Ny-1)/2:(Ny-1)/2), with Nx = size(P_noBend.modes(modeNumber).field,1), Ny = size(P_noBend.modes(modeNumber).field,2), dx = P_noBend.modes(modeNumber).Lx/Nx, and dy = P_noBend.modes(modeNumber).Ly/Ny. In a similar manner, the mode field profiles of the bent multimode fiber may be accessed from P_bend.

    Data_Fig3_a.mat: Use this data file to reproduce Figure 3(a) from the article, where numerical simulation results of different LP modes' normalized fractional power in a bent multimode fiber excited with the LP01 mode are presented. The MATLAB command semilogy(P.z.*1e3,P.modeOverlaps,'linewidth',2) will plot the mode overlap of LP01 with all 30 guided modes in logarithmic scale. The command legend(P.modes.label,'location','eastoutside','FontSize',6) can be used to label the modes. Set the y-limits of the plot using ylim([1e-4 2]) to visualize the contribution from only the six most excited modes.

    Data_Fig3_b.mat: Load this data file and follow the steps described above for the Data_Fig3_a case to plot the normalized fractional power in a bent multimode fiber excited with the LP03 mode, as in Figure 3(b).

    Data_Fig_4.mat: To reproduce Figure 4(a) from the article, use the commands imagesc(P.z,P.x,abs(P.xzSlice).^2); ylim([-1 1]*0.75e-5); to plot the intensity profile in the xz plane of a multimode fiber tapered down to be a single-mode fiber. For Figure 4(b), use plot(P.z,P.powers), which will plot the power within the simulation window against the length P.z of the fiber.

    Data_Fig5_a.mat: This data file can be used to plot the intensity profile of the E-field at a distance of z = 5 mm after the non-twisted, straight multicore fiber distal end, as given in Figure 5(a) of the article. The E-field data after propagation from the distal end is available as E_out_fft.field, and the corresponding spatial dimensions are available as E_out_fft.x and E_out_fft.y. Use imagesc(x.*1e3,y.*1e3,abs(E_out_fft.field.').^2); axis image; to plot the field intensity profile. Similar to the above case, use the .mat files below to reproduce Figure 5 (b-d).

    Data_Fig5_b.mat: Twisted straight multicore fiber.
    Data_Fig5_c.mat: Non-twisted bent multicore fiber.
    Data_Fig5_d.mat: Twisted bent multicore fiber.
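
    The MATLAB expressions used throughout these instructions (abs(field).^2 for intensity, angle(field) for phase) have direct counterparts in other languages. As an illustrative sketch, the intensity and phase of a complex field stored as nested lists can be computed in Python as follows (the function names are ours, not part of BPM-Matlab):

```python
import cmath

def intensity(field):
    """|E|^2 per sample: the Python analogue of MATLAB's abs(field).^2."""
    return [[abs(e) ** 2 for e in row] for row in field]

def phase(field):
    """Phase per sample in radians: the analogue of MATLAB's angle(field)."""
    return [[cmath.phase(e) for e in row] for row in field]
```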

  5. HERO WEC Belt Test Data

    • catalog.data.gov
    • data.openei.org
    • +2 more
    Updated Jan 27, 2025
    + more versions
    Cite
    National Renewable Energy Laboratory (2025). HERO WEC Belt Test Data [Dataset]. https://catalog.data.gov/dataset/hero-wec-belt-test-data-1d409
    Explore at:
    Dataset updated
    Jan 27, 2025
    Dataset provided by
    National Renewable Energy Laboratory
    Description

    The following submission includes raw and processed data from the 2024 Hydraulic and Electric Reverse Osmosis Wave Energy Converter (HERO WEC) belt tests conducted using NREL's Large Amplitude Motion Platform (LAMP). A description of the motion profiles run during testing can be found in the run log document. Data was collected using NREL's Modular Ocean Data AcQuisition (MODAQ) system in the form of TDMS files. Data was then processed using Python and MATLAB and converted to MATLAB workspace, parquet, and csv file formats. During data processing, a low-pass filter was applied to each array and the arrays were then resampled to common 10 Hz timestamps. A MATLAB data viewer script is provided to quickly visualize these data sets. The following arrays are contained in each test data file:
    - Time: Unix seconds timestamp
    - Test_Time: Time in seconds since beginning of test
    - POS_OS_1001: Encoder position in degrees (the encoder is located on the secondary shaft of the spring return and is driven by the winch after a 4.5:1 gear reduction)
    - LC_ST_1001: Anchor load cell data in lbf
    - PRESS_OS_2002: Air spring pressure in psi
    This data set has been developed by the National Renewable Energy Laboratory, operated by Alliance for Sustainable Energy, LLC, for the U.S. Department of Energy (DOE) under Contract No. DE-AC36-08GO28308. Funding provided by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy Water Power Technologies Office.
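
    The processing steps described above (low-pass filtering each array, then resampling to common 10 Hz timestamps) can be sketched as follows. This is a minimal illustration using a moving-average filter and linear interpolation, not NREL's actual MODAQ processing code:

```python
def moving_average(x, width=5):
    """Crude FIR low-pass: centred moving average of odd width."""
    h = width // 2
    out = []
    for i in range(len(x)):
        window = x[max(0, i - h):i + h + 1]   # edges use a shortened window
        out.append(sum(window) / len(window))
    return out

def resample_linear(t, x, t_new):
    """Linearly interpolate samples (t, x) onto new timestamps t_new."""
    out = []
    for tn in t_new:
        if tn <= t[0]:
            out.append(x[0]); continue
        if tn >= t[-1]:
            out.append(x[-1]); continue
        j = max(k for k in range(len(t) - 1) if t[k] <= tn)
        w = (tn - t[j]) / (t[j + 1] - t[j])
        out.append(x[j] + w * (x[j + 1] - x[j]))
    return out

def ten_hz_grid(t0, t1):
    """Common 10 Hz timestamp grid spanning [t0, t1]."""
    n = int((t1 - t0) * 10) + 1
    return [t0 + k * 0.1 for k in range(n)]
```

    Filtering all arrays first and then resampling them onto one shared grid is what makes the channels directly comparable sample-for-sample.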

  6. Model, data, and code for paper "Modeling of streamflow in a...

    • knb.ecoinformatics.org
    • dataone.org
    • +1 more
    Updated Apr 7, 2023
    Cite
    Yunxiang Chen; Jie Bao; Yilin Fang; William A. Perkings; Huiying Ren; Xuehang Song; Zhuoran Duan; Zhangshuan Hou; Xiaoliang He; Timothy D. Scheibe (2023). Model, data, and code for paper "Modeling of streamflow in a 30-kilometer-long reach spanning 5 years using OpenFOAM 5.x" [Dataset]. http://doi.org/10.15485/1819956
    Explore at:
    Dataset updated
    Apr 7, 2023
    Dataset provided by
    ESS-DIVE
    Authors
    Yunxiang Chen; Jie Bao; Yilin Fang; William A. Perkings; Huiying Ren; Xuehang Song; Zhuoran Duan; Zhangshuan Hou; Xiaoliang He; Timothy D. Scheibe
    Time period covered
    Jan 1, 2011 - Oct 31, 2019
    Area covered
    Description

    The data package includes data, model, and code that support the analyses and conclusions in the paper titled “Modeling of streamflow in a 30-kilometer-long reach spanning 5 years using OpenFOAM 5.x”. The primary goal of this paper is to demonstrate that key streamflow properties such as water depth, flow velocity, and dynamic pressure in a natural river at 30-kilometer scale over 5 years can be reliably and efficiently modeled using the computational framework presented in this paper. To support the paper, various data types from remote sensing, field observations, and computational models are used. Specific details are described as follows.

    Firstly, the river bathymetry data was obtained from a Light Detection and Ranging (LiDAR) survey. This data is then converted to a triangulated surface format, STL, for mesh generation in OpenFOAM. The STL data can be found in Model_Setups/BaseCase_2013To2015/constant/triSurface. The OpenFOAM mesh generated using this STL file can be found in constant/polyMesh. Other model setups, boundary and initial conditions can be found in /system and /0.org under the folder BaseCase_2013To2015. A similar data structure can also be found in BaseCase_2018To2019 for the simulations during 2018 and 2019.

    Secondly, the OpenFOAM simulations need the upstream discharge and water depth information at the upstream boundary to drive the model. These data are generated from a one-dimensional hydraulic model and can be found under the folder Model_Setups/1D model Mass1 data. The mass1_65.csv and mass1_191.csv files include the results of the 1D model at the model inlet and outlet, respectively. The MATLAB source code Mass1ToOFBC20182019.m is used to convert these data into OpenFOAM boundary condition setups. With the above setup, the OpenFOAM model can generate data for water surface elevation, flow velocity, and dynamic pressure.

    In this paper, the water surface elevation was measured at 7 locations during different periods between 2011 and 2019. The exact survey locations (see Fig1_SurveyLocations.txt) can be found in folder Fig_1. The variation of water stage over time at the 7 locations can be found in folder /Observation_WSE. The data types include .txt, .csv, .xlsx, and .mat; the .mat data can be loaded by MATLAB. We also measured the flow velocities at 12 cross-sections along the river. At each cross-section, we recorded the x, y locations, depth, and three velocity components u, v, w. These data are saved in MATLAB format under the folders /Observation_Velocity and /Fig_1. The locations of the velocity surveys relative to the river bathymetry can be found in Figure 1c. The water stage data at the 7 locations from the OpenFOAM, 1D, and 2D hydraulic models are also provided to evaluate the long-term performance of 3D models versus 1D/2D models. The water stage data for the 7 locations from OpenFOAM have been saved to .mat format and can be found in /OpenFOAM_WSE. The water stage data from the 1D model are saved in .csv format and can be found in /Mass1_WSE. The water stage data from the 2D model are saved in .mat format and can be found in /Mass2_WSE. In addition, the OpenFOAM model outputs the hydrostatic and hydrodynamic pressure. These are saved in .mat format under the folder /Fig_11/2013_1. As the files are too large, we only uploaded the data for January 2013. The areas of different ratios of dynamic pressure to static pressure for the whole simulation range, i.e., 2013-2015, are saved in .mat format and can be found in /Fig_11/PA. Further, the data of wall clock time versus solution time of the OpenFOAM modeling are also saved in .mat format under the folder /Fig_13/LogsMat.

    In summary, the data package contains seven data types: .txt, .csv, .xlsx, .dat, .stl, .m, and .mat. The first four types can be directly opened using a text editor or Microsoft Office. The .mat files need to be read with MATLAB, and the MATLAB source code .m files need to be run with MATLAB. The OpenFOAM setups can be visualized in ParaView. The .stl file can be opened in ParaView or Blender. The data in subfolders Fig_1 to Fig_10 and Fig_12 are copied from the aforementioned data folders to generate specific figures for the paper. A readME.txt file is included in each subfolder to further describe how the data in each folder are generated and used to support the paper. Please use the data package's DOI to cite the data package. Please contact yunxiang.chen@pnnl.gov if you need more data related to the paper.
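
    As an illustration of the boundary-condition conversion step described above (Mass1ToOFBC20182019.m turns 1D-model time series into OpenFOAM boundary setups), the sketch below renders (time, value) pairs in the OpenFOAM table syntax used by time-varying boundary conditions such as uniformFixedValue. The function and its keyword default are our assumptions for illustration, not the package's actual code:

```python
def to_openfoam_table(times, values, keyword="uniformValue"):
    """Render (time, value) pairs as an OpenFOAM-style interpolation
    table entry, e.g. for a uniformFixedValue boundary condition."""
    pairs = " ".join(f"({t:g} {v:g})" for t, v in zip(times, values))
    return f"{keyword}    table ( {pairs} );"
```

    OpenFOAM interpolates linearly between the tabulated times at run time, so an hourly discharge or stage series can drive a simulation with a much smaller time step.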

  7. Replication Data for: "TiFA: A new LSPIV Post-Processing Algorithm for River...

    • borealisdata.ca
    Updated Apr 15, 2025
    Cite
    Qingcheng Yu; Colin Rennie; Sean Ferguson; Mitchel Provan (2025). Replication Data for: "TiFA: A new LSPIV Post-Processing Algorithm for River Surface Velocity Measurement under Low Tracer Density Conditions" [Dataset]. http://doi.org/10.5683/SP3/WKH13A
    Explore at:
    Croissant is a format for machine-learning datasets. Learn more about this at mlcommons.org/croissant.
    Dataset updated
    Apr 15, 2025
    Dataset provided by
    Borealis
    Authors
    Qingcheng Yu; Colin Rennie; Sean Ferguson; Mitchel Provan
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Time period covered
    Feb 29, 2024
    Area covered
    Ontario, Ottawa, Canada
    Dataset funded by
    Natural Sciences and Engineering Research Council of Canada
    The City of Calgary
    National Research Council of Canada
    Description

    This is the replication data for the journal article "TiFA: A new LSPIV Post-Processing Algorithm for River Surface Velocity Measurement under Low Tracer Density Conditions". The original article developed a new LSPIV algorithm, Time Frequency Analysis (TiFA), to improve computational efficiency and enhance the accuracy of velocity measurements in traditional LSPIV. TiFA's performance was assessed by comparison with other image velocimetry algorithms, including traditional LSPIV, Ensemble Correlation (EC), Large-Scale Particle Tracking Velocimetry (LSPTV), and Seeding Density Index (SDI). The evaluations were conducted using an experimental hydraulic model (an indoor physical model) and two field cases: the Bradano River case and the Arrow River case. This dataset contains the following data: (1) the video footage and frames collected from the indoor physical model; (2) the MATLAB code of TiFA and an example of using TiFA to process velocity data from the indoor physical model. Please note that the video footage collected from the indoor physical model is original and included in this open-access dataset. Datasets from the field cases (i.e., the Bradano River case and the Arrow River case) are sourced from a third-party study (Perks et al., 2020; see the citation below) and are NOT included in this open-access dataset. Readers can access the data of the two field cases from: Perks, M. T., Dal Sasso, S. F., Hauet, A., Jamieson, E., Le Coz, J., Pearce, S., Peña-Haro, S., Pizarro, A., Strelnikova, D., Tauro, F., Bomhof, J., Grimaldi, S., Goulet, A., Hortobágyi, B., Jodeau, M., Käfer, S., Ljubičić, R., Maddock, I., Mayr, P., & Paulus, G. (2020). Towards harmonisation of image velocimetry techniques for river surface velocity observations. Earth System Science Data, 12(3), 1545–1559. https://doi.org/10.5194/essd-12-1545-2020
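
    For context, the core operation that TiFA post-processes is the LSPIV displacement estimate: image windows from consecutive frames are cross-correlated, the correlation peak gives the surface displacement, and scaling by pixel size and frame interval turns that into a velocity. A one-dimensional pure-Python sketch of that idea (not the TiFA algorithm itself; the pixel scale and frame interval in the usage are made up) follows:

```python
def best_shift(frame_a, frame_b, max_shift):
    """Pixel shift of frame_b relative to frame_a that maximises the raw
    cross-correlation: a 1-D sketch of the PIV window-matching idea."""
    n = len(frame_a)
    def corr(shift):
        return sum(frame_a[i] * frame_b[i + shift]
                   for i in range(n) if 0 <= i + shift < n)
    return max(range(-max_shift, max_shift + 1), key=corr)

def surface_velocity(shift_px, metres_per_pixel, frame_interval):
    """Convert a pixel displacement between two frames into m/s."""
    return shift_px * metres_per_pixel / frame_interval
```

    Low tracer density makes the correlation peak weak and noisy, which is exactly the regime the article's algorithm targets.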

  8. Dataset of future district heating energy demand in a Finnish municipality

    • data.niaid.nih.gov
    Updated Nov 26, 2020
    Cite
    Jean-Nicolas Louis; Petri Hietaharju; Jari Pulkkinen (2020). Dataset of future district heating energy demand in a Finnish municipality [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_4139298
    Explore at:
    Dataset updated
    Nov 26, 2020
    Dataset provided by
    University of Oulu
    Authors
    Jean-Nicolas Louis; Petri Hietaharju; Jari Pulkkinen
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Finland
    Description

    ******************* Please view the README.md file for detailed documentation of data. ********************

    Title: Impact of climate change, energy efficiency and population on long-term heat demand scenarios in districts: Datasets and Supplementary Materials Version: 1.0

    Date of Release: 28/10/2020 Identifier: doi:10.5281/zenodo.4139299 Permalink: http://dx.doi.org/10.5281/zenodo.4139299

    Associated publication: Hietaharju, P.; Louis, J.-N.; Pulkkinen, J. & Ruusunen, M. Impact of climate change, energy efficiency and population on long-term heat demand scenarios in districts Under Review, 2020

    Suggested citation: Please reference the associated publication above when using any datasets or materials described in the README file. Contact information: Jean-Nicolas Louis, University of Oulu, Oulu, Finland, jean-nicolas.louis@oulu.fi or jeannicolas.louis@gmail.com

    Dates of data modelling: 2013 - 2030 - 2050

    Geographic location: Jyväskylä

    Time resolution: Hourly, heating season.

    Types: Input data (all input configuration data are freely available, but dataset related to the district heating network and buildings are not distributed and not shareable for copyright reasons), power, temperature

    Format: All data are stored in .mat file format (MatLab file).

    This directory contains the following datasets and supplementary materials: A summary of all the files has been compiled and stored in the "READ ME.md" or "READ ME.html" file

  9. HERO WEC V1 Upgrade - 2023 Laboratory Testing (processed data)

    • mhkdr.openei.org
    • data.openei.org
    • +2 more
    archive, code +1
    Updated Jan 1, 2024
    + more versions
    Cite
    Justin Panzarella; Andrew Simms; Ben McGilton; Alec Schnabel; Scott Lambert; Miles Skinner; Scott Jenne (2024). HERO WEC V1 Upgrade - 2023 Laboratory Testing (processed data) [Dataset]. http://doi.org/10.15473/2281726
    Explore at:
    Available download formats: archive, website, code
    Dataset updated
    Jan 1, 2024
    Dataset provided by
    Marine and Hydrokinetic Data Repository
    USDOE Office of Energy Efficiency and Renewable Energy (EERE), Renewable Power Office. Water Power Technologies Office (EE-4WP)
    National Renewable Energy Laboratory
    Authors
    Justin Panzarella; Andrew Simms; Ben McGilton; Alec Schnabel; Scott Lambert; Miles Skinner; Scott Jenne
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The following submission includes processed laboratory data from NREL's Hydraulic and Electric Reverse Osmosis Wave Energy Converter (HERO WEC), in the form of MATLAB workspaces. This dataset was created using NREL's Large Amplitude Motion Platform (LAMP) and collected between August and September 2023. Included with this submission is a test log of all the processed data, "HERO WEC LAMP test run log.xlsx", so that the user can easily find the data of interest. Additionally, more detailed descriptions of the type of data and how it was processed or calculated can be found in the document titled "Lamp Data Description.docx". The MATLAB workspaces can be visualized by loading the workspace of interest and running the file "LAMP_Data_Viewer_Ver2.m/mlx". Both the .m and .mlx file formats have been provided, depending on the user's preference.

    The MATLAB workspaces have been separated into zip files corresponding to either Drivetrain, Hydraulic, or Electric configuration runs, representing the respective test cases that were run. The drivetrain runs were used to characterize the drivetrain only (no pump or generator). The Hydraulic runs represent the configuration when the seawater pump is installed, and the Electric runs represent the configuration when the generator is installed. The following sub-categories of data are included for each type:

    - DW - Deep water sine wave profile (not run in drivetrain configuration)
    - Heave - Heave only sine wave profile
    - Heave_NoRO (hydraulic configuration only)
    - Heave_ACC (hydraulic configuration only)
    - IR - Surge and heave irregular wave profile (not run in drivetrain configuration)
    - RW - Heave only profile created from real world encoder data (not run in drivetrain configuration)
    

    For those interested in the raw, unprocessed, data the authors have created a separate submission, linked below. This submission includes the raw TDMS files and associated files necessary to translate the data into either python or MATLAB formats.

    This data set has been developed by the National Renewable Energy Laboratory, operated by Alliance for Sustainable Energy, LLC, for the U.S. Department of Energy (DOE) under Contract No. DE-AC36-08GO28308. Funding provided by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy Water Power Technologies Office.

  10. FDI - Fractal Dimension Index

    • data.mendeley.com
    Updated May 16, 2025
    Cite
    Juan Ruiz de Miras (2025). FDI - Fractal Dimension Index [Dataset]. http://doi.org/10.17632/k3t9h984s5.1
    Explore at:
    Dataset updated
    May 16, 2025
    Authors
    Juan Ruiz de Miras
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    FDI is a MATLAB tool for computing the Fractal Dimension Index of reconstructed sources (dipoles) obtained from EEG data. The fractal dimension (FD) is a valuable tool for analysing the complexity of neural structures and functions in the human brain. The fractal dimension index (FDI) was developed to assess the spatiotemporal complexity of brain activations derived from electroencephalogram (EEG) signals. It integrates two distinct complexity metrics: 1) integration FD, the FD of the spatiotemporal coordinates of all significantly active EEG sources (4DFD); and 2) differentiation FD, the complexity of the temporal evolution of the spatial distribution of cortical activations (3DFD), estimated via the Higuchi FD [HFD(3DFD)]. The final FDI value is the product of these two measurements: 4DFD × HFD(3DFD). The open-source FDI software uses CUDA to exploit GPU parallelism, enabling efficient processing of large-scale EEG data while remaining compatible with pre-processed data from widely used tools such as Brainstorm and EEGLAB. The FDI toolbox allows neuroscientists to readily apply FDI to investigate the complexity of cortical activity within their own studies.
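The FDI value is the product 4DFD × HFD(3DFD). To illustrate the Higuchi half of that product, here is a minimal pure-Python sketch of the Higuchi FD estimator; the function name `higuchi_fd` and the `kmax` default are our own choices, and this is a simplified stand-in for the toolbox's CUDA implementation, not the actual code.

```python
import math
import random

def higuchi_fd(x, kmax=8):
    """Estimate the Higuchi fractal dimension of a 1-D sequence x.

    Builds down-sampled curve lengths L(k) for k = 1..kmax and returns
    the slope of log L(k) versus log (1/k): ~1.0 for a smooth curve,
    approaching 2.0 for uncorrelated noise.
    """
    n = len(x)
    log_inv_k, log_len = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                      # k offset sub-series
            pts = x[m::k]
            if len(pts) < 2:
                continue
            raw = sum(abs(pts[i + 1] - pts[i]) for i in range(len(pts) - 1))
            # Higuchi normalisation: rescale to the full record, then /k
            lengths.append(raw * (n - 1) / ((len(pts) - 1) * k) / k)
        log_inv_k.append(math.log(1.0 / k))
        log_len.append(math.log(sum(lengths) / len(lengths)))
    # Least-squares slope of log L(k) against log(1/k)
    mk = sum(log_inv_k) / len(log_inv_k)
    ml = sum(log_len) / len(log_len)
    num = sum((a - mk) * (b - ml) for a, b in zip(log_inv_k, log_len))
    den = sum((a - mk) ** 2 for a in log_inv_k)
    return num / den

random.seed(0)
line = [float(i) for i in range(1000)]           # smooth ramp -> FD near 1
noise = [random.random() for _ in range(1000)]   # white noise -> FD near 2
```

For a straight line the normalised length is exactly (n-1)/k for every offset, so the fitted slope is 1 by construction; that makes the ramp a handy sanity check.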

  11. SoyFACE Fumigation Data Files

    • databank.illinois.edu
    Updated Oct 7, 2024
    + more versions
    Cite
    Elise Kole Aspray; Elizabeth Ainsworth; Jesse McGrath; Justin McGrath; Christopher Montes; Andrew Whetten; Donald Ort; Stephen Long; Kannan Puthuval; Timothy Mies; Carl Bernacchi; Evan DeLucia; Bradley Dalsing; Andrew Leakey; Shuai Li; Jelena Herriott; Franco Miglietta (2024). SoyFACE Fumigation Data Files [Dataset]. http://doi.org/10.13012/B2IDB-3496460_V4
    Explore at:
    Dataset updated
    Oct 7, 2024
    Authors
    Elise Kole Aspray; Elizabeth Ainsworth; Jesse McGrath; Justin McGrath; Christopher Montes; Andrew Whetten; Donald Ort; Stephen Long; Kannan Puthuval; Timothy Mies; Carl Bernacchi; Evan DeLucia; Bradley Dalsing; Andrew Leakey; Shuai Li; Jelena Herriott; Franco Miglietta
    License

    CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Dataset funded by
    U.S. Department of Energy (DOE)
    U.S. Department of Agriculture (USDA)
    U.S. National Science Foundation (NSF)
    Description

    This data set is related to the SoyFACE experiments, open-air agricultural climate change experiments that have been conducted since 2001. The fumigation experiments take place at the SoyFACE farm and facility in Champaign County, Illinois during the growing season of each year, typically between June and October. V4 contains new experimental data files, hourly fumigation files, and weather/ambient files for 2022 and 2023, since the original dataset only included files for 2001-2021. The MATLAB code has also been updated for efficiency, and explanatory files have been updated accordingly. Changes in V4:
    - The "SoyFACE Plot Information 2001 to 2021" file is renamed to "SoyFACE ring information 2001 to 2023.xlsx", with data for 2022 and 2023 added. The file contains information about each year of the SoyFACE experiments, including the fumigation treatment type (CO2, O3, or a combination treatment), the crop species, the plots (also referred to as 'rings' and labeled with numbers between 2 and 31) used in each experiment, important experiment dates, and the target concentration levels or 'setpoints' for CO2 and O3 in each experiment.
    - The "SoyFACE 1-Minute Fumigation Data Files" were updated to contain sub-folders for each year of the experiments (2001-2023), each of which contains sub-folders for each ring used in that year's experiments. The data set also includes hourly data files for the fumigation experiments ("SoyFACE Hourly Fumigation Data Files" folder) created from the 1-minute files, and hourly ambient/weather data files for each year of the experiments ("Hourly Weather and Ambient Data Files" folder, updated to include 2022 and 2023 data). The ambient CO2 and O3 data are collected at SoyFACE, and the weather data come from the SURFRAD and WARM weather stations located near the SoyFACE farm.
    - "Rings.xlsx" is new in this version. It lists the rings and treatments used in each year of the SoyFACE experiments between 2001 and 2023 and is used by several of the MATLAB scripts.
    - "CMI Weather Data Explanation.docx" is newly added. It documents the processing of the raw weather data used in the hourly weather and ambient data files.
    - Files that were in RAR format in V3 are now saved in ZIP format: Hourly Weather and Ambient Data Files.zip, SoyFACE 1-Minute Fumigation Data Files.zip, SoyFACE Hourly Fumigation Data Files.zip, and Matlab Files.zip.
    - The "Fumigation Target Percentages" file was updated with data for 2022 and 2023. It shows how much of the time the CO2 and O3 fumigation levels are within a 10 or 20 percent margin of the target levels while the fumigation system is turned on.
    - The "Matlab Files" folder contains custom code (Aspray, E.K.) used to clean the "SoyFACE 1-Minute Fumigation Data" files and to generate the "SoyFACE Hourly Fumigation Data" and "Fumigation Target Percentages" files. Code information can be found in the various "Explanation" files. The MATLAB code changes are as follows:
      1. "Data_Issues_Finder.m" now reads ring and treatment information from "Rings.xlsx" rather than having it hardcoded in the MATLAB code itself.
      2. "Data_Issues_Finder_all.m" is new. It is the same as "Data_Issues_Finder.m" except that it identifies all CO2 and O3 repeats, whereas "Data_Issues_Finder.m" only identifies repeats that occur while the fumigation system is turned on.
      3. "Target_Yearly.m" now reads ring and treatment information from "Rings.xlsx" rather than having it hardcoded in the MATLAB code itself.
      4. "HourlyFumCode.m" is new. It reads ring and treatment information from "Rings.xlsx" instead of requiring the user to define these values explicitly, builds a list of all ring folders for the selected year, and runs the hourly code for each ring automatically instead of the user running it per ring. It also presents two dialog boxes: one asking whether the hourly code should run on 1-minute fumigation files or 1-minute ambient files, and one asking whether hourly fumigation averages should be replaced with hourly ambient averages when the fumigation system is turned off.
      5. "HourlyDataFun.m" now runs either "HourlyData.m" or "HourlyDataAmb.m", depending on the user's choice in the first dialog box.
      6. "HourlyData.m" now replaces hourly fumigation averages with hourly ambient averages when the fumigation system is turned off, depending on the user's choice in the second dialog box.
      7. "HourlyDataAmb.m" is new. It is similar to "HourlyData.m" but calculates hourly averages for 1-minute ambient files instead of 1-minute fumigation files.
      8. "batch.m" was changed to account for the new function input variables of "HourlyDataFun.m" and to add header columns to the "FumOutput.xlsx" and "AmbOutput.xlsx" output files generated by "HourlyData.m" and "HourlyDataAmb.m".
    - Finally, the "* Explanation" files describe the column names, units of measurement, steps needed to use the MATLAB code, and other pertinent information for each data file. Some have been updated to reflect the current changes to the data.
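The hourly files above are produced by collapsing the 1-minute records into hourly averages. As a rough illustration of that step only, here is a stdlib-Python sketch with a made-up (timestamp, value) record layout; it is not the actual "HourlyData.m" logic, which also handles the fumigation on/off substitution described above.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def hourly_means(minute_records):
    """Collapse (timestamp, value) 1-minute records into hourly means.

    Mirrors the averaging step in spirit; the record layout here is
    hypothetical, not the actual SoyFACE file schema.
    """
    buckets = defaultdict(list)
    for ts, value in minute_records:
        if value is None:                     # skip missing 1-minute samples
            continue
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(value)
    return {hour: sum(v) / len(v) for hour, v in sorted(buckets.items())}

# Synthetic example: 120 one-minute CO2 readings (ppm) spanning two hours
start = datetime(2023, 7, 1, 9, 0)
records = [(start + timedelta(minutes=i), 400.0 + (i // 60)) for i in range(120)]
means = hourly_means(records)
```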

  12. MATLAB script and COMSOL models of the article "An efficient multiscale method for subwavelength transient analysis of acoustic metamaterials"

    • data.4tu.nl
    zip
    Updated Apr 24, 2024
    Cite
    Renan Liupekevicius Carnielli; Johannes van Dommelen; Marc Geers; Varvara Kouznetsova (2024). MATLAB script and COMSOL models of the article "An efficient multiscale method for subwavelength transient analysis of acoustic metamaterials" [Dataset]. http://doi.org/10.4121/0c31cd57-7ea1-4587-84e7-b9b75ff5fa2e.v3
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 24, 2024
    Dataset provided by
    4TU.ResearchData
    Authors
    Renan Liupekevicius Carnielli; Johannes van Dommelen; Marc Geers; Varvara Kouznetsova
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Dataset funded by
    Netherlands Organisation for Scientific Research (NWO)
    Description

    A reduced-order homogenisation framework is proposed in the article "An efficient multiscale method for subwavelength transient analysis of acoustic metamaterials", providing a macro-scale enriched continuum model for locally resonant acoustic metamaterials operating in the subwavelength regime, for both time- and frequency-domain analyses. The homogenised continuum has a non-standard constitutive model, capturing metamaterial behaviours such as negative effective bulk modulus, negative effective density, and Willis coupling. A suitable reduced space is constructed based on the unit cell response in the steady-state and local-resonance regimes.


    - The effective continuum material properties are computed via the MATLAB script provided here.
    - A frequency-domain numerical example demonstrates the efficiency and suitability of the proposed framework. The macro-scale model is implemented in a COMSOL model provided here.
    - The direct numerical simulations (COMSOL models) are also provided here.

  13. C-MAPSS Aircraft Engine Simulator Data

    • data.nasa.gov
    Updated Sep 22, 2010
    + more versions
    Cite
    nasa.gov (2010). C-MAPSS Aircraft Engine Simulator Data - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/c-mapss-aircraft-engine-simulator-data
    Explore at:
    Dataset updated
    Sep 22, 2010
    Dataset provided by
    NASAhttp://nasa.gov/
    Description

    SPECIAL NOTE: C-MAPSS and C-MAPSS40K ARE CURRENTLY UNAVAILABLE FOR DOWNLOAD. Glenn Research Center management is reviewing the availability requirements for these software packages. We are working with Center management to get the review completed and issues resolved in a timely manner. We will post updates on this website when the issues are resolved. We apologize for any inconvenience. Please contact Jonathan Litt, jonathan.s.litt@nasa.gov, if you have any questions in the meantime. Subject Area: Engine Health Description: This data set was generated with the C-MAPSS simulator. C-MAPSS stands for 'Commercial Modular Aero-Propulsion System Simulation' and it is a tool for the simulation of realistic large commercial turbofan engine data. Each flight is a combination of a series of flight conditions with a reasonable linear transition period to allow the engine to change from one flight condition to the next. The flight conditions are arranged to cover a typical ascent from sea level to 35K ft and descent back down to sea level. The fault was injected at a given time in one of the flights and persists throughout the remaining flights, effectively increasing the age of the engine. The intent is to identify which flight and when in the flight the fault occurred. How Data Was Acquired: The data provided is from a high fidelity system level engine simulation designed to simulate nominal and fault engine degradation over a series of flights. The simulated data was created with a Matlab Simulink tool called C-MAPSS. Sample Rates and Parameter Description: The flights are full flight recordings sampled at 1 Hz and consist of 30 engine and flight condition parameters. Each flight contains 7 unique flight conditions for an approximately 90 min flight including ascent to cruise at 35K ft and descent back to sea level. The parameters for each flight are the flight conditions, health indicators, measurement temperatures and pressure measurements. 
Faults/Anomalies: Faults arose from the inlet engine fan, the low pressure compressor, the high pressure compressor, the high pressure turbine and the low pressure turbine.
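Since the task is to identify which flight, and when in the flight, the fault occurred, a simple baseline change-point check conveys the idea. The sketch below is a toy single-channel detector on synthetic data, not NASA's method; a real C-MAPSS analysis would fuse all 30 one-hertz channels, and the function name, window sizes, and threshold here are our own assumptions.

```python
import random

def first_shift(signal, baseline_n=200, window=50, z_thresh=4.0):
    """Return the first index where the rolling mean of `signal` departs
    from the baseline mean by more than z_thresh baseline standard
    deviations; None if no shift is found.

    A toy change-point detector for illustration only.
    """
    base = signal[:baseline_n]
    mu = sum(base) / len(base)
    sd = (sum((v - mu) ** 2 for v in base) / len(base)) ** 0.5
    for i in range(baseline_n, len(signal) - window):
        w = signal[i:i + window]
        if abs(sum(w) / window - mu) > z_thresh * sd:
            return i
    return None

# Synthetic 1 Hz temperature-like channel: step fault injected at t = 600 s
random.seed(1)
trace = [500.0 + random.gauss(0, 1) for _ in range(600)]
trace += [505.0 + random.gauss(0, 1) for _ in range(400)]
onset = first_shift(trace)
```

The detector flags the fault within a window-length or so of the true onset; sharper localisation would need a proper change-point statistic (e.g. CUSUM) rather than a hard threshold.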

  14. Data from: Determining aligner-induced tooth movements in three dimensions using clinical data of two patients: datasets

    • data-staging.niaid.nih.gov
    Updated Nov 25, 2024
    Cite
    Tanner, Christine; Filippon, Ignacio; von Jackowski, Jeannette A.; Schulz, Georg; Töpper, Tino; Müller, Bert (2024). Determining aligner-induced tooth movements in three dimensions using clinical data of two patients: datasets [Dataset]. https://data-staging.niaid.nih.gov/resources?id=zenodo_11280342
    Explore at:
    Dataset updated
    Nov 25, 2024
    Dataset provided by
    Biomaterials Science Center, Department of Biomedical Engineering, University of Basel
    Authors
    Tanner, Christine; Filippon, Ignacio; von Jackowski, Jeannette A.; Schulz, Georg; Töpper, Tino; Müller, Bert
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Abstract

    The effectiveness of a series of optically transparent aligners for orthodontic treatments depends on the anchoring of each tooth. In contrast with the roots, the crowns’ positions and orientations are measurable with intraoral scans, thus avoiding any X-ray dose. Exemplified by two patients, we demonstrate that three-dimensional crown movements could be determined with micrometer precision by registering weekly intraoral scans. The data show the movement and orientation changes in the individual crowns of the upper and lower jaws as a result of the forces generated by the series of aligners. During the first weeks, the canines and incisors were more affected than the premolars and molars. We detected overall tooth movement of up to about 1 mm during a nine-week active treatment. The data on these orthodontic treatments indicate the extent to which actual tooth movement lags behind the treatment plan, as represented by the aligner shapes. The proposed procedure can not only be used to quantify the clinical outcome of the therapy, but also to improve future planning of orthodontic treatments for each specific patient. This study should be treated with caution because only two cases were investigated, and the approach should be applied to a reasonably large cohort to reach strong conclusions regarding the efficiency and efficacy of this therapeutic approach.

    Data

    The repository contains the intraoral scan data and all data on which manual interactions were performed, to allow the results to be reproduced.

    The directory and file names are as follows:

    Name | Level, type | Description
    p_Clinical_Trial | 1st, directory | Patient p: p=3485 stands for patient A, p=6457 stands for patient B
    Bottmedical | 2nd, directory | Planning data
    Sirona | 2nd, directory | Intraoral scan data
    Tn | 3rd, directory | Time step n, for n=0: before treatment; n=1-9: after 1-9 weeks of treatment; n=10: end of treatment
    Lower_Aligner_n_cut1.stl | 4th, file | Manually cut surface mesh of planned data, lower jaw, for time step n
    Upper_Aligner_n_cut1.stl | 4th, file | Manually cut surface mesh of planned data, upper jaw, for time step n
    p_OnyxCeph3_Export_j_cut1.stl | 4th, file | Manually cut surface mesh from intraoral scan, for lower (j=UK) or upper (j=OK) jaw
    p_OnyxCeph3_Export_j.stl | 4th, file | Original surface mesh from intraoral scan, for lower (j=UK) or upper (j=OK) jaw
    teethSeg | 4th, directory | Segmented crowns via OnyxCeph3 TM
    teethSeg_noSnap_TolS | 4th, directory | Transferred crown segmentations and occlusion plane points
    p_Zi.stl | 5th, file | Surface mesh of crown segmentation for tooth number i of patient p
    p_j.stl | 5th, file | Surface mesh of all crown segmentations for lower (j=UK) or upper (j=OK) jaw of patient p
    p_j_occPlane.mat | 5th, file | Binary Matlab file of saved variable occPlane, which defines transferred occlusion plane points for lower (j=UK) or upper (j=OK) jaw of patient p

    File formats:

    stl files describe an unstructured triangulated surface by vertices and triangles. These files can be read by the open-source software FreeCAD or the MATLAB function stlread.

    mat files are MATLAB files and can be read via MATLAB function load.
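For readers without FreeCAD or MATLAB at hand, the ASCII flavour of STL is simple enough to parse directly. The sketch below is a minimal illustration of the format, with a hypothetical sample mesh; the repository's files may be binary STL, which this parser does not handle.

```python
def read_ascii_stl(text):
    """Parse an ASCII STL string into a list of triangles, each a list
    of three (x, y, z) vertex tuples.

    Handles only the ASCII flavour; binary STL needs a struct-based
    reader instead.
    """
    triangles, current = [], []
    for line in text.splitlines():
        parts = line.split()
        if parts[:1] == ["vertex"]:
            current.append(tuple(float(v) for v in parts[1:4]))
        elif parts[:1] == ["endfacet"]:
            triangles.append(current)
            current = []
    return triangles

# Hypothetical one-facet mesh, for illustration only
SAMPLE = """solid crown
facet normal 0 0 1
  outer loop
    vertex 0 0 0
    vertex 1 0 0
    vertex 0 1 0
  endloop
endfacet
endsolid crown
"""
tris = read_ascii_stl(SAMPLE)
```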

  15. Bankfull geometry dataset of major exorheic rivers on the Qinghai-Tibet Plateau (1984-2020)

    • tpdc.ac.cn
    • data.tpdc.ac.cn
    zip
    Updated Sep 1, 2022
    + more versions
    Cite
    Dan LI; Yuan XUE; Chao QIN; Baosheng WU; Bowei CHEN; Ge WANG (2022). Bankfull geometry dataset of major exorheic rivers on the Qinghai-Tibet Plateau (1984-2020) [Dataset]. http://doi.org/10.1038/s41597-022-01614-w
    Explore at:
    Available download formats: zip
    Dataset updated
    Sep 1, 2022
    Dataset provided by
    TPDC
    Authors
    Dan LI; Yuan XUE; Chao QIN; Baosheng WU; Bowei CHEN; Ge WANG
    Area covered
    Description

    Based on Sentinel-2 and Landsat 5/7/8 multispectral imagery combined with in-situ measured hydrological data, bankfull river geometry is presented for six major exorheic river basins of the Qinghai-Tibet Plateau (the upper Yellow River, upper Jinsha River, Yalong River, Lantsang River, Nu River and Yalung Zangbo River). River surfaces of the six mainstreams and major tributaries are included. For each river basin, two types of rivers are included: connected and disconnected rivers. The dataset format is .shp, exported from ArcGIS 10.5. Three products are included: one original product (bankfull river surface dataset) and two derived products (bankfull river width dataset and bankfull river surface area dataset with a 1 km river length interval), organised in three folders. The first folder, "1-Bankfull River Surface", contains river surface vectors for the six river basins in .shp files. The second folder, "2-Bankfull River Width", contains bankfull river widths and corresponding coordinates at a 1 km-step river length for the six mainstreams and some connected tributaries in .xlsx format; the river width vectors are also provided there as .shp files. The third folder, "3-Bankfull River Surface Area", contains bankfull river surface areas and corresponding coordinates at a 1 km-step river length for the six mainstreams and some connected tributaries in .xlsx format. Three Supplementary Files are included: Supplementary File 1, tables and figures related to the dataset; Supplementary File 2, used for river surface extraction based on the GEE platform; Supplementary File 3, used for river width extraction based on Matlab. The provided planform river hydromorphology data can supplement global hydrography datasets and effectively represent the combined fluvial geomorphology and geological background in the study area.

  16. Efficiency and Accuracy of Micro-Macro Models for Mineral Dissolution - Simulation Data

    • narcis.nl
    • data.4tu.nl
    Updated Aug 20, 2020
    Cite
    Stephan Gärttner; Peter Frolkovič; Peter Knabner; Nadja Ray (2020). Efficiency and Accuracy of Micro-Macro Models for Mineral Dissolution - Simulation Data [Dataset]. http://doi.org/10.4121/12776063
    Explore at:
    Dataset updated
    Aug 20, 2020
    Dataset provided by
    4TU.ResearchData
    Authors
    Stephan Gärttner; Peter Frolkovič; Peter Knabner; Nadja Ray
    Description

    This folder contains the output data of the major computations performed in the paper 'Efficiency and Accuracy of Micro-Macro Models for Mineral Dissolution', DOI: 10.1029/2020WR027585. Opening the files requires a recent release of Matlab and the software package 'HyPHM'. Each Matlab code file references the corresponding illustration in the paper and automatically loads the data files needed to compute it.

  17. Data from: Why does the metabolic cost of walking increase on compliant substrates?

    • search.dataone.org
    • data.niaid.nih.gov
    • +1more
    Updated May 10, 2025
    Cite
    Barbara Grant (2025). Why does the metabolic cost of walking increase on compliant substrates? [Dataset]. http://doi.org/10.5061/dryad.6hdr7sr31
    Explore at:
    Dataset updated
    May 10, 2025
    Dataset provided by
    Dryad Digital Repository
    Authors
    Barbara Grant
    Time period covered
    Jan 1, 2022
    Description

    Walking on compliant substrates requires more energy than walking on hard substrates, but the biomechanical factors that contribute to this increase are debated. Previous studies suggest various causative mechanical factors, including disruption to pendular energy recovery, increased muscle work, decreased muscle efficiency and increased gait variability. We test each of these hypotheses simultaneously by collecting a large kinematic and kinetic data set of human walking on foams of differing thickness. This allowed us to systematically characterise changes in gait with substrate compliance, and, by combining data with mechanical substrate testing, drive the very first subject-specific computer simulations of human locomotion on compliant substrates to estimate the internal kinetic demands on the musculoskeletal system. Negative changes to pendular energy exchange or ankle mechanics are not supported by our analyses. Instead, we find that the mechanistic causes of increased energetic co...

  18. SnowClim Model and Dataset

    • hydroshare.org
    • beta.hydroshare.org
    • +1more
    zip
    Updated Jul 4, 2022
    + more versions
    Cite
    A. C. Lute; John Abatzoglou; Timothy Link (2022). SnowClim Model and Dataset [Dataset]. http://doi.org/10.4211/hs.acc4f39ad6924a78811750043d59e5d0
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 4, 2022
    Dataset provided by
    HydroShare
    Authors
    A. C. Lute; John Abatzoglou; Timothy Link
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Oct 1, 1850 - Sep 30, 2100
    Area covered
    Description

    The SnowClim Model and Dataset address the need for climate and snow data products that are based on physical principles, that are simulated at high spatial resolution, and that cover large geographic domains.

    The SnowClim Model is a physics-based snow model that incorporates the key energy balance processes necessary for capturing snowpack spatiotemporal variability, including under future climate scenarios, while optimizing computational efficiency through several empirical simplifications. The model code can be downloaded or run in the cloud using MATLAB Online through HydroShare.

    The SnowClim Dataset consists of climate forcing data for, and snow outputs from, the SnowClim Model. Climate forcing data were downscaled from 4 km Weather Research and Forecasting (WRF) model output (Rasmussen and Liu, 2017) to ~210 m across the contiguous western United States. Climate forcings were downscaled from WRF directly for a present-day period (2000-2013) and for a thirteen-year pseudo global warming scenario reflecting conditions between 2071-2100 under RCP 8.5. Climate forcings for a third time period were prepared by perturbing the present-day downscaled climate data by the CMIP5 multi-model mean change to reflect pre-industrial conditions (1850-1879).
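The pre-industrial forcings are produced with a delta-change style perturbation: present-day downscaled values shifted by the CMIP5 multi-model mean change. A minimal sketch of that arithmetic follows, with hypothetical numbers for a single grid cell; the real workflow operates on gridded, multi-variable forcings, and the function name here is our own.

```python
def perturb_to_preindustrial(present_temps, cmip5_deltas):
    """Apply a delta-change shift: subtract the CMIP5 multi-model mean
    warming signal, month by month, from present-day downscaled
    temperatures.

    Schematic of the perturbation step only, not the SnowClim code.
    """
    if len(present_temps) != len(cmip5_deltas):
        raise ValueError("need one delta per month")
    return [t - d for t, d in zip(present_temps, cmip5_deltas)]

# Hypothetical monthly mean temperatures (degC) for one cell, Jan-Dec,
# and hypothetical CMIP5 multi-model mean warming since pre-industrial
present = [-5.0, -3.5, 0.2, 4.8, 9.9, 14.6, 18.2, 17.5, 13.0, 7.1, 0.8, -3.9]
deltas = [1.1, 1.0, 0.9, 0.8, 0.8, 0.9, 1.0, 1.0, 0.9, 0.8, 0.9, 1.0]
preindustrial = perturb_to_preindustrial(present, deltas)
```

The same subtraction run with future deltas in the other direction is the essence of the pseudo-global-warming approach the dataset uses for 2071-2100.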

    Additional details regarding the SnowClim model physics, model calibration, climate data downscaling, model application to the western US, and model performance are available in: Lute, A. C., Abatzoglou, J., and Link, T.: SnowClim v1.0: high-resolution snow model and data for the western United States, Geosci. Model Dev., 15, 5045–5071, https://doi.org/10.5194/gmd-15-5045-2022, 2022.

  19. Subregions specific dynamics of striatal dopamine

    • dataone.org
    • search.dataone.org
    Updated Jul 26, 2025
    + more versions
    Cite
    Ali Mohebi; Wei Wei; Joshua Berke (2025). Subregions specific dynamics of striatal dopamine [Dataset]. http://doi.org/10.5061/dryad.00000008m
    Explore at:
    Dataset updated
    Jul 26, 2025
    Dataset provided by
    Dryad Digital Repository
    Authors
    Ali Mohebi; Wei Wei; Joshua Berke
    Time period covered
    Jan 1, 2023
    Description

    Transient increases in dopamine within the striatum can encode reward prediction errors, critical signals for updating predictions of future rewards. However, it is unclear how this mechanism can provide suitable feedback for predictions across a wide range of time horizons: from seconds or less (if singing a song) to potentially hours or more (if hunting for food). Here we report that dopamine transients in distinct striatal subregions convey prediction errors over distinct time scales. Dopamine dynamics systematically accelerated from ventral to dorsal-medial to dorsal-lateral striatum, in the tempo of their spontaneous fluctuations, their temporal integration of prior rewards, and their discounting of future rewards. This spectrum of time scales for evaluative computations can help achieve efficient learning and adaptive motivation for a wide range of behaviors.

    Subregion Specific Dynamics of Striatal Dopamine

    Description of the data and file structure

    The dataset included in this repository follows the standard format used by the Berke lab. It contains the necessary files and data required to replicate the experiments and findings described in the research article. Additionally, we have provided MATLAB code that can be utilized to read and interpret the data, as well as generate figures related to our results.

    The dataset consists of three *.zip files. Each zip file contains MATLAB/python scripts and data files required to reproduce the graphs in corresponding figures. Following is the description for each data file.

    Fig2.zip:

    This zip file contains data and MATLAB scripts to replicate Fig. 2. The data are saved in .mat format, which loads in MATLAB as a 1x41 struct array. Each struct holds the recordings from one behavioral session of the bandit task and has these elements:

    • Fs: Sampling Frequency of the photometry signa...
U.S. EPA Office of Research and Development (ORD) (2020). Model output and data used for analysis [Dataset]. https://catalog.data.gov/dataset/model-output-and-data-used-for-analysis

EPIC.data.tar.zip contains the monthly mean EPIC data in NetCDF format for ammonium fertilizer application (files with ANH3 in the name) and soil ammonium concentration (files with NH3 in the name) for the historical (Hist directory) and future (RCP-4.5 directory) simulations. WRF.data.tar.zip contains mean monthly and seasonal data from the 36 km downscaled WRF simulations in NetCDF format for the historical (Hist directory) and future (RCP-4.5 directory) simulations. CMAQ.data.tar.zip contains the mean monthly and seasonal data in NetCDF format from the 36 km CMAQ simulations for the historical (Hist directory), future (RCP-4.5 directory) and future-with-historical-emissions (RCP-4.5-hist-emiss directory) simulations. This dataset is associated with the following publication: Campbell, P., J. Bash, C. Nolte, T. Spero, E. Cooter, K. Hinson, and L. Linker. Projections of Atmospheric Nitrogen Deposition to the Chesapeake Bay Watershed. Journal of Geophysical Research - Biogeosciences. American Geophysical Union, Washington, DC, USA, 12(11): 3307-3326, (2019).
