18 datasets found
  1. CDF2MAT Automated SCRIPT to import NETCDF files to MATLAB | RESAMPLING added...

    • dataverse.harvard.edu
    • search.dataone.org
    Updated May 30, 2022
    Cite
    Leandro Wang Hantao; Carlos Alberto Teixeira; Victor Hugo Cavalcanti Ferreira (2022). CDF2MAT Automated SCRIPT to import NETCDF files to MATLAB | RESAMPLING added to correct RESHAPE for non-integer MS acquisition rates in GCxGC-MS data [Dataset]. http://doi.org/10.7910/DVN/WMTEMF
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    May 30, 2022
    Dataset provided by
    Harvard Dataverse
    Authors
    Leandro Wang Hantao; Carlos Alberto Teixeira; Victor Hugo Cavalcanti Ferreira
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Function name: "cdf2mat". Please use this function to open MS-based chromatographic data from NETCDF (*.CDF) files. Resampling is included for non-integer acquisition rates. Outputs nominal mass. The script is optimized to process data from comprehensive two-dimensional gas chromatography coupled to mass spectrometry (GCxGC-MS). Updated to remove negative noise signal.

    INPUT
    file: the netCDF file to open, e.g. 'Sample01.CDF'
    rate_MS: desired integer acquisition rate

    OUTPUT
    FullMS: full MS chromatogram (second-order data tensor)
    axis_min: retention time axis in minutes
    axis_mz: m/z axis in Daltons

    I/O: [TIC,FullMS,axis_min,axis_mz] = cdf2mat(file,rate_MS)

    Compiled with MATLAB R2021b (v.9.11.0.1809720). Requires the Signal Processing Toolbox (v.9.0). Based on netCDFload.m (Murphy, Wenig, Parcsi, Skov and Stuetz) and iCDF_load (Skov and Bro, 2008).

    K.R. Murphy, P. Wenig, G. Parcsi, T. Skov, R.M. Stuetz (in press) Characterizing odorous emissions using new software for identifying peaks in chemometric models of GC-MS datasets. Chem Intel Lab Sys. doi: 10.1016/j.chemolab.2012.07.006
    Skov T and Bro R. (2008) Solving fundamental problems in chromatographic analysis, Analytical and Bioanalytical Chemistry, 390(1): 281-285. doi: 10.1007/s00216-007-1618-z
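    The resample-then-reshape step that this script automates can be illustrated outside MATLAB. The following Python sketch (not the authors' cdf2mat code; all names and values are illustrative) interpolates a chromatogram acquired at a non-integer rate onto an integer rate before folding it into a 2D GCxGC matrix:

    ```python
    import numpy as np

    def resample_and_fold(signal, t, rate_hz, mod_period_s):
        """Interpolate `signal` (sampled at times `t`, in seconds) onto an
        integer acquisition rate, then fold it into a 2D matrix with one
        column per modulation period."""
        # Resample onto a uniform integer-rate time axis
        t_new = np.arange(t[0], t[-1], 1.0 / rate_hz)
        resampled = np.interp(t_new, t, signal)
        # With an integer rate, each modulation now spans a whole number
        # of points, so the reshape has stable column boundaries.
        pts_per_col = int(round(mod_period_s * rate_hz))
        n_cols = len(resampled) // pts_per_col
        trimmed = resampled[: n_cols * pts_per_col]
        return trimmed.reshape(n_cols, pts_per_col).T

    # Synthetic example: 60 s of signal at a non-integer 12.7 Hz rate,
    # folded at a 4 s modulation period after resampling to 10 Hz.
    t = np.arange(0, 60, 1 / 12.7)
    sig = np.sin(2 * np.pi * t / 4.0)
    folded = resample_and_fold(sig, t, rate_hz=10, mod_period_s=4.0)
    print(folded.shape)  # (40, 15): points per modulation x modulations
    ```
    
    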

  2. Vegetation and climate in Syria, link to matlab and netCDF files - Vdataset...

    • service.tib.eu
    Updated Nov 30, 2024
    + more versions
    Cite
    (2024). Vegetation and climate in Syria, link to matlab and netCDF files - Vdataset - LDM [Dataset]. https://service.tib.eu/ldmservice/dataset/png-doi-10-1594-pangaea-944946
    Explore at:
    Dataset updated
    Nov 30, 2024
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Syria
    Description

    Here we present data on vegetation and climate conditions in Syria, including some nc files. The nc files describe the spatial status of Syria, including land cover in 2010, trends in temperature and precipitation, EVI mean and trend, EVI residual analysis and water use efficiency. Detailed information can be found in the paper by Chen et al.

  3. EGO gliders data processing chain

    • seanoe.org
    bin
    Updated Nov 2025
    Cite
    EGO gliders data management team (2025). EGO gliders data processing chain [Dataset]. http://doi.org/10.17882/45402
    Explore at:
    Available download formats: bin
    Dataset updated
    Nov 2025
    Dataset provided by
    SEANOE
    Authors
    EGO gliders data management team
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The EGO data processing chain decodes, processes, formats and performs quality control on glider data and metadata. For a glider deployment, the decoder performs the following actions: decodes and formats the glider deployment data and metadata into an EGO NetCDF time series file; applies real-time quality control (RTQC) tests to the EGO NetCDF time series file; for Slocum gliders, estimates subsurface currents and stores them in the EGO file; and generates NetCDF profile files from the EGO file data and applies specific RTQC tests to them. The decoder handles Slocum, Seaglider and SeaExplorer glider observations. It is Matlab code (see groom_gliders_coriolis_matlab_decoder_*.pdf in the decglider_doc\decoder_user_manual folder). A compiled version that does not require a Matlab licence is also available (see readme.txt in the decglider_soft\soft_compiled folder).

  4. OLR data

    • figshare.com
    bin
    Updated Jul 31, 2019
    Cite
    Nathaniel Johnson (2019). OLR data [Dataset]. http://doi.org/10.6084/m9.figshare.9199502.v1
    Explore at:
    Available download formats: bin
    Dataset updated
    Jul 31, 2019
    Dataset provided by
    figshare (http://figshare.com/)
    Authors
    Nathaniel Johnson
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Outgoing longwave radiation data analyzed in Johnson et al. (2019), provided as Matlab .mat files and processed from NetCDF data (Matlab scripts for processing the NetCDF data are provided in a separate file). The NetCDF source data are freely available on the web: http://olr.umd.edu/

  5. Connecticut River Plume Mixing: ROMS and MATLAB files

    • data.mendeley.com
    Updated Aug 25, 2023
    Cite
    Michael Whitney (2023). Connecticut River Plume Mixing: ROMS and MATLAB files [Dataset]. http://doi.org/10.17632/674yyd3drw.1
    Explore at:
    Dataset updated
    Aug 25, 2023
    Authors
    Michael Whitney
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset is associated with a manuscript on Connecticut River plume mixing with first author Michael M. Whitney. The dataset includes source code, compilation files, and input for the Regional Ocean Modeling System (ROMS) runs used in this study. ROMS output files in NetCDF format are generated by executing the compiled ROMS code with the input files. The dataset also includes MATLAB routines and datafiles for the analysis of model results and generation of figures in the manuscript. The following zip files are included:

    • ROMS_v783_Yan_code.zip [ROMS source code branch used in this study]
    • ctplume_ROMS_compilation.zip [files to compile ROMS source code and run-specific Fortran-90 built code]
    • ctplume_ROMS_input.zip [ROMS ASCII and NetCDF input files for runs]
    • ctplume_MATLAB_analysis.zip [custom analysis routines in MATLAB used in this study]
    • ctplume_MATLAB_figures.zip [custom MATLAB routine for manuscript figure generation and MATLAB data files with all data fields included in figures]
    • ctplume_figures_tif.zip [TIF image files of each figure in manuscript]

  6. ESA CCI SM GAPFILLED Long-term Climate Data Record of Surface Soil Moisture...

    • researchdata.tuwien.ac.at
    • researchdata.tuwien.at
    zip
    Updated Sep 5, 2025
    + more versions
    Cite
    Wolfgang Preimesberger; Pietro Stradiotti; Wouter Arnoud Dorigo (2025). ESA CCI SM GAPFILLED Long-term Climate Data Record of Surface Soil Moisture from merged multi-satellite observations [Dataset]. http://doi.org/10.48436/3fcxr-cde10
    Explore at:
    Available download formats: zip
    Dataset updated
    Sep 5, 2025
    Dataset provided by
    TU Wien
    Authors
    Wolfgang Preimesberger; Pietro Stradiotti; Wouter Arnoud Dorigo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description
    This dataset was produced with funding from the European Space Agency (ESA) Climate Change Initiative (CCI) Plus Soil Moisture Project (CCN 3 to ESRIN Contract No: 4000126684/19/I-NB "ESA CCI+ Phase 1 New R&D on CCI ECVS Soil Moisture"). Project website: https://climate.esa.int/en/projects/soil-moisture/

    This dataset contains information on the Surface Soil Moisture (SM) content derived from satellite observations in the microwave domain.

    Dataset Paper (Open Access)

    A description of this dataset, including the methodology and validation results, is available at:

    Preimesberger, W., Stradiotti, P., and Dorigo, W.: ESA CCI Soil Moisture GAPFILLED: an independent global gap-free satellite climate data record with uncertainty estimates, Earth Syst. Sci. Data, 17, 4305–4329, https://doi.org/10.5194/essd-17-4305-2025, 2025.

    Abstract

    ESA CCI Soil Moisture is a multi-satellite climate data record that consists of harmonized, daily observations coming from 19 satellites (as of v09.1) operating in the microwave domain. The wealth of satellite information, particularly over the last decade, facilitates the creation of a data record with the highest possible data consistency and coverage.
    However, data gaps are still found in the record. This is particularly notable in earlier periods when a limited number of satellites were in operation, but can also arise from various retrieval issues, such as frozen soils, dense vegetation, and radio frequency interference (RFI). These data gaps present a challenge for many users, as they have the potential to obscure relevant events within a study area or are incompatible with (machine learning) software that often relies on gap-free inputs.
    Since the requirement of a gap-free ESA CCI SM product was identified, various studies have demonstrated the suitability of different statistical methods to achieve this goal. A fundamental feature of such gap-filling method is to rely only on the original observational record, without need for ancillary variable or model-based information. Due to the intrinsic challenge, there was until present no global, long-term univariate gap-filled product available. In this version of the record, data gaps due to missing satellite overpasses and invalid measurements are filled using the Discrete Cosine Transform (DCT) Penalized Least Squares (PLS) algorithm (Garcia, 2010). A linear interpolation is applied over periods of (potentially) frozen soils with little to no variability in (frozen) soil moisture content. Uncertainty estimates are based on models calibrated in experiments to fill satellite-like gaps introduced to GLDAS Noah reanalysis soil moisture (Rodell et al., 2004), and consider the gap size and local vegetation conditions as parameters that affect the gapfilling performance.
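    The linear interpolation applied over frozen-soil periods is conceptually simple. A minimal Python sketch (synthetic values, not the production DCT-PLS code) of filling gaps in a daily series by interpolating between the nearest valid neighbours:

    ```python
    import numpy as np

    def fill_gaps_linear(series):
        """Replace NaNs in a 1D series by linear interpolation between
        the nearest valid neighbours."""
        series = np.asarray(series, dtype=float)
        idx = np.arange(series.size)
        valid = ~np.isnan(series)
        return np.interp(idx, idx[valid], series[valid])

    # Two missing days between observations of 0.28 and 0.22 m3/m3
    sm = np.array([0.30, 0.28, np.nan, np.nan, 0.22, 0.25])
    print(fill_gaps_linear(sm))  # gaps filled as 0.26 and 0.24
    ```
    
    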

    Summary

    • Gap-filled global estimates of volumetric surface soil moisture from 1991-2023 at 0.25° sampling
    • Fields of application (partial): climate variability and change, land-atmosphere interactions, global biogeochemical cycles and ecology, hydrological and land surface modelling, drought applications, and meteorology
    • Method: Modified version of DCT-PLS (Garcia, 2010) interpolation/smoothing algorithm, linear interpolation over periods of frozen soils. Uncertainty estimates are provided for all data points.
    • More information: See Preimesberger et al. (2025) and the ESA CCI SM Algorithm Theoretical Baseline Document [Chapter 7.2.9] (Dorigo et al., 2023): https://doi.org/10.5281/zenodo.8320869

    Programmatic Download

    You can use command line tools such as wget or curl to download (and extract) data for multiple years. The following command will download and extract the complete data set to the local directory ~/Downloads on Linux or macOS systems.

    #!/bin/bash

    # Set download directory
    DOWNLOAD_DIR=~/Downloads

    base_url="https://researchdata.tuwien.at/records/3fcxr-cde10/files"

    # Loop through years 1991 to 2023 and download & extract data
    for year in {1991..2023}; do
        echo "Downloading $year.zip..."
        wget -q -P "$DOWNLOAD_DIR" "$base_url/$year.zip"
        unzip -o "$DOWNLOAD_DIR/$year.zip" -d "$DOWNLOAD_DIR"
        rm "$DOWNLOAD_DIR/$year.zip"
    done

    Data details

    The dataset provides global daily estimates for the 1991-2023 period at 0.25° (~25 km) horizontal grid resolution. Daily images are grouped by year (YYYY), each subdirectory containing one netCDF image file for a specific day (DD), month (MM) in a 2-dimensional (longitude, latitude) grid system (CRS: WGS84). The file name has the following convention:

    ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-YYYYMMDD000000-fv09.1r1.nc
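    A small Python helper (hypothetical, not part of the release) shows how file names following this convention can be generated for a given day:

    ```python
    from datetime import date

    def gapfilled_filename(d: date) -> str:
        # Builds the daily file name following the stated convention:
        # ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-YYYYMMDD000000-fv09.1r1.nc
        return ("ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-"
                f"{d:%Y%m%d}000000-fv09.1r1.nc")

    # Files are grouped in YYYY subdirectories, so a full relative path is:
    d = date(1991, 8, 5)
    print(f"{d.year}/{gapfilled_filename(d)}")
    # 1991/ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-19910805000000-fv09.1r1.nc
    ```
    
    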

    Data Variables

    Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude and time stamp), as well as the following data variables:

    • sm: (float) The Soil Moisture variable reflects estimates of daily average volumetric soil moisture content (m3/m3) in the soil surface layer (~0-5 cm) over a whole grid cell (0.25 degree).
    • sm_uncertainty: (float) The Soil Moisture Uncertainty variable reflects the uncertainty (random error) of the original satellite observations and of the predictions used to fill observation data gaps.
    • sm_anomaly: Soil moisture anomalies (reference period 1991-2020) derived from the gap-filled values (`sm`)
    • sm_smoothed: Contains DCT-PLS predictions used to fill data gaps in the original soil moisture field. These values are also provided for cases where an observation was initially available (compare `gapmask`); in that case they provide a smoothed version of the original data.
    • gapmask: (0 | 1) Indicates grid cells where a satellite observation is available (1), and where the interpolated (smoothed) values are used instead (0) in the 'sm' field.
    • frozenmask: (0 | 1) Indicates grid cells where ERA5 soil temperature is <0 °C. In this case, a linear interpolation over time is applied.

    Additional information for each variable is given in the netCDF attributes.
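    As a sketch of how these variables might be combined in practice (synthetic numpy arrays stand in for the netCDF variables; only the names sm and gapmask come from the listing above):

    ```python
    import numpy as np

    sm = np.array([[0.21, 0.35], [0.10, 0.27]])   # gap-filled field (m3/m3)
    gapmask = np.array([[1, 0], [0, 1]])          # 1 = real observation, 0 = filled

    # Keep only cells backed by a satellite observation
    observed_only = np.where(gapmask == 1, sm, np.nan)
    # Fraction of the grid covered by real observations
    fraction_observed = gapmask.mean()
    print(fraction_observed)  # 0.5
    ```
    
    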

    Version Changelog

    Changes in v09.1r1 (previous version was v09.1):

    • This version uses a novel uncertainty estimation scheme as described in Preimesberger et al. (2025).

    Software to open netCDF files

    These data can be read by any software that supports Climate and Forecast (CF) conformant metadata standards for netCDF files.

    References

    • Preimesberger, W., Stradiotti, P., and Dorigo, W.: ESA CCI Soil Moisture GAPFILLED: an independent global gap-free satellite climate data record with uncertainty estimates, Earth Syst. Sci. Data, 17, 4305–4329, https://doi.org/10.5194/essd-17-4305-2025, 2025.
    • Dorigo, W., Preimesberger, W., Stradiotti, P., Kidd, R., van der Schalie, R., van der Vliet, M., Rodriguez-Fernandez, N., Madelon, R., & Baghdadi, N. (2023). ESA Climate Change Initiative Plus - Soil Moisture Algorithm Theoretical Baseline Document (ATBD) Supporting Product Version 08.1 (version 1.1). Zenodo. https://doi.org/10.5281/zenodo.8320869
    • Garcia, D., 2010. Robust smoothing of gridded data in one and higher dimensions with missing values. Computational Statistics & Data Analysis, 54(4), pp.1167-1178. Available at: https://doi.org/10.1016/j.csda.2009.09.020
    • Rodell, M., Houser, P. R., Jambor, U., Gottschalck, J., Mitchell, K., Meng, C.-J., Arsenault, K., Cosgrove, B., Radakovich, J., Bosilovich, M., Entin, J. K., Walker, J. P., Lohmann, D., and Toll, D.: The Global Land Data Assimilation System, Bulletin of the American Meteorological Society, 85, 381 – 394, https://doi.org/10.1175/BAMS-85-3-381, 2004.

    Related Records

    The following records are all part of the ESA CCI Soil Moisture science data records community:

    • ESA CCI SM MODELFREE Surface Soil Moisture Record: https://doi.org/10.48436/svr1r-27j77

  7. CSIRO Marine Research Ocean Neutral Density Surfaces Software | gimi9.com

    • gimi9.com
    Updated Jul 2, 2025
    + more versions
    Cite
    (2025). CSIRO Marine Research Ocean Neutral Density Surfaces Software | gimi9.com [Dataset]. https://gimi9.com/dataset/au_csiro-marine-research-ocean-neutral-density-surfaces-software1/
    Explore at:
    Dataset updated
    Jul 2, 2025
    Description

    The neutral density code comes as a package of MATLAB and/or FORTRAN routines which enable the user to fit neutral density surfaces to arbitrary hydrographic data. The FORTRAN implementation consists of a subroutine which labels a cast of hydrographic data with neutral density, and another subroutine which then finds the positions of specified neutral density surfaces within the water column. The MATLAB implementation consists of two MATLAB functions performing these same operations, only on sections of hydrographic data. Versions are available for Unix workstations running the NETCDF data archiving library and for PCs not running NETCDF; the latter code is also suitable for compilation on Unix workstations or other machines without the NETCDF library. The MATLAB version for the PC does not require compilation of the underlying FORTRAN code, unlike the UNIX version. All code comes with documentation in the form of Readme files, as well as Makefiles and examples to provide check values for the user. This "in-house" CSIRO software is available under conditions which are attached with the software.

  8. Data from: The impact of Indonesian Throughflow constrictions on eastern...

    • data.niaid.nih.gov
    Updated May 19, 2022
    Cite
    Michael Eabry; Ryan Holmes; Alex Sen Gupta (2022). The impact of Indonesian Throughflow constrictions on eastern Pacific upwelling and water-mass transformation [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_6443021
    Explore at:
    Dataset updated
    May 19, 2022
    Dataset provided by
    University of Sydney
    University of New South Wales
    Authors
    Michael Eabry; Ryan Holmes; Alex Sen Gupta
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Pacific Ocean
    Description

    Netcdf data and Matlab processing scripts for the article:

    Eabry, Holmes and Sen Gupta (2022): The impact of Indonesian Throughflow constrictions on eastern Pacific upwelling and water-mass transformation. Journal of Geophysical Research: Oceans. https://doi.org/10.1029/2022JC018509

    Included are netcdf files with output from the ACCESS-OM2 1-degree ocean model averaged over years 500-600 of the spin-up simulation. CONTROL indicates the control simulation (realistic ITF topography), OPENITF indicates the Open ITF experiment and DIFF indicates difference files between the two. Please refer to the meta-data within the netcdf files for more information. Scripts to help with plotting standard variables are part of the COSIMA cookbook repository at https://github.com/COSIMA/cosima-recipes.

    An example script Control_WMT_budget.m is provided to plot the control WMT budget and can be easily modified to plot the Open ITF or anomalous WMT budget. This script uses the Pacific masks found in mask.mat. The small tendency term is provided separately as dV_dt_nrho.mat.

  9. U-CTD data of Cruise M135 in netcdf format

    • search.dataone.org
    • doi.pangaea.de
    Updated Nov 5, 2025
    Cite
    Martin Visbeck; Gerd Krahmann (2025). U-CTD data of Cruise M135 in netcdf format [Dataset]. http://doi.org/10.1594/PANGAEA.917462
    Explore at:
    Dataset updated
    Nov 5, 2025
    Dataset provided by
    PANGAEA Data Publisher for Earth and Environmental Science
    Authors
    Martin Visbeck; Gerd Krahmann
    Time period covered
    Mar 29, 2017 - Apr 6, 2017
    Description

    Version = PO-GLOBAL-SVN = 656
    Matlab = 9.4.0.813654 (R2018a)
    Release = 2
    P-correction = 0 dbar
    T-correction = 0 deg C
    S-correction = 0 PSU
    Comment = corrections were added to raw data
    Comment = p is pressure in dbar
    Comment = t is in situ temperature in deg C (ITS-90)
    […]

  10. A global daily seamless 9-km Vegetation Optical Depth (VOD) product from...

    • data.niaid.nih.gov
    • data-staging.niaid.nih.gov
    Updated Mar 14, 2025
    + more versions
    Cite
    Hu, Die; Wang, Yuan; Jing, Han; Yue, Linwei; Zhang, Qiang; Yuan, Qiangqiang; Fan, Lei; Shen, Huanfeng; Zhang, Liangpei (2025). A global daily seamless 9-km Vegetation Optical Depth (VOD) product from 2010 to 2021 [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_13334756
    Explore at:
    Dataset updated
    Mar 14, 2025
    Dataset provided by
    Wuhan University
    Authors
    Hu, Die; Wang, Yuan; Jing, Han; Yue, Linwei; Zhang, Qiang; Yuan, Qiangqiang; Fan, Lei; Shen, Huanfeng; Zhang, Liangpei
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    (I) DESCRIPTION:

    · A global daily seamless 9-km Vegetation Optical Depth (VOD) product is generated through a gap-filling and spatiotemporal fusion model. The daily product spans Jan 01, 2010 to Jul 31, 2021 (about 20 GB after uncompressing all zip files).

    · To further validate the effectiveness of these products, three verification approaches are employed: 1) time series validation; 2) simulated missing-region validation; and 3) data comparison validation.

    · It is important to note that the original data contain missing dates, and these corresponding gaps are also present in our dataset.

    (II) DATA FORMATTING AND FILE NAMES

    For the convenience of our readers, we have two formats of data available for download.

    1) MAT file (Version v1)

    Data from 2010 to 2021 are stored separately in folders for the corresponding years, with each folder containing daily .mat files. The naming convention for the data is "YYYYXXZZ", where YYYY is the 4-digit year, XX is the 2-digit month, and ZZ is the 2-digit day. The geographic scope is global and the grid size is 4000×2000.

    MATFILES (.mat): The folders with matfiles contain individual files for:

    1. Vegetation Optical Depth: VOD_seamless_9km_YYYYXXZZ.mat

    2. Latitude/Longitude: VOD_9km_Coordinates.mat

    2) NetCDF file (Version v2)

    The year-by-year daily data from 2010 to 2021 are stored in '.nc' files for the corresponding years; the daily data within each year are combined into one NetCDF file. The variables are named VOD_xxxxyydd, where xxxx represents the year, yy the month, and dd the day. The longitude variable is named "lon" with dimension 4000×1, and the latitude variable is named "lat" with dimension 2000×1.

    It should be noted that these NetCDF files are saved using the netCDF4 library in Python, with the dimension order being (lat, lon). When reading these NetCDF files in MATLAB, the default data dimension order is (lon, lat). Therefore, it is necessary to transpose the variables to match the correct dimension order.
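    The transposition issue can be demonstrated with plain numpy (a scaled-down synthetic array stands in for a real VOD field, which is 2000 lat × 4000 lon):

    ```python
    import numpy as np

    # Array as written from Python with dimension order (lat, lon)
    vod_python_order = np.arange(32).reshape(4, 8)
    # MATLAB's netCDF reader presents the same data as (lon, lat),
    # which corresponds to the transpose:
    vod_matlab_order = vod_python_order.T

    # Transposing on the MATLAB side recovers the (lat, lon) layout
    assert (vod_matlab_order.T == vod_python_order).all()
    print(vod_matlab_order.shape)  # (8, 4)
    ```
    
    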

  11. Data from: Bathymetry beneath the Amery ice shelf, East Antarctica, revealed...

    • data.niaid.nih.gov
    • nde-dev.biothings.io
    • +1more
    Updated Sep 21, 2021
    Cite
    Junjun (2021). Bathymetry beneath the Amery ice shelf, East Antarctica, revealed by airborne gravity [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_5518252
    Explore at:
    Dataset updated
    Sep 21, 2021
    Authors
    Junjun Yang
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Antarctica, East Antarctica
    Description

    We estimated the seafloor topography beneath the Amery Ice Shelf, East Antarctica, from airborne gravity anomalies through a nonlinear inversion method called simulated annealing. The results provide a view of the seafloor beneath the Amery Ice Shelf, where direct bathymetric observations are rare. The model, 'gravity_estimated_seafloor_topography_beneath_the_Amery_Ice_Shelf.nc', is in NetCDF format, which can be read with the MATLAB commands "ncdisp" and "ncread". Contents of the model can be found in "contents.txt". The MATLAB program "nc2mat.m" reads the NetCDF ".nc" model and saves its variables to a MATLAB ".mat" file.

  12. Inverse Model Results for Filchner-Ronne Catchment

    • zenodo.org
    bin, nc
    Updated Nov 30, 2023
    Cite
    Michael Wolovick; Angelika Humbert; Thomas Kleiner; Martin Rückamp (2023). Inverse Model Results for Filchner-Ronne Catchment [Dataset]. http://doi.org/10.5281/zenodo.7798650
    Explore at:
    Available download formats: bin, nc
    Dataset updated
    Nov 30, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Michael Wolovick; Angelika Humbert; Thomas Kleiner; Martin Rückamp
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This page contains the results of the inversions for basal drag and drag coefficient in the Filchner-Ronne catchment presented in Wolovick et al., (2023), along with the code used to perform the inversions and L-curves, analyze the results, and produce the figures presented in that paper.

    This all looks very complicated. There's so many files here. The description is so long. I just want to know the basal drag!

    If you don't want to get into the weeds of inverse modeling and L-curve analysis, or if you are uninterested in wading through our collection of model structures and scripts, then you should use the file BestCombinedDragEstimate.nc. That file contains our best weighted mean estimate of the ice sheet basal drag in our domain, along with the weighted standard deviation of the scatter of the different models about the mean. As discussed in the paper, this combined estimate is constructed from the weighted mean of 24 individual inversions, representing 8 separate L-curve experiments on our highest-resolution mesh, with three regularization values per L-curve (best estimate regularization, along with minimum and maximum acceptable regularization levels). Each inversion is weighted according to the inverse of its total variance ratio, which is a quality metric incorporating both observational misfit and inverted structure. For ease of use, these results have been interpolated from the unstructured model mesh onto a 250 m regular grid. If you only want to know the basal drag in the Filchner-Ronne region, that is the only file you should use.
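    The inverse-variance weighting described above can be sketched in a few lines of Python (the numbers are synthetic, not values from the release):

    ```python
    import numpy as np

    # Per-inversion basal drag estimates (Pa) and their total variance
    # ratios (the quality metric; lower means a better inversion)
    drag = np.array([1.2e4, 1.5e4, 1.1e4])
    variance_ratio = np.array([2.0, 4.0, 1.0])

    # Weight each inversion by the inverse of its total variance ratio
    w = 1.0 / variance_ratio
    w /= w.sum()

    # Weighted mean and the weighted standard deviation of the scatter
    # of the individual models about that mean
    mean = np.sum(w * drag)
    std = np.sqrt(np.sum(w * (drag - mean) ** 2))
    print(mean, std)
    ```
    
    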

    For users who want to go further, we will now explain the remaining files in this release. First we give a brief summary of all of the scripts included here and their functions, and then we will give an explanation of the matfiles that contain the actual inversion and L-curve results. Note that the scripts presented here are the matlab scripts used to organize and set up model runs for ISSM. The Ice-Sheet and Sea-level System Model (ISSM) is a highly versatile parallelized finite-element ice sheet model run in C but controlled using Matlab or Python front-ends. We do not include the underlying code for ISSM here; users who are interested in installing ISSM should go to the ISSM home page. We merely include the Matlab scripts we used to organize our ISSM front-end, set up model structures, and then analyze and visualize results.

    Main Matlab scripts:

    These are the main functional scripts used to set up and run the model.

    • ISSMInversion_v3.m. This is the primary script we used to set up the inversions and perform L-curve analysis. It requires a model mesh as input along with some gridded data. It also produces an L-curve figure (figure 3) after performing the L-curve analysis. This script can be run in two modes: "setupandsend", which prepares model structures and sends them to the cluster to be solved, and "loadandanalyze", which loads the solutions from the cluster, saves them to matfiles, and performs analysis and visualization. In addition to L-curve analysis and the L-curve figure, this script can also produce a variety of additional figures of model output that we did not show in the paper.
    • MakeISSMMesh_v4.m. This is the script we used to make our model meshes. It requires a domain boundary as input along with some gridded data.
    • ModelBoundaryPicker_v1.m. This script opens a crude graphical interface for picking the domain outline.
    • OrganizeInversionsForRelease_v2.m. This script assembles L-curve and inverse model results and organizes them into the data release you see here. Note that it doesn't compute the combined drag estimate itself (that is done by CombinedDragFigure_v1.m), but it does interpolate the combined drag estimate from the model mesh to the grid, and it produces the output netcdf file.

    Note that the gridded data files needed by some of the above scripts are not included in our release here. Users interested in using these scripts for their own projects will need to provide their own gridded inputs, for instance from BedMachine or Measures.

    Figure-making scripts:

    These scripts produced almost all of the figures we presented in the paper, and also computed the statistics we presented in the tables in the paper.

    • CombinedDragFigure_v1.m. This script computes the combined drag estimate on the highest-resolution mesh, and makes a figure displaying it (Figure 12 in the paper).
    • InversionComparisonFigure_HOSSA_v1.m. This makes figure 11 in the paper and also computes the statistics shown in table 3.
    • InversionComparisonFigure_m_v1.m This makes figures 9 and 10, and also computes the statistics shown in table 2.
    • InversionComparisonFigure_N_v1.m. This makes figure 8, and also computes the statistics shown in table 1.
    • InversionComparisonFigure_v1a.m. This makes figure 4 in the paper.
    • InversionResConvergenceFigure_v2.m. This makes figure 6.
    • InversionResMisfitFigure_v1.m. This makes figure 7.
    • InversionSettingFigure_v1.m. This makes figure 1.
    • InversionSpectrumFigure_v1.m. This performs spectral analysis and makes figure 5.
    • InversionThermalSettingFigure_v1.m. This makes figure A1.
    • MeshSizeFigure_v1.m. This makes figure A2.
    • NComparisonFigure_v1.m. This makes figure 2.

    Other utility Matlab functions:

    These miscellaneous functions do various tasks. Many of them are called as subroutines of the scripts above. Additionally, many of them are generally useful in contexts beyond the inverse modeling presented here.

    • FlattenModelStructure.m. ISSM has the unfortunate convention of saving every variable in 3D meshes on every single 3D mesh node, which is quite wasteful for variables that are actually 2D (i.e., most of the model variables). This function flattens all unnecessarily 3D information, but unlike the built-in ISSM function flatten.m, it preserves the 3D geometry of the mesh, along with variables that actually are 3D (such as englacial temperature). This function can also be run in reverse to expand variables back to full 3D before calling solve().
    • intuitive_lowpass.m. This function low-pass filters a 1D dataset using a Gaussian filter. It has several options for handling boundary conditions at the endpoints.
    • LaplacianInterpolation.m. This function fills in missing data values for gridded data products by solving Poisson's equation (Laplacian=0).
    • LaplacianInterpolation_mesh.m. This function does the same thing but on an unstructured mesh.
    • loadnetcdf.m. This function loads variables from netCDF files into the Matlab workspace using a syntax similar to load() for matfiles.
    • MultiWavelengthInterpolator.m. This function interpolates gridded data onto an unstructured mesh using a multi-grid approach. The grid is smoothed at multiple wavelengths and each mesh element interpolates from the wavelength that is appropriate for its size. This functionality is useful for preventing aliasing in coarse-resolution areas when interpolating onto a mesh with variable mesh size. It also produces results that are approximately (but not precisely) conservative.
    • ThreeByThree.m. This function iteratively performs a 3x3 smoothing on gridded data.
    • unpack.m. This function takes a structure and "unpacks" it by making every field into a variable in the workspace.
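    To illustrate the kind of operation ThreeByThree.m performs, here is a minimal sketch of an iterated 3x3 mean smoother. It is an illustrative reimplementation under stated assumptions (a normalized averaging kernel and conv2's zero-padded edges), not the released function.

```matlab
% Illustrative sketch only, not the released ThreeByThree.m: an iterated
% 3x3 mean smoother for gridded data. Edge handling here is conv2's zero
% padding, which the released function may treat differently.
function Zs = threeByThreeSketch(Z, nPasses)
    kernel = ones(3) / 9;                % normalized 3x3 averaging window
    Zs = Z;
    for k = 1:nPasses
        Zs = conv2(Zs, kernel, 'same');  % one smoothing pass
    end
end
```

    Each pass replaces every grid value with the mean of its 3x3 neighborhood; repeated passes approach a broader Gaussian-like smoothing.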

    Matfiles with L-curve data and model structures.

    The results of our L-curve analyses and our inversion results are stored in matfiles. We performed the 21 experiments shown in the paper; for each one we performed an independent L-curve analysis using 25 individual inversions, for a total of 525 inversions. However, for this data release we simplify matters by presenting only 3 inversions per experiment, corresponding to the best regularization value (LambdaBest) and the maximum and minimum acceptable regularization values (LambdaMax and LambdaMin). In addition, for each experiment we provide an LCurveFile that summarizes the L-curve analysis but does not contain any actual model results. In total, we present 84 matfiles in this data release.

    Naming convention:

    All matfiles presented here have the following naming convention:

    Mesh#_eqn_m#_Ntype_LambdaType.mat

    • Mesh#: This represents the mesh on which the inversions were performed, ranging from Mesh1 (highest resolution) to Mesh10 (lowest resolution).
    • eqn: This represents the type of equations solved in the inversion. Values are "SSA" or "HO".
    • m#: Exponent in the sliding law. Values are m1, m3, and m5.
    • Ntype: Effective pressure source in the sliding law. Values are "noN" (i.e., Weertman sliding), "Nop", "Nopc", and "Ncuas".
    • LambdaType: Values of this string are "LCurveFile" (for the file summarizing the whole L-curve experiment), "LambdaMin", "LambdaBest", and "LambdaMax".
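    As a worked example of this convention (a hypothetical snippet, not part of the release), the model file for the best-regularization SSA inversion on the finest mesh, with sliding exponent m = 3 and Weertman sliding, would be named as follows:

```matlab
% Hypothetical illustration of the matfile naming convention above;
% this snippet is not part of the data release itself.
meshNum = 1; eqn = 'SSA'; mExp = 3; Ntype = 'noN'; lambdaType = 'LambdaBest';
fname = sprintf('Mesh%d_%s_m%d_%s_%s.mat', meshNum, eqn, mExp, Ntype, lambdaType);
% fname is 'Mesh1_SSA_m3_noN_LambdaBest.mat'
```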

    Variables in the model files:

    Every file ending with "LambdaMin", "LambdaBest", or "LambdaMax" is a model file containing the same set of variables. Those variables are:

    • md. This is a model structure variable usable by any ISSM installation. Note that if you do not have ISSM installed on your machine, Matlab will not recognize the class "model" and you will not be able to load this variable. The results of the inversion are stored in md.results.StressbalanceSolution. Other important things for the inversion, such as cost functions, cost function coefficients, and

  13. High-resolution gridded multibeam bathymetry data (netCDF grid format) of...

    • marine-geo.org
    nc
    Updated Mar 16, 2020
    + more versions
    Cite
    Zach Eilon (2020). High-resolution gridded multibeam bathymetry data (netCDF grid format) of Pacific seafloor in the region of the old ORCA OBS array (31S,158W to 38S,152W) [Dataset]. http://doi.org/10.1594/IEDA/327342
    Explore at:
    Available download formats: nc
    Dataset updated
    Mar 16, 2020
    Dataset provided by
    Marine Geoscience Data System (MGDS)
    Authors
    Zach Eilon
    License

    Attribution-NonCommercial-ShareAlike 3.0 (CC BY-NC-SA 3.0)https://creativecommons.org/licenses/by-nc-sa/3.0/
    License information was derived automatically

    Area covered
    Description

    Abstract: The bathymetry was mapped with a Simrad EM122 multibeam system (1° x 2° system, 12 kHz swath mapping) in Nov/Dec 2019, and processed for quality control through QIMERA software while onboard the vessel. In addition to tracks between ocean bottom seismometer drop sites associated with the Old ORCA experiment (a US contribution to the PacificArray initiative), this data set contains multibeam swath passes within the field region, and on transits to/from Tahiti, and to/from a rescue at ~30S,153W. Sippican MK-21/PC-based XBT measurements were conducted at least daily to account for varying sound speed throughout the experiment, and processed through Simrad SIS software. The final bathymetric maps and grids were created using MATLAB and GMT. The data files are in GMT-compatible netCDF grid format suitable for import to GMT scripts. The km1922_all.grd file contains bathymetry for the entire cruise mapped at 50 m resolution, and the obs_array.grd file contains bathymetry for just the OBS array region, at 100 m resolution. The data files were generated as part of a project called Imaging small-scale convection and structure of the mantle in the south Pacific: a US contribution to international collaboration PacificArray, and Seismological Components of the MELT Experiment on the Southern East Pacific Rise. Funding was provided by NSF awards OCE16-58491, OCE16-58214 and OCE94-02375.
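    Grids in GMT-compatible netCDF format can usually also be read outside GMT; the sketch below reads the grid in MATLAB, assuming the common COARDS-style variable names x, y, and z, which should be confirmed with ncdisp before use:

```matlab
% Sketch: inspect and read a GMT-format netCDF grid in MATLAB.
% The variable names x/y/z are an assumption; check them with ncdisp first.
ncdisp('km1922_all.grd');            % list variables and attributes
x = ncread('km1922_all.grd', 'x');   % longitude vector
y = ncread('km1922_all.grd', 'y');   % latitude vector
z = ncread('km1922_all.grd', 'z');   % bathymetry matrix (depths in m)
imagesc(x, y, z'); axis xy; colorbar % quick-look map
```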

  14. Data from: Data and Code for "Ocean Wave Slope Effects on Global Air-Sea...

    • scholarship.miami.edu
    Updated Nov 28, 2024
    Cite
    Chong Jia; Milan Curcic (2024). Data and Code for "Ocean Wave Slope Effects on Global Air-Sea Turbulent Heat Fluxes" by Jia and Curcic (2025, GRL) [Dataset]. https://scholarship.miami.edu/esploro/outputs/dataset/Data-and-Code-for-Ocean-Wave/991032800397502976
    Explore at:
    Dataset updated
    Nov 28, 2024
    Dataset provided by
    Zenodohttp://zenodo.org/
    Authors
    Chong Jia; Milan Curcic
    Time period covered
    Sep 23, 2025
    Description

    Data and code for Jia and Curcic (2025, GRL), under review. Preprint: https://doi.org/10.22541/essoar.175510754.46241590/v1 Intermediate ERA5 data (mean square slope, 10-m wind speed, and sensible and latent heat fluxes) are provided here for convenience in NetCDF and MATLAB formats. MATLAB source code to process the intermediate data and plot the paper figures is in make_paper_figures.m.

  15. Code and glacial isostatic adjustment model outputs associated with "Glacial...

    • arcticdata.io
    • search.dataone.org
    Updated Jun 3, 2025
    Cite
    Roger Creel (2025). Code and glacial isostatic adjustment model outputs associated with "Glacial isostatic adjustment driven by asymmetric ice sheet melt during the last interglacial causes multiple local sea level peaks" [Dataset]. http://doi.org/10.18739/A2S756N0M
    Explore at:
    Dataset updated
    Jun 3, 2025
    Dataset provided by
    Arctic Data Center
    Authors
    Roger Creel
    Time period covered
    Jan 1, 2024
    Area covered
    Earth
    Variables measured
    ESL, LAT, LON, RSL, age, lat, lon, rsl, thk, Pmax, and 14 more
    Description

    Access

    NetCDF files can be accessed and downloaded from the directory via: http://arcticdata.io/data/10.18739/A2S756N0M.

    Overview

    This dataset comprises the code used to produce the results in Creel, R.C., Austermann, J.A., Glacial isostatic adjustment driven by asymmetric ice sheet melt during the Last Interglacial causes multiple local sea-level peaks. Geology. https://doi.org/10.1130/G52483.1.

    Research abstract:

    Global mean sea-level (GMSL) change during the Last Interglacial (LIG, 129−116 kiloannum (ka)) gives perspective on how ice sheets respond to warming. Observations of multiple peaks in LIG relative sea level (RSL) records, combined with an assumption that the Laurentide Ice Sheet (LIS) collapsed prior to the LIG, have been used to infer Greenland and Antarctic ice sheet melt histories as well as oscillations in LIG GMSL. However, evidence for an LIS outburst flood at ca. 125 ka and extensive early-LIG Antarctic melt suggests that Laurentide remnants may have persisted longer into the LIG than typically thought even as Antarctic melt accelerated. Here, we explore the effect of concurrent early-LIG Laurentide persistence and Antarctic collapse on glacial isostatic adjustment and sea level. In our models, we hold GMSL constant at present levels (i.e., GMSL = 0) from 128 ka to 117 ka by balancing excess Laurentide ice with early-LIG Antarctic melt. We find that due to glacial isostatic adjustment, this synchronous but asymmetric ice change causes multiple RSL peaks, separated by ∼4.2 ± 2.5 m of RSL fall near North America and ∼1.3 ± 0.7 m around the Indian Ocean. This spatial pattern resembles observations. These results show that multiple peaks in LIG RSL could have occurred with asymmetric ice changes between the Northern and Southern Hemisphere that sum to little, if any, change in GMSL. Our work highlights the need for LIG modeling studies to consider that dynamic cryospheric changes can occur even with near-constant GMSL.

    This research was conducted at Lamont Doherty Earth Observatory in 2022 and 2023. It is entirely composed of modeling and compilation of existing data; no new data were produced for the study. The methodologies employed include glacial isostatic adjustment modeling using MATLAB code written by Jacky Austermann and post-processing of the resulting models using python scripts written by Roger Creel.

  16. Data from: Long-term Earth-Moon evolution with high-level orbit and ocean...

    • commons.datacite.org
    • deepblue.lib.umich.edu
    Updated Oct 7, 2021
    Cite
    Brian K. Arbic; Michael Schindelegger (2021). Long-term Earth-Moon evolution with high-level orbit and ocean tide models [Dataset]. http://doi.org/10.7302/zck4-0058
    Explore at:
    Dataset updated
    Oct 7, 2021
    Dataset provided by
    DataCitehttps://www.datacite.org/
    University of Michigan
    Authors
    Brian K. Arbic; Michael Schindelegger
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0)https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Area covered
    Earth
    Dataset funded by
    National Aeronautics and Space Administration (NASA)
    Description

    These netcdf and Matlab files contain the information needed to reproduce Figures 1, 4, 8, 17, 18, 9-16 (minus the proxy values and Monte Carlo results), and the "24 hour" results of Figures 2 and 3.

  17. GEOS-Chem output for "Lightning NOx Emissions: Reconciling measured and...

    • datadryad.org
    zip
    Updated Jun 2, 2017
    Cite
    Joshua Laughner; Ronald Cohen; Benjamin Nault (2017). GEOS-Chem output for "Lightning NOx Emissions: Reconciling measured and modeled emissions estimates with updated NOx chemistry" [Dataset]. http://doi.org/10.6078/D10P4P
    Explore at:
    Available download formats: zip
    Dataset updated
    Jun 2, 2017
    Dataset provided by
    Dryad
    Authors
    Joshua Laughner; Ronald Cohen; Benjamin Nault
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Jun 2, 2017
    Description

    The data are stored as compressed netCDF4 files (compression level 1).

  18. Profile data from WireWalker deployments at Mission Beach, California in...

    • bco-dmo.org
    • search.dataone.org
    • +1more
    bin, csv, zip
    Updated Jul 24, 2018
    + more versions
    Cite
    Peter Franks; Andrew J Lucas (2018). Profile data from WireWalker deployments at Mission Beach, California in 2016 at a 50m depth [Dataset]. http://doi.org/10.1575/1912/bco-dmo.742124.1
    Explore at:
    Available download formats: bin (35.69 MB), csv (67.69 MB), zip (45.09 MB)
    Dataset updated
    Jul 24, 2018
    Dataset provided by
    Biological and Chemical Data Management Office
    Authors
    Peter Franks; Andrew J Lucas
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Jun 13, 2016 - Jun 28, 2016
    Area covered
    Variables measured
    B, C, P, S, T, DO, n2, chl, dPt, rho, and 7 more
    Measurement technique
    Fluorometer, Data Logger, Oxygen Sensor
    Description

    Profile data from WireWalker deployments at Mission Beach, California in 2016 at a 50m depth.

    The default data format served through the BCO-DMO data system is tabular. These data are available to download as matrices in NetCDF (.nc) and Matlab (.mat) files in the "Data Files" section of this page.

    Related Datasets (Jun 2016, Mission Beach, CA)

    * Thermistor chain https://www.bco-dmo.org/dataset/742137
    * ADCP https://www.bco-dmo.org/dataset/742132
