License: CC0 1.0 Universal Public Domain Dedication, https://creativecommons.org/publicdomain/zero/1.0/
Function name: cdf2mat. Use this function to open MS-based chromatographic data from netCDF (*.CDF) files. Resampling is included for non-integer acquisition rates; the output uses nominal mass. The script is optimized for data from comprehensive two-dimensional gas chromatography coupled to mass spectrometry (GCxGC-MS), and has been updated to remove negative noise signal.
INPUT
file: netCDF file to open, e.g. 'Sample01.CDF'
rate_MS: desired integer acquisition rate
OUTPUT
TIC: total ion chromatogram
FullMS: full MS chromatogram (second-order data tensor)
axis_min: retention time axis in minutes
axis_mz: m/z axis in Daltons
I/O: [TIC,FullMS,axis_min,axis_mz] = cdf2mat(file,rate_MS)
Compiled with MATLAB R2021b (v9.11.0.1809720). Requires the Signal Processing Toolbox (v9.0). Based on netCDFload.m (Murphy, Wenig, Parcsi, Skov and Stuetz) and on iCDF_load (Skov and Bro, 2008).
References:
K.R. Murphy, P. Wenig, G. Parcsi, T. Skov, R.M. Stuetz (2012) Characterizing odorous emissions using new software for identifying peaks in chemometric models of GC-MS datasets. Chemometrics and Intelligent Laboratory Systems. doi: 10.1016/j.chemolab.2012.07.006
T. Skov and R. Bro (2008) Solving fundamental problems in chromatographic analysis. Analytical and Bioanalytical Chemistry, 390 (1): 281-285. doi: 10.1007/s00216-007-1618-z
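A minimal usage sketch (the file name 'Sample01.CDF' matches the example above; the 50 Hz acquisition rate is an arbitrary placeholder):

% Open a GCxGC-MS netCDF file and plot the total ion chromatogram.
% 'Sample01.CDF' and the 50 Hz rate are example values only.
[TIC, FullMS, axis_min, axis_mz] = cdf2mat('Sample01.CDF', 50);
plot(axis_min, TIC)
xlabel('Retention time (min)')
ylabel('Total ion current')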
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Here we present data on vegetation and climate conditions in Syria, provided in part as netCDF (.nc) files. The nc files describe the spatial status of Syria, including land cover in 2010, trends in temperature and precipitation, EVI mean and trend, EVI residual analysis, and water use efficiency. Detailed information can be found in the paper by Chen et al.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
The EGO data processing chain decodes, processes, formats, and performs quality control on glider data and metadata. For a glider deployment, the decoder performs the following actions: decode and format the glider deployment data and metadata into an EGO netCDF time series file; apply real-time quality control (RTQC) tests to the EGO netCDF time series file; for Slocum gliders, estimate subsurface currents and store them in the EGO file; and generate netCDF profile files from the EGO file data and apply specific RTQC tests to them. The decoder manages Slocum, Seaglider, and SeaExplorer glider observations. It is MATLAB code (see groom_gliders_coriolis_matlab_decoder_*.pdf in the decglider_doc\decoder_user_manual folder); a compiled version that does not require a MATLAB license is also available (see readme.txt in the decglider_soft\soft_compiled folder).
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Outgoing longwave radiation data analyzed in Johnson et al. (2019), provided as MATLAB .mat files and processed from NetCDF data (MATLAB scripts for processing the NetCDF data are provided in a separate file). The NetCDF source data are freely available on the web: http://olr.umd.edu/
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This dataset is associated with a manuscript on Connecticut River plume mixing with first author Michael M. Whitney. The dataset includes source code, compilation files, and input for the Regional Ocean Modeling System (ROMS) runs used in this study. ROMS output files in NetCDF format are generated by executing the compiled ROMS code with the input files. The dataset also includes MATLAB routines and datafiles for the analysis of model results and generation of figures in the manuscript. The following zip files are included:
* ROMS_v783_Yan_code.zip [ROMS source code branch used in this study]
* ctplume_ROMS_compilation.zip [files to compile ROMS source code and run-specific Fortran-90 built code]
* ctplume_ROMS_input.zip [ROMS ASCII and NetCDF input files for runs]
* ctplume_MATLAB_analysis.zip [custom analysis routines in MATLAB used in this study]
* ctplume_MATLAB_figures.zip [custom MATLAB routine for manuscript figure generation and MATLAB data files with all data fields included in figures]
* ctplume_figures_tif.zip [TIF image files of each figure in manuscript]
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This dataset contains information on the Surface Soil Moisture (SM) content derived from satellite observations in the microwave domain.
A description of this dataset, including the methodology and validation results, is available at:
Preimesberger, W., Stradiotti, P., and Dorigo, W.: ESA CCI Soil Moisture GAPFILLED: an independent global gap-free satellite climate data record with uncertainty estimates, Earth Syst. Sci. Data, 17, 4305–4329, https://doi.org/10.5194/essd-17-4305-2025, 2025.
ESA CCI Soil Moisture is a multi-satellite climate data record that consists of harmonized, daily observations coming from 19 satellites (as of v09.1) operating in the microwave domain. The wealth of satellite information, particularly over the last decade, facilitates the creation of a data record with the highest possible data consistency and coverage.
However, data gaps are still found in the record. This is particularly notable in earlier periods when a limited number of satellites were in operation, but can also arise from various retrieval issues, such as frozen soils, dense vegetation, and radio frequency interference (RFI). These data gaps present a challenge for many users, as they have the potential to obscure relevant events within a study area or are incompatible with (machine learning) software that often relies on gap-free inputs.
Since the requirement for a gap-free ESA CCI SM product was identified, various studies have demonstrated the suitability of different statistical methods to achieve this goal. A fundamental feature of such a gap-filling method is that it relies only on the original observational record, without the need for ancillary variables or model-based information. Owing to this intrinsic challenge, no global, long-term, univariate gap-filled product has been available until now. In this version of the record, data gaps due to missing satellite overpasses and invalid measurements are filled using the Discrete Cosine Transform (DCT) Penalized Least Squares (PLS) algorithm (Garcia, 2010). A linear interpolation is applied over periods of (potentially) frozen soils with little to no variability in (frozen) soil moisture content. Uncertainty estimates are based on models calibrated in experiments to fill satellite-like gaps introduced to GLDAS Noah reanalysis soil moisture (Rodell et al., 2004), and consider the gap size and local vegetation conditions as parameters that affect the gap-filling performance.
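For orientation, the core of the DCT-PLS smoother is compact. The following one-dimensional MATLAB sketch, following Garcia (2010), is only an illustration of the technique and not the production gap-filling code; the smoothing parameter s and the test signal are arbitrary:

% 1-D sketch of DCT-based Penalized Least Squares (DCT-PLS) smoothing,
% after Garcia (2010). Illustration only; s and the data are arbitrary.
% Requires the Signal Processing Toolbox (dct/idct).
n = 200;
t = linspace(0, 1, n)';
y = sin(4*pi*t) + 0.3*randn(n, 1);     % noisy test signal
s = 10;                                % smoothing parameter
lambda = 2 - 2*cos(pi*(0:n-1)'/n);     % eigenvalues of the difference operator
gamma = 1 ./ (1 + s*lambda.^2);        % DCT-domain penalized least-squares filter
z = idct(gamma .* dct(y));             % smoothed signal
% Gaps are handled by iterating this filter, re-imputing the missing
% samples from z at each pass (weighted formulation in Garcia, 2010).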
You can use command line tools such as wget or curl to download (and extract) data for multiple years. The following script will download and extract the complete data set to the local directory ~/Downloads on Linux or macOS systems.
#!/bin/bash
# Set download directory
DOWNLOAD_DIR=~/Downloads
base_url="https://researchdata.tuwien.at/records/3fcxr-cde10/files"
# Loop through years 1991 to 2023 and download & extract data
for year in {1991..2023}; do
echo "Downloading $year.zip..."
wget -q -P "$DOWNLOAD_DIR" "$base_url/$year.zip"
unzip -o "$DOWNLOAD_DIR/$year.zip" -d "$DOWNLOAD_DIR"
rm "$DOWNLOAD_DIR/$year.zip"
done
The dataset provides global daily estimates for the 1991-2023 period at 0.25° (~25 km) horizontal grid resolution. Daily images are grouped by year (YYYY), with each subdirectory containing one netCDF image file for a specific day (DD) and month (MM), on a 2-dimensional (longitude, latitude) grid (CRS: WGS84). The file names follow this convention:
ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-YYYYMMDD000000-fv09.1r1.nc
Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude, and time stamp), as well as a set of data variables; additional information for each variable is given in the netCDF attributes.
Changes in v9.1r1 (previous version was v09.1):
These data can be read by any software that supports Climate and Forecast (CF) conform metadata standards for netCDF files, such as:
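For example, in MATLAB (the coordinate names 'lon' and 'lat' and the data variable name 'sm' are assumptions; run ncdisp to see the actual variable names, and note that the date in the file name is only an example):

% Inspect and read one daily gap-filled soil moisture file.
% Variable names below are assumptions; confirm them with ncdisp.
fname = 'ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-20200101000000-fv09.1r1.nc';
ncdisp(fname)                      % list variables and attributes
lon = ncread(fname, 'lon');
lat = ncread(fname, 'lat');
sm  = ncread(fname, 'sm');         % assumed soil moisture variable name
imagesc(lon, lat, sm'); axis xy; colorbar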
The following records are all part of the ESA CCI Soil Moisture science data records community:
1. ESA CCI SM MODELFREE Surface Soil Moisture Record: https://doi.org/10.48436/svr1r-27j77
The neutral density code comes as a package of MATLAB and/or FORTRAN routines which enable the user to fit neutral density surfaces to arbitrary hydrographic data. The FORTRAN implementation consists of a subroutine which labels a cast of hydrographic data with neutral density, and another subroutine which then finds the positions of specified neutral density surfaces within the water column. The MATLAB implementation consists of two MATLAB functions performing these same operations, only on sections of hydrographic data. Versions are available for Unix workstations running the netCDF data archiving library, and for PCs not running netCDF; the latter code is also suitable for compilation on Unix workstations or other machines without the netCDF library. Unlike the Unix version, the MATLAB version for the PC does not require compilation of the underlying FORTRAN code. All code comes with documentation in the form of Readme files, as well as Makefiles and examples to provide check values for the user. This "in-house" CSIRO software is available under conditions which are attached with the software.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
NetCDF data and MATLAB processing scripts for the article:
Eabry, Holmes and Sen Gupta (2022): The impact of Indonesian Throughflow constrictions on eastern Pacific upwelling and water-mass transformation. Journal of Geophysical Research: Oceans. https://doi.org/10.1029/2022JC018509
Included are netcdf files with output from the ACCESS-OM2 1-degree ocean model averaged over years 500-600 of the spin-up simulation. CONTROL indicates the control simulation (realistic ITF topography), OPENITF indicates the Open ITF experiment and DIFF indicates difference files between the two. Please refer to the meta-data within the netcdf files for more information. Scripts to help with plotting standard variables are part of the COSIMA cookbook repository at https://github.com/COSIMA/cosima-recipes.
An example script Control_WMT_budget.m is provided to plot the control WMT budget and can be easily modified to plot the Open ITF or anomalous WMT budget. This script uses the Pacific masks found in mask.mat. The small tendency term is provided separately as dV_dt_nrho.mat.
Version = PO-GLOBAL-SVN = 656
Matlab = 9.4.0.813654 (R2018a)
Release = 2
P-correction = 0 dbar
T-correction = 0 deg C
S-correction = 0 PSU
Comment = corrections were added to raw data
Comment = p is pressure in dbar
Comment = t is in situ temperature in deg C (ITS-90)
[…]
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
(I) DESCRIPTION:
· A global daily seamless 9-km Vegetation Optical Depth (VOD) product is generated through a gap-filling and spatiotemporal fusion model. The daily product spans Jan 01, 2010 to Jul 31, 2021 (about 20 GB of disk space after uncompressing all zip files).
· To further validate the effectiveness of these products, three validation approaches are employed: 1) time series validation; 2) simulated missing-region validation; and 3) data comparison validation.
· It is important to note that the original data contain missing dates, and these corresponding gaps are also present in our dataset.
(II) DATA FORMATTING AND FILE NAMES
For the convenience of our readers, we have two formats of data available for download.
1) MAT file (Version v1)
Data from 2010 to 2021 are stored separately in folders for the corresponding years, with each folder containing daily .mat files. The naming convention for the data is “YYYYXXZZ,” where YYYY is the 4-digit year, XX is the 2-digit month, and ZZ is the 2-digit date. The geographic scope is global and the grid size is 4000×2000.
MATFILES (.mat): The folders with matfiles contain individual files for:
Vegetation Optical Depth: VOD_seamless_9km_YYYYXXZZ.mat
Latitude/Longitude: VOD_9km_Coordinates.mat
2) NetCDF file (Version v2)
The year-by-year daily data from 2010 to 2021 are stored in ‘.nc’ files for the corresponding years; the daily data within each year are combined into one NetCDF file. The variables are named VOD_xxxxyydd, where xxxx represents the year, yy represents the month, and dd represents the day. The longitude variable is named “lon” with a dimension of 4000×1, and the latitude variable is named “lat” with a dimension of 2000×1.
It should be noted that these NetCDF files are saved using the netCDF4 library in Python, with the dimension order being (lat, lon). When reading these NetCDF files in MATLAB, the default data dimension order is (lon, lat). Therefore, it is necessary to transpose the variables to match the correct dimension order.
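A minimal MATLAB sketch of that read-and-transpose step (the per-year file name is an assumption, and the example variable VOD_20100101 follows the stated convention for an arbitrary example date):

% Read one daily VOD grid and restore the (lat, lon) orientation.
% The file name is an assumed per-year name; 2010-01-01 is an example date.
fname = 'VOD_2010.nc';
lon = ncread(fname, 'lon');            % 4000x1
lat = ncread(fname, 'lat');            % 2000x1
vod = ncread(fname, 'VOD_20100101');   % MATLAB returns (lon, lat) = 4000x2000
vod = vod';                            % transpose to (lat, lon) = 2000x4000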
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
We estimated the seafloor topography beneath the Amery Ice Shelf, East Antarctica, from airborne gravity anomalies through a nonlinear inversion method called simulated annealing. The estimation results provide a view of the seafloor beneath the Amery Ice Shelf, where direct bathymetric observations are rare. The model, 'gravity_estimated_seafloor_topography_beneath_the_Amery_Ice_Shelf.nc', is in NetCDF format, which can be read with the MATLAB commands "ncdisp" and "ncread". The contents of the model are described in "contents.txt". The MATLAB program "nc2mat.m" reads the NetCDF ".nc" model and saves its variables to a MATLAB ".mat" file.
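For example (the variable name passed to ncread is hypothetical; take the actual names from the ncdisp listing or from contents.txt):

% Inspect the model file, then read one variable from it.
fname = 'gravity_estimated_seafloor_topography_beneath_the_Amery_Ice_Shelf.nc';
ncdisp(fname)                          % lists all variables and attributes
% 'topography' is a hypothetical name; substitute one shown by ncdisp.
topo = ncread(fname, 'topography');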
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This page contains the results of the inversions for basal drag and drag coefficient in the Filchner-Ronne catchment presented in Wolovick et al., (2023), along with the code used to perform the inversions and L-curves, analyze the results, and produce the figures presented in that paper.
This all looks very complicated. There are so many files here. The description is so long. I just want to know the basal drag!
If you don't want to get into the weeds of inverse modeling and L-curve analysis, or if you are uninterested in wading through our collection of model structures and scripts, then you should use the file BestCombinedDragEstimate.nc. That file contains our best weighted mean estimate of the ice sheet basal drag in our domain, along with the weighted standard deviation of the scatter of the different models about the mean. As discussed in the paper, this combined estimate is constructed from the weighted mean of 24 individual inversions, representing 8 separate L-curve experiments on our highest-resolution mesh, with three regularization values per L-curve (best estimate regularization, along with minimum and maximum acceptable regularization levels). Each inversion is weighted according to the inverse of its total variance ratio, which is a quality metric incorporating both observational misfit and inverted structure. For ease of use, these results have been interpolated from the unstructured model mesh onto a 250 m regular grid. If you only want to know the basal drag in the Filchner-Ronne region, that is the only file you should use.
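If that is your use case, a minimal MATLAB sketch looks like this (the variable names below are hypothetical; run ncdisp on the file to get the actual ones):

% Read the combined basal drag estimate from the 250 m regular grid.
fname = 'BestCombinedDragEstimate.nc';
ncdisp(fname)                           % inspect the actual variable names first
% The names below are hypothetical placeholders.
x        = ncread(fname, 'x');
y        = ncread(fname, 'y');
drag     = ncread(fname, 'drag');       % weighted-mean basal drag
drag_std = ncread(fname, 'drag_std');   % weighted standard deviation about the mean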
For users who want to go further, we will now explain the remaining files in this release. First we give a brief summary of all of the scripts included here and their functions, and then we give an explanation of the matfiles that contain the actual inversion and L-curve results. Note that the scripts presented here are the MATLAB scripts used to organize and set up model runs for ISSM. The Ice-sheet and Sea-level System Model (ISSM) is a highly versatile parallelized finite-element ice sheet model written in C++ but controlled using MATLAB or Python front-ends. We do not include the underlying code for ISSM here; users interested in installing ISSM should go to the ISSM home page. We merely include the MATLAB scripts we used to organize our ISSM front-end, set up model structures, and then analyze and visualize results.
Main Matlab scripts:
These are the main functional scripts used to set up and run the model.
Note that the gridded data files needed by some of the above scripts are not included in our release here. Users interested in using these scripts for their own projects will need to provide their own gridded inputs, for instance from BedMachine or Measures.
Figure-making scripts:
These scripts produced almost all of the figures we presented in the paper, and also computed the statistics we presented in the tables in the paper.
Other utility Matlab functions:
These miscellaneous functions perform various tasks. Many of them are called as subroutines of the scripts above. Additionally, many of them are generally useful in contexts beyond the inverse modeling presented here.
Matfiles with L-curve data and model structures:
The results of our L-curve analyses and our actual inversion results are stored in matfiles. We performed the 21 experiments shown in the paper; for each one we performed an independent L-curve analysis using 25 individual inversions, for a total of 525 inversions. However, for this data release we simplify matters by presenting only 3 inversions per experiment, corresponding to the best regularization value (LambdaBest) and the maximum and minimum acceptable regularization values (LambdaMax and LambdaMin). In addition, for each experiment we also provide an LCurveFile that summarizes the L-curve analysis but does not contain any actual model results. In total, we present 84 matfiles in this data release.
Naming convention:
All matfiles presented here have the following naming convention:
Mesh#_eqn_m#_Ntype_LambdaType.mat
Variables in the model files:
Every file ending with "LambdaMin", "LambdaBest", or "LambdaMax" is a model file containing the same set of variables. Those variables are:
License: Attribution-NonCommercial-ShareAlike 3.0 (CC BY-NC-SA 3.0), https://creativecommons.org/licenses/by-nc-sa/3.0/
Abstract: The bathymetry was mapped with a Simrad EM122 multibeam system (1° x 2° system, 12 kHz swath mapping) in Nov/Dec 2019, and processed for quality control through QIMERA software while onboard the vessel. In addition to tracks between ocean bottom seismometer drop sites associated with the Old ORCA experiment (a US contribution to the PacificArray initiative), this data set contains multibeam swath passes within the field region, and on transits to/from Tahiti, and to/from a rescue at ~30S,153W. Sippican MK-21/PC-based XBT measurements were conducted at least daily to account for varying sound speed throughout the experiment, and processed through Simrad SIS software. The final bathymetric maps and grids were created using MATLAB and GMT. The data files are in GMT-compatible netCDF grid format suitable for import to GMT scripts. The km1922_all.grd file contains bathymetry for the entire cruise mapped at 50 m resolution, and the obs_array.grd file contains bathymetry for just the OBS array region, at 100 m resolution. The data files were generated as part of a project called Imaging small-scale convection and structure of the mantle in the south Pacific: a US contribution to international collaboration PacificArray, and Seismological Components of the MELT Experiment on the Southern East Pacific Rise. Funding was provided by NSF awards OCE16-58491, OCE16-58214 and OCE94-02375.
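The GMT-compatible grids can also be read directly in MATLAB; in this sketch the variable names 'lon', 'lat', and 'z' follow a common GMT/COARDS convention but are assumptions to confirm with ncdisp:

% Read the cruise-wide 50 m bathymetry grid.
fname = 'km1922_all.grd';
ncdisp(fname)                    % confirm variable names (often lon/lat/z or x/y/z)
lon = ncread(fname, 'lon');
lat = ncread(fname, 'lat');
z   = ncread(fname, 'z');        % bathymetry
imagesc(lon, lat, z'); axis xy; colorbar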
Data and code for Jia and Curcic (2025, GRL), under review. Preprint: https://doi.org/10.22541/essoar.175510754.46241590/v1 Intermediate ERA5 data (mean square slope, 10-m wind speed, and sensible and latent heat fluxes) are provided here for convenience in NetCDF and MATLAB formats. MATLAB source code to process the intermediate data and plot the paper figures is in make_paper_figures.m.
NetCDF files can be accessed and downloaded from the directory via: http://arcticdata.io/data/10.18739/A2S756N0M.
This dataset comprises the code used to produce the results in Creel, R.C., Austermann, J.A., Glacial isostatic adjustment driven by asymmetric ice sheet melt during the Last Interglacial causes multiple local sea-level peaks. Geology. https://doi.org/10.1130/G52483.1.
Research abstract:
Global mean sea-level (GMSL) change during the Last Interglacial (LIG, 129−116 kiloannum (ka)) gives perspective on how ice sheets respond to warming. Observations of multiple peaks in LIG relative sea level (RSL) records, combined with an assumption that the Laurentide Ice Sheet (LIS) collapsed prior to the LIG, have been used to infer Greenland and Antarctic ice sheet melt histories as well as oscillations in LIG GMSL. However, evidence for an LIS outburst flood at ca. 125 ka and extensive early-LIG Antarctic melt suggests that Laurentide remnants may have persisted longer into the LIG than typically thought even as Antarctic melt accelerated. Here, we explore the effect of concurrent early-LIG Laurentide persistence and Antarctic collapse on glacial isostatic adjustment and sea level. In our models, we hold GMSL constant at present levels (i.e., GMSL = 0) from 128 ka to 117 ka by balancing excess Laurentide ice with early-LIG Antarctic melt. We find that due to glacial isostatic adjustment, this synchronous but asymmetric ice change causes multiple RSL peaks, separated by ∼4.2 ± 2.5 m of RSL fall near North America and ∼1.3 ± 0.7 m around the Indian Ocean. This spatial pattern resembles observations. These results show that multiple peaks in LIG RSL could have occurred with asymmetric ice changes between the Northern and Southern Hemisphere that sum to little, if any, change in GMSL. Our work highlights the need for LIG modeling studies to consider that dynamic cryospheric changes can occur even with near-constant GMSL.
This research was conducted at Lamont Doherty Earth Observatory in 2022 and 2023. It is entirely composed of modeling and compilation of existing data; no new data were produced for the study. The methodologies employed include glacial isostatic adjustment modeling using MATLAB code written by Jacky Austermann and post-processing of the resulting models using Python scripts written by Roger Creel.
License: Attribution-NonCommercial 4.0 (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/
These netcdf and Matlab files contain the information needed to reproduce Figures 1, 4, 8, 17, 18, 9-16 (minus the proxy values and Monte Carlo results), and the "24 hour" results of Figures 2 and 3.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
The data are stored as compressed netCDF4 files (compression level 1).
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Profile data from WireWalker deployments at Mission Beach, California in 2016, at 50 m depth.
The default data format served through the BCO-DMO data system is tabular. These data are available to download as matrices in NetCDF (.nc) and Matlab (.mat) files in the "Data Files" section of this page.
Related Datasets (Jun 2016, Mission Beach, CA)
* Thermistor chain https://www.bco-dmo.org/dataset/742137
* ADCP https://www.bco-dmo.org/dataset/742132