Hydrological and meteorological information can help characterize environmental conditions and the risk factors facing ecosystems and their inhabitants. Because observational sampling is limited, gridded data sets use the observations that have been collected, together with known process relations, to model conditions in areas where direct data collection is infeasible. Even where such data are available, users face barriers: how to access, acquire, and analyze data for small watershed areas when the datasets were produced for large, continental-scale processes. In this tutorial, we introduce the Observatory for Gridded Hydrometeorology (OGH) to resolve these hurdles in a use case that processes NetCDF gridded data sets, interprets the findings, and applies a secondary modeling framework (Landlab).
LEARNING OBJECTIVES
- Familiarize with data management, metadata management, and analyses with gridded data
- Inspect and problem-solve with Python libraries
- Explore data architecture and processes
- Learn about the OGH Python library
- Discuss conceptual data engineering and science operations
Use-case operations:
1. Prepare the computing environment
2. Get a list of grid cells
3. Retrieve NetCDF data and clip it to a spatial extent
4. Extract NetCDF metadata and convert NetCDFs to 1D ASCII time-series files
5. Visualize the average monthly total precipitation
6. Apply summary values as modeling inputs
7. Visualize modeling outputs
8. Save results in a new HydroShare resource
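Step 3 above, clipping a gridded dataset to a small watershed extent, can be sketched in plain Python. This is an illustrative bounding-box clip on toy arrays, not the OGH API (all names here are hypothetical):

```python
def clip_to_extent(lats, lons, values, min_lat, max_lat, min_lon, max_lon):
    """Return the subset of a 2D grid whose cell centers fall inside
    a latitude/longitude bounding box."""
    rows = [i for i, lat in enumerate(lats) if min_lat <= lat <= max_lat]
    cols = [j for j, lon in enumerate(lons) if min_lon <= lon <= max_lon]
    clipped = [[values[i][j] for j in cols] for i in rows]
    return [lats[i] for i in rows], [lons[j] for j in cols], clipped

# Toy 4 x 3 grid standing in for a continental-scale product
lats = [47.0, 47.5, 48.0, 48.5]
lons = [-122.0, -121.5, -121.0]
vals = [[r * 10 + c for c in range(3)] for r in range(4)]

# Clip to a small watershed-sized box
sub_lats, sub_lons, sub = clip_to_extent(lats, lons, vals,
                                         47.4, 48.1, -121.6, -121.0)
```

In practice OGH and libraries such as xarray perform this selection directly on NetCDF coordinate variables; the logic is the same index-by-extent operation shown here.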
For inquiries, issues, or to contribute to development, please refer to https://github.com/freshwater-initiative/Observatory
Test MI resource for netcdf metadata recognition. Visit https://dataone.org/datasets/sha256%3Aa4ee6ebf8a86fb119abda633d63110b85dfe37df20220482b6bdf5b421f59591 for complete metadata about this dataset.
The Global Forecast System (GFS) is a weather forecast model produced by the National Centers for Environmental Prediction (NCEP). Dozens of atmospheric and land-soil variables are available through this dataset, from temperatures, winds, and precipitation to soil moisture and atmospheric ozone concentration. The GFS data files stored here can be used immediately by OAR/ARL's NOAA-EPA Atmosphere-Chemistry Coupler Cloud (NACC-Cloud) tool, and are in Network Common Data Form (netCDF), a very common format across the scientific community. These particular GFS files contain a comprehensive set of global atmosphere/land variables at a relatively high spatiotemporal resolution (approximately 13x13 km horizontal, 127 vertical levels, and hourly). They are not only necessary for the NACC-Cloud tool to adequately drive community air quality applications (e.g., U.S. EPA's Community Multiscale Air Quality model; https://www.epa.gov/cmaq), but can also be very useful for a myriad of other applications in the Earth system modeling communities (e.g., atmosphere, hydrosphere, pedosphere). While many other data file and record formats are available for Earth system and climate research (e.g., GRIB, HDF, GeoTIFF), the netCDF files here are advantageous to the larger community because of the comprehensive, high-spatiotemporal-resolution information they contain, and because they are more scalable, appendable, shareable, self-describing, and community-friendly (i.e., many tools are available to the community of users). Out of the four operational GFS forecast cycles per day (at 00Z, 06Z, 12Z and 18Z), this particular netCDF dataset is updated daily (/inputs/yyyymmdd/) for the 12Z cycle and includes 24-hr output for both 2D (gfs.t12z.sfcf$0hh.nc) and 3D variables (gfs.t12z.atmf$0hh.nc).
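Assuming the $0hh token in the file names stands for a zero-padded three-digit forecast hour (f000 through f024 for the 24-hr output), the daily file list can be generated with a short sketch:

```python
def gfs_filenames(cycle="t12z", kind="sfcf", hours=range(0, 25)):
    """Build GFS netCDF file names for one forecast cycle.

    kind is 'sfcf' for 2D variables or 'atmf' for 3D variables; the
    three-digit zero-padded forecast hour is an assumption about $0hh.
    """
    return [f"gfs.{cycle}.{kind}{h:03d}.nc" for h in hours]

surface_files = gfs_filenames(kind="sfcf")     # 2D variables
upper_air_files = gfs_filenames(kind="atmf")   # 3D variables
```

Prepend the daily /inputs/yyyymmdd/ directory to each name to locate a specific day's files.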
Also available are netCDF formatted Global Land Surface Datasets (GLSDs) developed by Hung et al. (2024). The GLSDs are based on numerous satellite products, and have been gridded to match the GFS spatial resolution (~13x13 km). These GLSDs contain vegetation canopy data (e.g., land surface type, vegetation clumping index, leaf area index, vegetative canopy height, and green vegetation fraction) that are supplemental to and can be combined with the GFS meteorological netCDF data for various applications, including NOAA-ARL's canopy-app. The canopy data variables are climatological, based on satellite data from the year 2020, combined with GFS meteorology for the year 2022, and are created at a daily temporal resolution (/inputs/geo-files/gfs.canopy.t12z.2022mmdd.sfcf000.global.nc).
This is an auto-generated index table corresponding to a folder of files in this dataset with the same name. This table can be used to extract a subset of files based on their metadata, which can then be used for further analysis. You can view the contents of specific files by navigating to the "cells" tab and clicking on an individual file_id.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains information on the Surface Soil Moisture (SM) content derived from satellite observations in the microwave domain.
A description of this dataset, including the methodology and validation results, is available at:
Preimesberger, W., Stradiotti, P., and Dorigo, W.: ESA CCI Soil Moisture GAPFILLED: An independent global gap-free satellite climate data record with uncertainty estimates, Earth Syst. Sci. Data Discuss. [preprint], https://doi.org/10.5194/essd-2024-610, in review, 2025.
ESA CCI Soil Moisture is a multi-satellite climate data record that consists of harmonized, daily observations coming from 19 satellites (as of v09.1) operating in the microwave domain. The wealth of satellite information, particularly over the last decade, facilitates the creation of a data record with the highest possible data consistency and coverage.
However, data gaps are still found in the record. This is particularly notable in earlier periods when a limited number of satellites were in operation, but can also arise from various retrieval issues, such as frozen soils, dense vegetation, and radio frequency interference (RFI). These data gaps present a challenge for many users, as they have the potential to obscure relevant events within a study area or are incompatible with (machine learning) software that often relies on gap-free inputs.
Since the requirement for a gap-free ESA CCI SM product was identified, various studies have demonstrated the suitability of different statistical methods to achieve this goal. A fundamental feature of such gap-filling methods is that they rely only on the original observational record, without the need for ancillary variables or model-based information. Owing to this intrinsic challenge, no global, long-term, univariate gap-filled product has been available until now. In this version of the record, data gaps due to missing satellite overpasses and invalid measurements are filled using the Discrete Cosine Transform (DCT) Penalized Least Squares (PLS) algorithm (Garcia, 2010). A linear interpolation is applied over periods of (potentially) frozen soils with little to no variability in (frozen) soil moisture content. Uncertainty estimates are based on models calibrated in experiments that filled satellite-like gaps introduced into GLDAS Noah reanalysis soil moisture (Rodell et al., 2004), and consider the gap size and local vegetation conditions as parameters that affect gap-filling performance.
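As a much simpler stand-in for the DCT-PLS algorithm, the linear interpolation applied over frozen periods can be sketched in plain Python (the function name is illustrative, and gaps touching the ends of the series are left unfilled):

```python
def fill_gaps_linear(series):
    """Fill None gaps in a daily series by linear interpolation between
    the nearest valid neighbours; edge gaps are left untouched."""
    filled = list(series)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            start = i - 1                      # last valid index before gap
            j = i
            while j < n and filled[j] is None:
                j += 1                         # first valid index after gap
            if start >= 0 and j < n:
                left, right = filled[start], filled[j]
                span = j - start
                for k in range(i, j):
                    filled[k] = left + (right - left) * (k - start) / span
            i = j
        else:
            i += 1
    return filled
```

The actual product's DCT-PLS approach additionally smooths in space and adapts to local gap structure; this sketch only conveys the univariate, observation-only principle.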
You can use command line tools such as wget or curl to download (and extract) data for multiple years. The following command will download and extract the complete data set to the local directory ~/Downloads on Linux or macOS systems.
#!/bin/bash
# Set download directory
DOWNLOAD_DIR=~/Downloads
base_url="https://researchdata.tuwien.at/records/3fcxr-cde10/files"
mkdir -p "$DOWNLOAD_DIR"
# Loop through years 1991 to 2023 and download & extract data
for year in {1991..2023}; do
    echo "Downloading $year.zip..."
    wget -q -P "$DOWNLOAD_DIR" "$base_url/$year.zip"
    unzip -o "$DOWNLOAD_DIR/$year.zip" -d "$DOWNLOAD_DIR"
    rm "$DOWNLOAD_DIR/$year.zip"
done
The dataset provides global daily estimates for the 1991-2023 period at 0.25° (~25 km) horizontal grid resolution. Daily images are grouped by year (YYYY), with each subdirectory containing one netCDF image file per day (DD) of each month (MM), on a 2-dimensional (longitude, latitude) grid (CRS: WGS84). The file names follow this convention:
ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-YYYYMMDD000000-fv09.1r1.nc
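Under this convention, the time stamp can be recovered from a file name with the standard library alone (a minimal sketch):

```python
from datetime import datetime

def date_from_filename(name):
    """Extract the observation date from a GAPFILLED product file name,
    whose second-to-last dash-separated field is a YYYYMMDDhhmmss stamp."""
    stamp = name.split("-")[-2]
    return datetime.strptime(stamp, "%Y%m%d%H%M%S").date()

example = "ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-19910805000000-fv09.1r1.nc"
```

This is handy for building per-day file paths (YYYY/filename) when iterating over the record.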
Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude and time stamp), as well as the following data variables:
Additional information for each variable is given in the netCDF attributes.
Changes in v09.1r1 (previous version was v09.1):
These data can be read by any software that supports Climate and Forecast (CF) conform metadata standards for netCDF files, such as:
The following records are all part of the Soil Moisture Climate Data Records from satellites community.
ESA CCI SM MODELFREE Surface Soil Moisture Record: https://doi.org/10.48436/svr1r-27j77
The netCDF (network Common Data Form) file format is increasingly used to store and manage multidimensional scientific data. Although netCDF files offer multiple advanced features and functionality in their own right, workflows that involve netCDF files can be intimidating for new users due to their binary format. There are several methods to manage netCDF file data, including via libraries in programming languages such as Fortran or Python; however, these methods require knowledge of the programming language as a prerequisite. Other user-interface applications such as Panoply, NetCDF Explorer, or ArcGIS have functionality to access, view, and in some cases modify or create netCDF files. Another tool to manage netCDF files is the netCDF Operators (NCO), a set of command line tools built around the netCDF format, which itself was created by the Unidata program at the University Corporation for Atmospheric Research. NCO tools are highly optimized and flexible, allowing a myriad of netCDF workflows. This html-based tutorial aims to demystify basic functionalities and syntax of NCO commands that are useful for analysing netCDF scientific data. The tutorial contains multiple examples that focus on the analysis of scientific data (e.g. climatic measurements or model output), including code snippets, explanations, and figures. Specifically, part 1 covers basic concatenation and averaging of single and ensemble record variables using the ncrcat, ncecat, ncra, and ncea commands, respectively. Part 2 builds on part 1 and focuses on basic and advanced uses of the weighted-averaging command ncwa. Examples of other common NCO commands, including brief descriptions of how to download or install the package, and tools for netCDF visualization, are also included in the tutorial.
Although the tutorial is not exhaustive, as it does not explicitly cover all the NCO commands nor all of their options, it is a good starting point, since many other NCO commands follow similar syntax and conventions.
Attribution 3.0 (CC BY 3.0): https://creativecommons.org/licenses/by/3.0/
One of the clearest manifestations of ongoing global climate change is the dramatic retreat and thinning of the Arctic sea-ice cover. While all state-of-the-art climate models consistently reproduce the sign of these changes, they largely disagree on their magnitude, the reasons for which remain contentious. As such, consensual methods to reduce uncertainty in projections are lacking. Here, using the CMIP5 ensemble, we propose a process-oriented approach to revisit this issue. We show that inter-model differences in sea-ice loss and, more generally, in simulated sea-ice variability, can be traced to differences in the simulation of seasonal growth and melt. The way these processes are simulated is relatively independent of the complexity of the sea-ice model used, but rather a strong function of the background thickness. The larger role played by thermodynamic processes as sea ice thins further suggests the recent and projected reductions in sea-ice thickness induce a transition of the Arctic towards a state with enhanced volume seasonality but reduced interannual volume variability and persistence, before summer ice-free conditions eventually occur. These results prompt modelling groups to focus their priorities on the reduction of sea-ice thickness biases.
https://entrepot.recherche.data.gouv.fr/api/datasets/:persistentId/versions/1.0/customlicense?persistentId=doi:10.57745/P92JBL
Fully processed and cleaned-up netCDF files for Maison en Terre data, fully organized by theme for better replication and understanding of every community. In this ZIP file you will find:
- All files from 2018 to 2025, with one file per year of data (except 2018, which covers only 2018-11-16 to 2018-12-31, and 2025, which covers only 2025-01-01 to 2025-05-15).
- A global netCDF file with all years, from 2018 to 2025.
All netCDF files are organized following this list:
- Air quality: interior CO₂ (north and south), CO₂ (ppm) and micro-particles (µg/m³).
- Climatic parameters: interior temperature (north and south), perceived temperature and temperature (°C); interior humidity (north and south), absolute and relative humidity (%); pyranometer measurements (W/m², kWh/m²); azimuth and solar elevation; weather station temperature and sensor state (°C, boolean); front, left, right and maximum light intensity (lux); wind speed (m/s) and rain.
- Energetic performance: connectors, light, outlets and ventilation energy consumption; connectors, light, outlets and ventilation power (W).
- Maison en Terre structure: roof, south walls and walls temperature (°C).
All these netCDF files can be used for comparison with your own data, and are targeted towards students, researchers, and environmental office workers. These netCDF files are also a good alternative if you do not want to replicate the whole workflow and prefer to stick to already processed data.
https://ottawa.ca/en/city-hall/get-know-your-city/open-data#open-data-licence-version-2-0
This dataset contains netCDF files for the indices calculated in the report. Time series of each index (for each tridecade, year, season, or month) are provided for each grid cell and for each model.
Accuracy: Index-dependent caveats are detailed in the report.
Update Frequency: One-time upload (2020)
Obtained from: Findings obtained during the project.
Contact: Climate Change and Resiliency Unit
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
From http://northweb.hpl.umces.edu/LTRANS.htm. CHECK FOR UPDATES. NEWER VERSION MAY BE AVAILABLE.
PDF of original LTRANS_v.2b website [2016-09-14]
LTRANS v.2b Model Description
The Larval TRANSport Lagrangian model (LTRANS v.2b) is an off-line particle-tracking model that runs with the stored predictions of a 3D hydrodynamic model, specifically the Regional Ocean Modeling System (ROMS). Although LTRANS was built to simulate oyster larvae, it can easily be adapted to simulate passive particles and other planktonic organisms. LTRANS v.2 is written in Fortran 90 and is designed to track the trajectories of particles in three dimensions. It includes a 4th order Runge-Kutta scheme for particle advection and a random displacement model for vertical turbulent particle motion. Reflective boundary conditions, larval behavior, and settlement routines are also included. A brief description of the LTRANS particle-tracking model can be found here (68 KB .pdf). For more information on LTRANS and the application of LTRANS to oyster larvae transport, see a summary web page with animations, the publications North et al. (2008, 2011), and the LTRANS v.2 User's Guide. Please cite North et al. (2011) when referring to LTRANS v.2b. The updates that were made for LTRANS v.2b are listed here.
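The 4th order Runge-Kutta advection scheme mentioned above can be illustrated with a minimal 2D sketch. This is not the LTRANS Fortran code, just the numerical idea, for a velocity field given as a Python callable:

```python
def rk4_step(x, y, velocity, dt):
    """Advance one particle by one time step with classical 4th-order
    Runge-Kutta, given a velocity field velocity(x, y) -> (u, v)."""
    k1u, k1v = velocity(x, y)
    k2u, k2v = velocity(x + 0.5 * dt * k1u, y + 0.5 * dt * k1v)
    k3u, k3v = velocity(x + 0.5 * dt * k2u, y + 0.5 * dt * k2v)
    k4u, k4v = velocity(x + dt * k3u, y + dt * k3v)
    # Weighted average of the four slope estimates
    return (x + dt * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0,
            y + dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0)

# Example: a uniform flow of (1.0, 0.5) units per time step
new_x, new_y = rk4_step(0.0, 0.0, lambda x, y: (1.0, 0.5), 2.0)
```

In LTRANS itself this deterministic step operates in three dimensions on stored ROMS velocity fields and is combined with a random displacement model for vertical turbulent motion.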
The Lagrangian TRANSport (LTRANS v.2b) model is based upon LTRANS v.1 (formerly the Larval TRANSport Lagrangian model). Ian Mitchell made the bug fixes in LTRANS v.2b. Zachary Schlag completed significant updates to the code in LTRANS v.2 with input from Elizabeth North, Chris Sherwood, and Scott Peckham. LTRANS v.1 was built by Elizabeth North and Zachary Schlag of the University of Maryland Center for Environmental Science Horn Point Laboratory. Funding was provided by the National Science Foundation Biological and Physical Oceanography Programs, Maryland Department of Natural Resources, NOAA Chesapeake Bay Studies, NOAA Maryland Sea Grant College Program, and the NOAA-funded UMCP Advanced Study Institute for the Environment.
A beta version of LTRANS v2b which uses predictions from the circulation model ADCIRC is available here.
LTRANS Code
LTRANS v.2b Open Source Code. We would appreciate knowing who is using LTRANS. If you would like to share this information with us, please send us your name, contact information, and a brief description of how you plan to use LTRANS to enorth@umces.edu. To refer to LTRANS in a peer-reviewed publication, please cite the publication(s) listed in the Description section above.
License file. This license was based on the ROMS license. Please note that this license applies to all sections of LTRANS v.2b except those listed in the 'External Dependencies and Programs' section below.
LTRANS v.2b Code. This zip file contains the LTRANS code, license, and User's Guide. Section II of the LTRANS v.2 User's Guide contains instructions for setting up and running LTRANS v.2b in Linux and Windows environments. Before using LTRANS v.2b, please read the External Dependencies and Programs section below. This version of LTRANS is parameterized to run with the input files that are available in the LTRANS v.2b Example Input Files section below. This section also contains a tar ball with this code and the example input files.
External Dependencies and Programs. LTRANS v.2b requires NetCDF libraries and uses the following programs to calculate random numbers (Mersenne Twister) and fit tension splines (TSPACK). Because LTRANS v.2 reads in ROMS-generated NetCDF (.nc) files, it requires that the appropriate NetCDF libraries be installed on your computer (see files and links below). Also, please note that although the Mersenne Twister and TSPACK programs are included in LTRANS v.2b in Random_module.f90 and Tension_module.f90, respectively, they do not share the same license file as LTRANS v.2b. Please review and respect their permissions (links and instructions provided below).
Windows Visual Fortran NetCDF libraries. These NetCDF files, which are compatible with Visual Fortran, were downloaded from the Unidata NetCDF Binaries Website for LTRANS v.1. The NetCDF 90 files were downloaded from the Building the F90 API for Windows for the Intel ifort compiler website. The VF-NetCDF.zip folder contains a README.txt that describes where to place the enclosed files. If these files do not work, you may have to download updated versions or build your own by following the instructions at the UCAR Unidata NetCDF website.
Linux NetCDF libraries. Linux users will likely have to build their own Fortran 90 libraries using the source code/binaries that are available on the UCAR Unidata NetCDF website.
Mersenne Twister random number generator. This program was recoded into F90 and included in the Random_module.f90 in LTRANS. See the Mersenne Twister Home Page for more information about this open source program. If you plan to use this program in LTRANS, please send an email to: m-mat @ math.sci.hiroshima-u.ac.jp (remove space) to inform the developers as a courtesy.
TSPACK: tension spline curve-fitting package. This program (ACM TOMS Algorithm 716) was created by Robert J. Renka and is used in LTRANS as part of the water column profile interpolation technique. The original TSPACK code can be found at the link to the left and is copyrighted by the Association for Computing Machinery (ACM). With the permission of Dr. Renka and ACM, TSPACK was modified for use in LTRANS by removing unused code and call variables and updating it to Fortran 90. The modified version of TSPACK is included in the LTRANS source code in the Tension Spline Module (tension_module.f90). If you would like to use LTRANS with the modified TSPACK software, please read and respect the ACM Software Copyright and License Agreement. For noncommercial use, ACM grants "a royalty-free, nonexclusive right to execute, copy, modify and distribute both the binary and source code solely for academic, research and other similar noncommercial uses" subject to the conditions noted in the license agreement. Note that if you plan commercial use of LTRANS with the modified TSPACK software, you must contact ACM at permissions@acm.org to arrange an appropriate license, which may require payment of a license fee for commercial use.
LTRANS v.2b Example Input Files. These files can be used to test LTRANS v.2b. They include examples of particle location and habitat polygon input files (.csv) and ROMS grid and history files (.nc) that are needed to run LTRANS v.2b. Many thanks to Wen Long for sharing the ROMS .nc files. The LTRANS v.2b code above is configured to run with these input files. Note: please download the tar ball (LTRANSv2.tgz) and history files (clippped_macroms_his_*.nc) between the hours of 5 pm and 6 am Eastern Standard Time because of their large size.
https://entrepot.recherche.data.gouv.fr/api/datasets/:persistentId/versions/1.1/customlicense?persistentId=doi:10.57745/S5EQYN
Fully processed and cleaned-up netCDF files for BBC room data, fully organized by theme for better replication and understanding of every community. All 39 included measurements are organized by theme. In the ZIP file you can find:
- netCDF files of BBC room data, separated by year: one file each for 2019, 2020, 2021, and so on through 2024.
- A global netCDF file with all years, from 2019 to 2024.
All netCDF files are organized following this list:
- Office environment: ambient values (temperature, humidity and CO2); indicators (presence, ventilation, VOC and heating); doors and windows (opened or closed, only starting in 2020).
- Specialized office values: relative temperature, humidity and CO2 concentration.
- Specialized sensors: thermocouple sensors (TC1-4, temperature); Triosys 1-2 measuring temperature and Triosys 1 measuring humidity.
- Solar measurements: adjustable sunshade (south, southeast, east, southwest and west), starting in 2020; pyranometer energy and irradiance.
- Illuminance: north and south.
- Energy: general, heating, kitchen, main and outlets meter.
All these netCDF files can be used for comparison with your own data, and are targeted towards students, researchers, and environmental office workers. These netCDF files are also a good alternative if you do not want to replicate the whole workflow and prefer to stick to already processed data.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
Data Summary: US states grid mask file and NOAA climate regions grid mask file, both compatible with the 12US1 modeling grid domain. Note: The datasets are on a Google Drive. The metadata associated with this DOI contain the link to the Google Drive folder and instructions for downloading the data. These files can be used with CMAQ-ISAMv5.3 to track state- or region-specific emissions. See Chapter 11 and Appendix B.4 in the CMAQ User's Guide for further information on how to use the ISAM control file with GRIDMASK files. The files can also be used for state- or region-specific scaling of emissions using the CMAQv5.3 DESID module. See the DESID Tutorial and Appendix B.4 in the CMAQ User's Guide for further information on how to use the Emission Control File to scale emissions in predetermined geographical areas. File Location and Download Instructions: Link to GRIDMASK files. Link to README text file with information on how these files were created. File Format: The grid masks are stored as netCDF formatted files using I/O API data structures (https://www.cmascenter.org/ioapi/). Information on the model projection and grid structure is contained in the header information of the netCDF file. The output files can be opened and manipulated using I/O API utilities (e.g. M3XTRACT, M3WNDW) or other software programs that can read and write netCDF formatted files (e.g. Fortran, R, Python). File descriptions: These GRIDMASK files can be used with the 12US1 modeling grid domain (grid origin x = -2556000 m, y = -1728000 m; N columns = 459, N rows = 299). GRIDMASK_STATES_12US1.nc - This file contains 49 variables for the 48 states in the conterminous U.S. plus DC. Each state variable (e.g., AL, AZ, AR, etc.) is a 2D array (299 x 459) providing the fractional area of each grid cell that falls within that state. GRIDMASK_CLIMATE_REGIONS_12US1.nc - This file contains 9 variables for 9 NOAA climate regions based on the Karl and Koss (1984) definition of climate regions.
Each climate region variable (e.g., CLIMATE_REGION_1, CLIMATE_REGION_2, etc.) is a 2D array (299 x 459) providing the fractional area of each grid cell that falls within that climate region. NOAA climate regions:
CLIMATE_REGION_1: Northwest (OR, WA, ID)
CLIMATE_REGION_2: West (CA, NV)
CLIMATE_REGION_3: West North Central (MT, WY, ND, SD, NE)
CLIMATE_REGION_4: Southwest (UT, AZ, NM, CO)
CLIMATE_REGION_5: South (KS, OK, TX, LA, AR, MS)
CLIMATE_REGION_6: Central (MO, IL, IN, KY, TN, OH, WV)
CLIMATE_REGION_7: East North Central (MN, IA, WI, MI)
CLIMATE_REGION_8: Northeast (MD, DE, NJ, PA, NY, CT, RI, MA, VT, NH, ME) + Washington, D.C.*
CLIMATE_REGION_9: Southeast (VA, NC, SC, GA, AL, FL)
*Note that Washington, D.C. is not included in any of the climate regions on the website but was included with the "Northeast" region for the generation of this GRIDMASK file.
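Because each GRIDMASK variable stores the fractional coverage of every grid cell, averaging any gridded field over a state or region reduces to a weighted mean. A minimal pure-Python sketch on toy arrays (not the full 299 x 459 grid):

```python
def masked_mean(field, mask):
    """Fractional-area weighted mean of a gridded field over a region,
    using a GRIDMASK-style array of per-cell coverage fractions."""
    num = sum(f * m for row_f, row_m in zip(field, mask)
                    for f, m in zip(row_f, row_m))
    den = sum(m for row in mask for m in row)
    return num / den

# Toy 2 x 2 example: one cell fully in the region, one half in
field = [[1.0, 2.0], [3.0, 4.0]]
mask = [[1.0, 0.5], [0.0, 0.0]]
region_avg = masked_mean(field, mask)
```

The same weighting is what ISAM and DESID effectively apply when attributing or scaling emissions by region.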
Errata: Due to a coding error, monthly files with "dma8epax" statistics were wrongly aggregated. This concerns all gridded files of this metric as well as the monthly aggregated csv files. All erroneous files were replaced with corrected versions on Jan 16th, 2018. Each updated file contains a version label "1.1" and a brief description of the error. If you have made use of previous TOAR data files with the "dma8epax" metric, please replace your data files.
Attribution 3.0 (CC BY 3.0): https://creativecommons.org/licenses/by/3.0/
The Northern Circumpolar Soil Carbon Database version 2 (NCSCDv2) is a geospatial database created for the purpose of quantifying storage of organic carbon in soils of the northern circumpolar permafrost region down to a depth of 300 cm. The NCSCDv2 is based on polygons from different regional soils maps homogenized to the U.S. Soil Taxonomy. The NCSCDv2 contains information on fractions of coverage of different soil types (following U.S. Soil Taxonomy nomenclature) as well as estimated storage of soil organic carbon (kg/m2) between 0-30 cm, 0-100 cm, 100-200 cm and 200-300 cm depth. The database was compiled by combining and homogenizing several regional/national soil maps. To calculate storage of soil organic carbon, these soil maps have been linked to field-data on soil organic carbon storage from sites with circumpolar coverage. More information on database processing and properties can be found in the product guide. Citation
Attribution 1.0 (CC BY 1.0): https://creativecommons.org/licenses/by/1.0/
Overview
Zip file contains two netCDF files with a subset of data from the "No Change 2" (NC2) experiment conducted by Savi et al., 2020 and published in Earth Surface Dynamics (https://doi.org/10.5194/esurf-8-303-2020), with the original data available via the Sediment Experimentalists Network Project Space SEAD Internal Repository (https://doi.org/10.26009/s0ZOQ0S6). Topographic scan data were re-formatted into the netCDF file "T_NC2_scans.nc", and overhead imagery was extracted from the video of the experiment approximately once every minute of experimental time, with the RGB band data provided in the formatted netCDF file "T_NC2_images.nc". These data were formatted into netCDF files for easy loading into the "deltametrics" analysis toolbox.
Additional Details
Re-packaging the scan data from the .tif files was straightforward. From the metadata spreadsheet, we know the times at which the scans were taken (and can eliminate the redundant scan). From the paper itself we know that the resolution of the topographic scans is 1 mm in the horizontal and vertical. We also know from the paper the input discharges of both water and sediment through both the main channel and the tributary. We provide these values as metadata in the netCDF files. The scans form the 'eta' field representing the topography in the file. The packaged netCDF file is called 'T_NC2_scans.nc'.
Overhead imagery from the T_NC2_Complete21fps.wmv video file was extracted using the following command:
ffmpeg -i T_NC2_Complete21fps.wmv -r 21 T_NC2_frames/%04d.png
This command uses the ffmpeg tool to extract frames at a rate of 21 frames per second (-r 21), since the file name implies that is the rate at which the overhead photos were combined into a video. The NC designation indicates that this experiment was performed with no change in the input conditions in either the main or tributary channels.
The experiment ran for a total of 480 minutes. A total of 1466 images were obtained from the ffmpeg extraction. This translates to an image approximately every 20 seconds of real time (480 minutes / 1466 frames * 60 seconds/minute = 19.6453 seconds / frame). We sample every 3rd frame, which gives us images roughly once a minute (489 frames in all), to create the subset of data re-packaged as a netCDF file for deltametrics. Dimensions for the pixels were approximated based on our knowledge of the topographic scan resolution. Assuming the extents of the scans and overhead images are the same (although they are not), we calculate the number of millimeters per pixel in the x and y directions for the overhead images. We assume the pixels are more likely to be square than rectangular, so we average these values and assign this as the distance per pixel in both the x and y dimensions for these data.
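The frame bookkeeping above is easy to verify with the numbers taken from the text:

```python
total_minutes = 480          # experiment duration
total_frames = 1466          # frames obtained from the ffmpeg extraction
seconds_per_frame = total_minutes * 60 / total_frames  # real time per frame
kept = len(range(0, total_frames, 3))                  # keep every 3rd frame
```

Evaluating these gives roughly 19.65 seconds of real time per extracted frame and 489 retained frames, matching the subset described above.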
References
Savi, Sara, et al. "Interactions between main channels and tributary alluvial fans: channel adjustments and sediment-signal propagation." Earth Surface Dynamics 8.2 (2020): 303-322.
Physical experiments on interactions between main-channels and tributary alluvial fans
S. Savi, Tofelde, A. Wickert, A. Bufe, T. Schildgen, and M. Strecker
https://doi.org/10.26009/s0ZOQ0S6
CTD profile (Network Common Data Format (NetCDF) files) station ANTARES - Service d'Observation en Milieu Littoral (SOMLIT). Conductivity, Temperature, Depth (CTD) profile Data for station ANTARES _NCProperties=version=1|netcdflibversion=4.6.0|hdf5libversion=1.10.0 cdm_data_type=Profile cdm_profile_variables=depth,latitude,stationname,longitude comment=Seabird CTD Measurements : NetCDF files contact=Christian Grenz (christian.grenz@mio.osupytheas.fr) contributor_email=maurice.libes@osupytheas.fr contributor_institution=MIO UMR 7294 CNRS / OSU Pytheas UMS 3470 CNRS contributor_name=OSU Pytheas UMS 3470 CNRS contributor_role=data formatting in netCDF contributor_type=institution, contributor_url=http://www.osupytheas.fr Conventions=CF-1.6, ACDD-1.3, COARDS description=CTD profile (NetCDF files) for station ANTARES Easternmost_Easting=6.14 featureType=Profile geospatial_lat_max=42.86 geospatial_lat_min=42.48 geospatial_lat_units=degrees_north geospatial_lon_max=6.14 geospatial_lon_min=6.04 geospatial_lon_units=degrees_east geospatial_vertical_max=-0.9 geospatial_vertical_min=-2505.0 geospatial_vertical_positive=down geospatial_vertical_units=m history=Created 01/04/20 infoUrl=http://www.mio.osupytheas.fr institution=MIO UMR 7294 CNRS / OSU Pytheas UMS 3470 CNRS keywords_vocabulary=GCMD Science Keywords Northernmost_Northing=42.86 processing_level=1a production=MIO UMR 7294 CNRS / OSU Pytheas UMS3470 CNRS program=SOMLIT Service d’Observation en Milieu Littoral project=SOMLIT references=https://www.mio.osupytheas.fr/ source=water column measurements observation sourceUrl=(local files) Southernmost_Northing=42.48 standard_name_vocabulary=CF Standard Name Table v55 subsetVariables=time, stationname, latitude, longitude, oxygenmol testOutOfDate=now-49days time_coverage_end=2020-03-08T10:10:00Z time_coverage_start=2000-01-01T23:50:00Z Westernmost_Easting=6.04
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
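Flattened attribute runs like the ANTARES record above can be restored to key/value pairs programmatically. The sketch below is plain Python; the `parse_flat_attrs` helper and the shortened sample string are illustrative assumptions, not part of any catalog API. It relies on NetCDF/ACDD attribute names following the usual identifier pattern (letters, digits, underscores) and being immediately followed by `=`; values that themselves embed `name=` fragments (e.g. `_NCProperties`) would need special handling.

```python
import re

def parse_flat_attrs(text):
    """Split a flattened 'name=value name=value ...' NetCDF attribute
    dump into a dict. Attribute names are assumed to match the usual
    NetCDF/ACDD identifier pattern and to be followed directly by '='."""
    # Find every 'name=' token; everything up to the next token is the value.
    tokens = list(re.finditer(r'(?:^|\s)(_?[A-Za-z][A-Za-z0-9_]*)=', text))
    attrs = {}
    for i, m in enumerate(tokens):
        start = m.end()
        end = tokens[i + 1].start() if i + 1 < len(tokens) else len(text)
        attrs[m.group(1)] = text[start:end].strip()
    return attrs

# Shortened sample taken from the ANTARES record above
sample = ("cdm_data_type=Profile featureType=Profile "
          "geospatial_lat_max=42.86 geospatial_lat_min=42.48 "
          "processing_level=1a")
print(parse_flat_attrs(sample))
```

With well-formed input this recovers one entry per attribute, e.g. `geospatial_lat_max` maps to `'42.86'`; all values stay strings, since NetCDF attribute types are lost in the flattened dump.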
Atmospheric data associated with "Climate responses to the splitting of a supercontinent: Implications for the breakup of Pangea" by Tabor et al. (2019; GRL). Files contain climatologies of FLNS, FLNSC, FLNT, FLNTC, FSNS, FSNSC, FSNT, FSNTC, LHFLX, LWCF, OMEGA, PRECC, PRECL, PS, SHFLX, SWCF, TS, U, and V variables for the One Continent and Two Continents geographic configurations discussed in the manuscript.
CLAPPP Project: Conductivity, Temperature, Depth (CTD) profile (Network Common Data Format (NetCDF) files), station voh1-6. The New Caledonia lagoons show high seasonal and interannual variability (related to El Niño - Southern Oscillation (ENSO) variability). They present a great diversity of local situations linked to differences in their geomorphology, to the nature of terrigenous inputs, and to varied anthropogenic pressure.
Global attributes:
_NCProperties=version=2,netcdf=4.7.3,hdf5=1.10.4
cdm_data_type=Profile
cdm_profile_variables=stationname,depth
comment=Seabird CTD SBE19plus Measurements : NetCDF files
contact=Martine Rodier (martine.rodier@mio.osupytheas.fr)
contributor_email=maurice.libes@osupytheas.fr
contributor_institution=MIO UMR 7294 CNRS / OSU Pytheas UMS 3470 CNRS
contributor_name=OSU Pytheas UMS 3470 CNRS
contributor_role=data formatting in netCDF
contributor_type=institution
contributor_url=http://www.osupytheas.fr
Conventions=CF-1.6, ACDD-1.3, COARDS
defaultGraphQuery=latitude%2Clongitude%2Ctemperature&time>2008-01-01T&time<2019-01-01T&.draw=markers&.marker=7|5
description=CTD profile (NetCDF files) for station ./Clappp5//voh-6
doi=https://doi.org/10.34930/2b52defe-e5f3-4fe2-9f2f-741d90e624ea
Easternmost_Easting=167.04
featureType=Profile
geospatial_lat_max=-20.3
geospatial_lat_min=-22.4
geospatial_lat_units=degrees_north
geospatial_lon_max=167.04
geospatial_lon_min=164.09
geospatial_lon_units=degrees_east
geospatial_vertical_max=-0.1
geospatial_vertical_min=-77.1
geospatial_vertical_positive=down
geospatial_vertical_units=m
history=Created 02/06/21
infoUrl=https://dataset.osupytheas.fr/geonetwork/srv/fre/catalog.search#/metadata/2b52defe-e5f3-4fe2-9f2f-741d90e624ea
institution=MIO UMR7294 CNRS - OSU Pytheas
keywords_vocabulary=GCMD Science Keywords
Northernmost_Northing=-20.3
processing_level=1a
production=MIO UMR 7294 CNRS / OSU Pytheas UMS3470 CNRS
program=CLAPPP New Caledonian lagoons: Physics and Phytoplankton processes
project=Campagne Clappp5 : New Caledonian lagoons: Physics and Phytoplankton processes
references=https://www.mio.osupytheas.fr/
source=water column measurements observation by Seabird SBE19plus CTD Measurements
sourceUrl=(local files)
Southernmost_Northing=-22.4
standard_name_vocabulary=CF Standard Name Table v70
subsetVariables=time, stationname, latitude, longitude
time_coverage_end=2010-11-24T08:29:00Z
time_coverage_start=2008-11-11T09:33:00Z
Westernmost_Easting=164.09
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This resource is used for testing Hyrax server data downloads.
No description is available. Visit https://dataone.org/datasets/38d89b6d1ac13884f7d34552f061fd28 for complete metadata about this dataset.