No description was included in this Dataset collected from the OSF.
The Terra Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Water Bodies Database (ASTWBD) Version 1 data product provides global coverage of water bodies larger than 0.2 square kilometers at a spatial resolution of 1 arc second (approximately 30 meters at the equator), along with associated elevation information. The ASTWBD data product was created in conjunction with the ASTER Global Digital Elevation Model (ASTER GDEM) Version 3 data product by the Sensor Information Laboratory Corporation (SILC) in Tokyo. The ASTER GDEM Version 3 data product was generated using ASTER Level 1A scenes acquired between March 1, 2000, and November 30, 2013. The ASTWBD data product was then generated to correct elevation values of water body surfaces.
To generate the ASTWBD data product, water bodies were separated from land areas and then classified into three categories: ocean, river, or lake. Oceans and lakes have a flattened, constant elevation value. The effects of sea ice were manually removed from areas classified as ocean to better delineate ocean shorelines at high latitudes. For lakes, the elevation of each lake was calculated from the perimeter elevation data using the mosaic image that covers the entire area of the lake. Rivers presented a unique challenge given that their elevations step down gradually from upstream to downstream; therefore, visual inspection and other manual detection methods were required.
The geographic coverage of the ASTWBD extends from 83°N to 83°S. Each tile is distributed in GeoTIFF format and referenced to the 1984 World Geodetic System (WGS84)/1996 Earth Gravitational Model (EGM96) geoid. Each data product is provided as a zipped file that contains an attribute file with the water body classification information and a DEM file, which provides elevation information in meters.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains information on the Surface Soil Moisture (SM) content derived from satellite observations in the microwave domain.
A description of this dataset, including the methodology and validation results, is available at:
Preimesberger, W., Stradiotti, P., and Dorigo, W.: ESA CCI Soil Moisture GAPFILLED: an independent global gap-free satellite climate data record with uncertainty estimates, Earth Syst. Sci. Data, 17, 4305–4329, https://doi.org/10.5194/essd-17-4305-2025, 2025.
ESA CCI Soil Moisture is a multi-satellite climate data record that consists of harmonized, daily observations coming from 19 satellites (as of v09.1) operating in the microwave domain. The wealth of satellite information, particularly over the last decade, facilitates the creation of a data record with the highest possible data consistency and coverage.
However, data gaps are still found in the record. This is particularly notable in earlier periods when a limited number of satellites were in operation, but can also arise from various retrieval issues, such as frozen soils, dense vegetation, and radio frequency interference (RFI). These data gaps present a challenge for many users, as they have the potential to obscure relevant events within a study area or are incompatible with (machine learning) software that often relies on gap-free inputs.
Since the requirement of a gap-free ESA CCI SM product was identified, various studies have demonstrated the suitability of different statistical methods to achieve this goal. A fundamental feature of such a gap-filling method is that it relies only on the original observational record, without the need for ancillary variables or model-based information. Because of this intrinsic challenge, no global, long-term, univariate gap-filled product was available until now. In this version of the record, data gaps due to missing satellite overpasses and invalid measurements are filled using the Discrete Cosine Transform (DCT) Penalized Least Squares (PLS) algorithm (Garcia, 2010). A linear interpolation is applied over periods of (potentially) frozen soils with little to no variability in (frozen) soil moisture content. Uncertainty estimates are based on models calibrated in experiments to fill satellite-like gaps introduced to GLDAS Noah reanalysis soil moisture (Rodell et al., 2004), and consider the gap size and local vegetation conditions as parameters that affect the gap-filling performance.
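The DCT-PLS idea can be illustrated in one dimension. This is a sketch in the spirit of Garcia (2010), not the operational ESA CCI code: the smoothing parameter `s`, the fixed iteration count, and the plain-mean initialization are simplifications for illustration.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix C (so the inverse transform is C.T)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (i + 0.5) * k / n)
    c[0] /= np.sqrt(2.0)
    return c

def dct_pls_fill(y, mask, s=1.0, n_iter=1000):
    """Fill gaps (mask == False) in a 1-D series by iterating
    z <- IDCT(Gamma * DCT(w*(y - z) + z)), where Gamma = 1/(1 + s*Lambda^2)
    penalizes rough (high-frequency) components."""
    n = y.size
    c = dct_matrix(n)
    lam = -2.0 + 2.0 * np.cos(np.arange(n) * np.pi / n)
    gamma = 1.0 / (1.0 + s * lam ** 2)
    yf = np.where(mask, y, 0.0)        # gap values never enter (weight 0)
    w = mask.astype(float)
    z = np.full(n, yf[mask].mean())    # initialize gaps with the observed mean
    for _ in range(n_iter):
        z = c.T @ (gamma * (c @ (w * (yf - z) + z)))
    return z
```

At convergence, observed points are reproduced (up to a small smoothing bias) while gaps are filled with the smoothest series consistent with the surrounding data.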
You can use command line tools such as wget or curl to download (and extract) data for multiple years. The following command will download and extract the complete data set to the local directory ~/Downloads on Linux or macOS systems.
#!/bin/bash
# Set download directory
DOWNLOAD_DIR=~/Downloads
base_url="https://researchdata.tuwien.at/records/3fcxr-cde10/files"
# Loop through years 1991 to 2023 and download & extract data
for year in {1991..2023}; do
    echo "Downloading $year.zip..."
    wget -q -P "$DOWNLOAD_DIR" "$base_url/$year.zip"
    unzip -o "$DOWNLOAD_DIR/$year.zip" -d "$DOWNLOAD_DIR"
    rm "$DOWNLOAD_DIR/$year.zip"
done
The dataset provides global daily estimates for the 1991-2023 period at 0.25° (~25 km) horizontal grid resolution. Daily images are grouped by year (YYYY), each subdirectory containing one netCDF image file for a specific day (DD), month (MM) in a 2-dimensional (longitude, latitude) grid system (CRS: WGS84). The file name has the following convention:
ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-YYYYMMDD000000-fv09.1r1.nc
Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude and time stamp), as well as the following data variables:
Additional information for each variable is given in the netCDF attributes.
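As a quick sanity check of the naming convention, the expected file name for a given day can be built programmatically and then opened with any CF-aware netCDF reader (the date below is an arbitrary example, and the file is assumed to sit in its YYYY subdirectory):

```python
from datetime import date

def gapfilled_filename(day: date) -> str:
    """File name for a given day, following the stated convention."""
    return ("ESACCI-SOILMOISTURE-L3S-SSMV-COMBINED_GAPFILLED-"
            f"{day:%Y%m%d}000000-fv09.1r1.nc")

# Files sit in per-year folders; a CF-aware reader can then open them, e.g.:
# import xarray as xr
# ds = xr.open_dataset(f"2020/{gapfilled_filename(date(2020, 1, 1))}")
```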
Changes in v09.1r1 (previous version was v09.1):
These data can be read by any software that supports Climate and Forecast (CF) conform metadata standards for netCDF files, such as:
The following records are all part of the ESA CCI Soil Moisture science data records community:
- ESA CCI SM MODELFREE Surface Soil Moisture Record: https://doi.org/10.48436/svr1r-27j77
Open Data Licence Version 2.0: https://ottawa.ca/en/city-hall/get-know-your-city/open-data#open-data-licence-version-2-0
This dataset contains netCDF files for the indices calculated in the report. Time series of the index (for each tridecade, year, season, or month) are provided for each grid cell and for each model.
Accuracy: Index-dependent caveats are detailed in the report.
Update Frequency: One-time upload (2020)
Obtained from: Findings obtained during the project.
Contact: Climate Change and Resiliency Unit
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Title:
Flume Experiment Dataset – Granular Flow Tests (2023)
Authors:
I. Koa, A. Recking, F. Gimbert, H. Bellot, G. Chambon, T. Faug
Contact:
islamkoaa111@gmail.com
Description:
This dataset contains NetCDF (.nc) files from controlled flume experiments conducted in 2023 to study the transition from bedload to complex granular flow dynamics on steep slopes. Each file name encodes the experiment date and test number (e.g., CanalMU-20-04-2023-test5.nc = Test 5 on April 20, 2023).
Each test corresponds to a specific discharge (Q) value, detailed in the table below.
Discharge Table:
Discharge (l/s) | Date | Test Number
----------------|-------------|-------------
0.14 | 06-04-2023 | Test 3
0.14 | 04-05-2023 | Test 5
0.15 | 13-04-2023 | Test 3
0.15 | 14-04-2023 | Test 1
0.15 | 14-04-2023 | Test 2
0.16 | 17-04-2023 | Test 2
0.16 | 18-04-2023 | Test 3
0.16 | 04-05-2023 | Test 3
0.16 | 04-05-2023 | Test 4
0.17 | 18-04-2023 | Test 4
0.17 | 18-04-2023 | Test 5
0.17 | 20-04-2023 | Test 2
0.17 | 20-04-2023 | Test 4
0.17 | 20-04-2023 | Test 5
0.18 | 20-04-2023 | Test 8
0.18 | 20-04-2023 | Test 9
0.19 | 20-04-2023 | Test 10
0.19 | 20-04-2023 | Test 11
0.20 | 20-04-2023 | Test 12
0.20 | 04-05-2023 | Test 1
0.20 | 04-05-2023 | Test 2
0.21 | 20-04-2023 | Test 13
0.21 | 21-04-2023 | Test 1
0.21 | 21-04-2023 | Test 2
0.22 | 21-04-2023 | Test 3
0.22 | 21-04-2023 | Test 4
0.23 | 21-04-2023 | Test 5
0.23 | 27-04-2023 | Test 2
0.23 | 27-04-2023 | Test 3
0.23 | 28-04-2023 | Test 7
0.24 | 28-04-2023 | Test 1
0.24 | 28-04-2023 | Test 2
0.24 | 28-04-2023 | Test 3
0.25 | 28-04-2023 | Test 4
0.25 | 21-06-2023 | Test 1
0.26 | 28-04-2023 | Test 6
0.26 | 21-06-2023 | Test 3
0.26 | 21-06-2023 | Test 4
0.27 | 22-06-2023 | Test 2
0.27 | 22-06-2023 | Test 3
0.27 | 22-06-2023 | Test 1
Data Acquisition and Processing:
The original data were acquired using LabVIEW and saved in TDMS (.tdms) format. These files were processed using custom Python scripts to extract synchronized time-series data, assign physical units, and store the results in structured NetCDF-4 files.
NetCDF File Structure:
Each file includes the following structured groups and variables:
1. Group: Data_Hydro (Hydraulic Measurements)
- Time_Hydro: Time [s]
- Date_et_heure_mesure: Measurement timestamps [string]
- Etat_de_l'interrupteur: Switch state [V]
- Debit_liquide_instant: Instantaneous water discharge [L/s]
- Debit_liquide_consigne: Target water discharge [L/s]
- Vitesse_tapis_instant: Instantaneous conveyor speed [m/s]
- Vitesse_tapis_consigne: Set conveyor speed [V]
- Debit_solide_instant: Instantaneous solid discharge [g/s]
- Hauteur1–4: Water heights from four sensors [cm]
2. Group: Data_Force (Impact Force Measurements)
- Time_Force: Time [s]
- Force_Normale: Vertical impact force [N]
- Force_Tangentielle: Tangential force [N]
3. Group: Data_Annexe (Experimental Metadata)
- channel_width, Channel_slope: Flume geometry
- Position_capteur_hauteur1–4: Water sensor locations [m]
- Position_capteur_force: Force sensor position [m]
- Plaque dimensions and mass: Plate size and weight [m, kg]
- Sensor frequencies and sensitivities [Hz, pC/N]
Format:
NetCDF-4 (.nc)
Suggested software for reading:
- Python (xarray, netCDF4)
- NASA Panoply
- MATLAB
Note:
The data were processed using custom Python scripts. These are available from the corresponding author upon request.
Example: Accessing NetCDF Data in Python
The dataset can be read using the `netCDF4` or `xarray` libraries in Python. Below is a simple example using netCDF4:
```python
from netCDF4 import Dataset
import numpy as np
# Open netCDF file
data = Dataset('CanalMU-20-04-2023-test5.nc')
# Load hydraulic data
thydro = data.groups['Data_Hydro'].variables['Time_Hydro'][:]
Qcons = data.groups['Data_Hydro'].variables['Debit_liquide_consigne'][:]
Qins = data.groups['Data_Hydro'].variables['Debit_liquide_instant'][:]
Tapis = data.groups['Data_Hydro'].variables['Vitesse_tapis_consigne'][:]
h1 = data.groups['Data_Hydro'].variables['Hauteur1'][:]
h2 = data.groups['Data_Hydro'].variables['Hauteur2'][:]
h3 = data.groups['Data_Hydro'].variables['Hauteur3'][:]
h4 = data.groups['Data_Hydro'].variables['Hauteur4'][:]
# Load force data
tforce = data.groups['Data_Force'].variables['Time_Force'][:]
FN = data.groups['Data_Force'].variables['Force_Normale'][:]
FT = data.groups['Data_Force'].variables['Force_Tangentielle'][:]
# Forces are stored in physical units [N]; no additional calibration
# factors are applied here
# Fetch metadata
slope = data.groups['Data_Annexe'].variables['Channel_slope']
alpha = np.arctan(slope[:]/100)
L = data.groups['Data_Annexe'].variables['Longueur_plaque_impact'][:]
W = data.groups['Data_Annexe'].variables['Largeur_plaque_impact'][:]
# Close the file after reading
data.close()
```
For more advanced processing, consider using `xarray`, which provides easier multi-dimensional data access.
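For instance (a sketch, not part of the dataset's own scripts), the `group=` keyword of `xarray.open_dataset` selects one netCDF-4 group directly, and the slope-to-angle conversion used in the example above works the same way on plain floats:

```python
import math

# Hypothetical usage, assuming xarray is installed and the file is present:
# import xarray as xr
# hydro = xr.open_dataset("CanalMU-20-04-2023-test5.nc", group="Data_Hydro")
# force = xr.open_dataset("CanalMU-20-04-2023-test5.nc", group="Data_Force")

def slope_angle_rad(slope_percent):
    """Channel slope in percent -> bed angle in radians (as in the arctan above)."""
    return math.atan(slope_percent / 100.0)
```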
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The NetCDF4 files in this repository were converted from their original source formats using GMTED2010-netcdf scripts.
Lemoine, F. G., S. C. Kenyon, J. K. Factor, R.G. Trimmer, N. K. Pavlis, D. S. Chinn, C. M. Cox, S. M. Klosko, S. B. Luthcke, M. H. Torrence, Y. M. Wang, R. G. Williamson, E. C. Pavlis, R. H. Rapp and T. R. Olson (1998). The Development of the Joint NASA GSFC and the National Imagery and Mapping Agency (NIMA) Geopotential Model EGM96. NASA/TP-1998-206861, July 1998. https://ntrs.nasa.gov/citations/19980218814
Pavlis, N. K., Holmes, S. A., Kenyon, S. C., & Factor, J. K. (2012). The development and evaluation of the Earth Gravitational Model 2008 (EGM2008). Journal of Geophysical Research: Solid Earth, 117(B4), 2011JB008916. https://doi.org/10.1029/2011JB008916
Danielson, J. J. and D. B. Gesch (2011). Global multi-resolution terrain elevation data 2010 (GMTED2010). U.S. Geologic Survey, Open-File Report 2011-1073, https://doi.org/10.3133/ofr20111073
Function name: cdf2mat
Please use this function to open MS-based chromatographic data from netCDF (*.CDF) files. Resampling is included for non-integer acquisition rates. Outputs nominal mass. The script is optimized to process data from comprehensive two-dimensional gas chromatography coupled to mass spectrometry (GCxGC-MS). Updated to remove negative noise signal.
INPUT
file: the netCDF file to open, e.g. 'Sample01.CDF'
rate_MS: desired integer acquisition rate
OUTPUT
TIC: total ion chromatogram
FullMS: full MS chromatogram (second-order data tensor)
axis_min: retention time axis in minutes
axis_mz: m/z axis in Daltons
I/O: [TIC,FullMS,axis_min,axis_mz] = cdf2mat(file,rate_MS)
Compiled with MATLAB R2021b (v9.11.0.1809720). Requires the Signal Processing Toolbox (v9.0). Based on netCDFload.m (Murphy, Wenig, Parcsi, Skov and Stuetz) and on iCDF_load (Skov and Bro 2008).
K.R. Murphy, P. Wenig, G. Parcsi, T. Skov, R.M. Stuetz (2012) Characterizing odorous emissions using new software for identifying peaks in chemometric models of GC-MS datasets. Chemometrics and Intelligent Laboratory Systems. doi: 10.1016/j.chemolab.2012.07.006
Skov T and Bro R. (2008) Solving fundamental problems in chromatographic analysis. Analytical and Bioanalytical Chemistry, 390(1): 281-285. doi: 10.1007/s00216-007-1618-z
This file contains the MISR Level 3 FIRSTLOOK Global Cloud public Product in netCDF format covering a month.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This data repository contains the accompanying data for the study by Stradiotti et al. (2025), developed as part of the ESA Climate Change Initiative (CCI) Soil Moisture project. Project website: https://climate.esa.int/en/projects/soil-moisture/
This dataset was created as part of the following study, which contains a description of the algorithm and validation results.
Stradiotti, P., Gruber, A., Preimesberger, W., & Dorigo, W. (2025). Accounting for seasonal retrieval errors in the merging of multi-sensor satellite soil moisture products. Science of Remote Sensing, 12, 100242. https://doi.org/10.1016/j.srs.2025.100242
This repository contains the final, merged soil moisture and uncertainty values from Stradiotti et al. (2025), derived using a novel uncertainty quantification and merging scheme. In the accompanying study, we present a method to quantify the seasonal component of satellite soil moisture observations, based on Triple Collocation Analysis. Data from three independent satellite missions are used (from ASCAT, AMSR2, and SMAP). We observe consistent intra-annual variations in measurement uncertainties across all products (primarily caused by dynamics on the land surface such as seasonal vegetation changes), which affect the quality of the received signals. We then use these estimates to merge data from the three missions into a single consistent record, following the approach described by Dorigo et al. (2017). The new (seasonal) uncertainty estimates are propagated through the merging scheme, to enhance the uncertainty characterization of the final merged product provided here.
Evaluation against in situ data suggests that the estimated uncertainties of the new product are more representative of their true seasonal behaviour, compared to the previously used static approach. Based on these findings, we conclude that using a seasonal TCA approach can provide a more realistic characterization of dataset uncertainty, in particular its temporal variation. However, improvements in the merged soil moisture values are constrained, primarily due to correlated uncertainties among the sensors.
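The covariance-notation form of triple collocation can be sketched in a few lines. This is a minimal 1-D illustration, not the study's seasonal TCA or full merging scheme; the noise levels, sample size, and product labels are synthetic assumptions.

```python
import numpy as np

def tc_error_variances(x, y, z):
    """Triple collocation (covariance notation): error variance of each of
    three collocated products, assuming independent errors and a shared signal."""
    c = np.cov(np.vstack([x, y, z]))
    ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return ex, ey, ez

# Synthetic demonstration (noise levels are assumptions, not product values)
rng = np.random.default_rng(42)
truth = rng.normal(size=50_000)
x = truth + rng.normal(scale=0.3, size=truth.size)
y = truth + rng.normal(scale=0.2, size=truth.size)
z = truth + rng.normal(scale=0.1, size=truth.size)
ex, ey, ez = tc_error_variances(x, y, z)   # close to 0.09, 0.04, 0.01

# Inverse-error-variance weights, as in least-squares merging
w = np.array([1.0 / ex, 1.0 / ey, 1.0 / ez])
w /= w.sum()
```

The merging weights recover the familiar result that the least noisy product receives the largest weight.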
The dataset provides global daily gridded soil moisture estimates for the 2012-2023 period at 0.25° (~25 km) resolution. Daily images are grouped by year (YYYY), each subdirectory containing one netCDF image file for a specific day (DD), month (MM) in a 2-dimensional (longitude, latitude) grid system (CRS: WGS84). All file names follow the naming convention:
L3S-SSMS-MERGED-SOILMOISTURE-YYYYMMDD000000-fv0.1.nc
Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude and time stamp), as well as the following data variables:
After extracting the .nc files from the downloaded zip archives, they can be read by any software that supports Climate and Forecast (CF) standard conform netCDF files, such as:
This dataset was produced with funding from the European Space Agency (ESA) Climate Change Initiative (CCI) Plus Soil Moisture Project (CCN 3 to ESRIN Contract No: 4000126684/19/I-NB "ESA CCI+ Phase 1 New R&D on CCI ECVS Soil Moisture"). Project website: https://climate.esa.int/en/projects/soil-moisture/
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This is the accompanying dataset to the following paper: https://www.nature.com/articles/s41597-023-01975-w
Caravan is an open community dataset of meteorological forcing data, catchment attributes, and discharge data for catchments around the world. Additionally, Caravan provides code to derive meteorological forcing data and catchment attributes from the same data sources in the cloud, making it easy for anyone to extend Caravan to new catchments. The vision of Caravan is to provide the foundation for a truly global open source community resource that will grow over time.
If you use Caravan in your research, please cite not only Caravan itself but also the source datasets, to pay respect to the amount of work that was put into the creation of these datasets and that made Caravan possible in the first place.
All current development and additional community extensions can be found at https://github.com/kratzert/Caravan
Change Log:
23 May 2022: Version 0.2 - Resolved a bug when renaming the LamaH gauge ids from the LamaH ids to the official gauge ids provided as "govnr" in the LamaH dataset attribute files.
24 May 2022: Version 0.3 - Fixed gaps in forcing data in some "camels" (US) basins.
15 June 2022: Version 0.4 - Fixed replacing negative CAMELS US values with NaN (-999 in CAMELS indicates missing observation).
1 December 2022: Version 0.4 - Added 4298 basins in the US, Canada and Mexico (part of HYSETS), now totalling 6830 basins. Fixed a bug in the computation of catchment attributes that are defined as pour point properties, where sometimes the wrong HydroATLAS polygon was picked. Restructured the attribute files and added some more metadata (station name and country).
16 January 2023: Version 1.0 - Version of the official paper release. No changes in the data but added a static copy of the accompanying code of the paper. For the most up to date version, please check https://github.com/kratzert/Caravan
10 May 2023: Version 1.1 - No data change, just update data description.
17 May 2023: Version 1.2 - Updated a handful of attribute values that were affected by a bug in their derivation. See https://github.com/kratzert/Caravan/issues/22 for details.
16 April 2024: Version 1.4 - Added 9130 gauges from the original source dataset that were initially not included because of the area thresholds (i.e. basins smaller than 100 sqkm or larger than 2000 sqkm). Also extended the forcing period for all gauges (including the original ones) to 1950-2023. Added two different download options that include time series data only, as either csv files (Caravan-csv.tar.xz) or netCDF files (Caravan-nc.tar.xz). Including the large basins also required an update to the Earth Engine code.
16 Jan 2025: Version 1.5 - Added FAO Penman-Monteith PET (potential_evaporation_sum_FAO_PENMAN_MONTEITH) and renamed the ERA5-LAND potential_evaporation band to potential_evaporation_sum_ERA5_LAND. Also added all PET-related climate indices derived with the Penman-Monteith PET band (suffix "_FAO_PM") and renamed the old PET-related indices accordingly (suffix "_ERA5_LAND").
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Data for the figures of paper "Parametric drive of a double quantum dot in a cavity" by L. Jarjat, B. Hue, T. Philippe--Kagan, B. Neukelmance, J. Craquelin, A. Théry, C. Fruy, G. Abulizi, J. Becdelievre, M.M. Desjardins, T. Kontos and M.R. Delbecq
The dataset contains data files in netCDF (*.nc) format and a Jupyter notebook to open the dataset and to create and display the figures. Each file is named "data_M#%.nc" for main figure data and "data_SM#%.nc" for supplementary material figures, with # the figure number and % the subpanel where needed.
The required Python packages are xarray (to open the files) and holoviews (for data plotting).
This dataset provides daily historical Water Balance Model outputs from a Thornthwaite-type, single bucket model. Climate inputs to the model are from GridMet daily temperature and precipitation for the Continental United States (CONUS). The Water Balance Model output variables include the following: Potential Evapotranspiration (PET, mm), Actual Evapotranspiration (AET, mm), Moisture Deficit (Deficit, mm), Soil Water (soilwater, mm), Runoff (mm), Rain (mm), and Accumulated Snow Water Equivalent (accumswe, mm). The dataset covers the period from January 1 to December 31 for years 1980 through 2023 for the CONUS. Water Balance Model variables are provided as individual files, by variable and year, at a 1 km x 1 km spatial resolution and a daily temporal resolution. Data are in a North America Lambert Conformal Conic projection and are distributed in a standardized Climate and Forecast (CF)-compliant NetCDF file format.
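The core of a Thornthwaite-type single-bucket model can be sketched as one daily update. This is a simplified illustration, not the actual model code: the real model also handles snow (accumswe) and soil-moisture-limited evapotranspiration, and `capacity` here is an assumed parameter.

```python
def bucket_step(soil, rain, pet, capacity=100.0):
    """One daily step of a simplified single-bucket water balance.
    All quantities in mm; returns (soil, aet, deficit, runoff)."""
    water = soil + rain
    aet = min(pet, water)                 # AET limited by available water
    water -= aet
    deficit = pet - aet                   # unmet atmospheric demand
    runoff = max(0.0, water - capacity)   # bucket overflow becomes runoff
    soil = water - runoff
    return soil, aet, deficit, runoff
```

For example, a full bucket receiving 20 mm of rain under 5 mm of demand sheds 15 mm as runoff, while a half-empty bucket under high demand accumulates a deficit instead.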
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
# ERA-NUTS (1980-2018)
This dataset contains a set of time series of meteorological variables based on the Copernicus Climate Change Service (C3S) ERA5 reanalysis. The data files can be downloaded from this record, while notebooks and other files can be found in the associated GitHub repository.
These data have been generated with the aim of providing hourly time series of the meteorological variables commonly used for power system modelling and, more generally, for studies on energy systems.
An example of the analysis that can be performed with ERA-NUTS is shown in this video.
Important: this dataset is still a work in progress; we will add more analyses and variables in the near future. If you spot an error or something strange in the data, please tell us by sending an email or opening an issue in the associated GitHub repository.
## Data
The time-series have hourly/daily/monthly frequency and are aggregated following the NUTS 2016 classification. NUTS (Nomenclature of Territorial Units for Statistics) is a European Union standard for referencing the subdivisions of countries (member states, candidate countries and EFTA countries).
This dataset contains NUTS0/1/2 time-series for the following variables obtained from the ERA5 reanalysis data (in brackets the name of the variable on the Copernicus Data Store and its unit of measure):
- t2m: 2-meter temperature (`2m_temperature`, Celsius degrees)
- ssrd: Surface solar radiation (`surface_solar_radiation_downwards`, Watt per square meter)
- ssrdc: Surface solar radiation clear-sky (`surface_solar_radiation_downward_clear_sky`, Watt per square meter)
- ro: Runoff (`runoff`, millimeters)
There are also a set of derived variables:
- ws10: Wind speed at 10 meters (derived from `10m_u_component_of_wind` and `10m_v_component_of_wind`, meters per second)
- ws100: Wind speed at 100 meters (derived from `100m_u_component_of_wind` and `100m_v_component_of_wind`, meters per second)
- CS: Clear-Sky index (the ratio between the solar radiation and the solar radiation clear-sky)
- HDD/CDD: Heating/Cooling Degree Days (derived from the 2-meter temperature following the EUROSTAT definition)
For each variable we have 350 599 hourly samples (from 01-01-1980 00:00:00 to 31-12-2019 23:00:00) for 34/115/309 regions (NUTS 0/1/2).
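The derived variables above can be sketched in a few lines. The degree-day thresholds shown (18/15 °C for heating, 21/24 °C for cooling) follow my reading of the EUROSTAT methodology; check the EUROSTAT notes before relying on the exact values.

```python
import math

def wind_speed(u, v):
    """Wind speed magnitude [m/s] from the u- and v-components."""
    return math.hypot(u, v)

def hdd(t_mean_c):
    """Heating degree days for one day: 18 degC base, counted only when the
    daily mean temperature is at or below 15 degC (assumed EUROSTAT rule)."""
    return 18.0 - t_mean_c if t_mean_c <= 15.0 else 0.0

def cdd(t_mean_c):
    """Cooling degree days for one day: 21 degC base, counted only when the
    daily mean temperature is at or above 24 degC (assumed EUROSTAT rule)."""
    return t_mean_c - 21.0 if t_mean_c >= 24.0 else 0.0
```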
The data is provided in two formats:
- NetCDF version 4 (all the variables hourly and CDD/HDD daily). NOTE: the variables are stored as `int16` type using a `scale_factor` of 0.01 to minimise the size of the files.
- Comma Separated Value ("single index" format for all the variables and the time frequencies and "stacked" only for daily and monthly)
All the CSV files are stored in a zipped file for each variable.
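The `int16` packing mentioned in the NetCDF note above can be decoded by hand as follows; CF-aware readers (e.g. xarray, netCDF4 with auto-masking) apply `scale_factor` automatically, so this is only to make the storage scheme explicit.

```python
def unpack(raw, scale_factor=0.01):
    """Recover the physical value from a packed int16, e.g. 2153 -> 21.53."""
    return raw * scale_factor
```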
## Methodology
The time-series have been generated using the following workflow:
1. The NetCDF files are downloaded from the Copernicus Data Store from the "ERA5 hourly data on single levels from 1979 to present" dataset
2. The data is read in R with the climate4r packages and aggregated using the function `get_ts_from_shp` from panas. All the variables are aggregated at the NUTS boundaries using the average, except for the runoff, which is the sum of all the grid points within the regional/national borders.
3. The derived variables (wind speed, CDD/HDD, clear-sky index) are computed and all the CSV files are generated using R
4. The NetCDF files are created using `xarray` in Python 3.7.
NOTE: air temperature, solar radiation, runoff and wind speed hourly data have been rounded to two decimal digits.
## Example notebooks
In the folder `notebooks` on the associated GitHub repository there are two Jupyter notebooks which show how to deal effectively with the NetCDF data in `xarray` and how to visualise them in several ways using matplotlib or the enlopy package.
There are currently two notebooks:
- exploring-ERA-NUTS: shows how to open the NetCDF files (with Dask) and how to manipulate and visualise them.
- ERA-NUTS-explore-with-widget: explores the datasets interactively with Jupyter and ipywidgets.
The notebook `exploring-ERA-NUTS` is also available rendered as HTML.
## Additional files
In the folder `additional files` on the associated GitHub repository there is a map showing the spatial resolution of the ERA5 reanalysis and a CSV file specifying the number of grid points with respect to each NUTS0/1/2 region.
## License
This dataset is released under CC-BY-4.0 license.
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
Updated on 29 June, 2021: Added data for monthly mean differences in LST, albedo, and ET between forest and open land at 0.5 degree. These data were used by "Meier, R., Davin, E. L., Lejeune, Q., Hauser, M., Li, Y., Martens, B., … Thiery, W. (2018). Evaluating and improving the Community Land Model's sensitivity to land cover. Biogeosciences, 15(15), 4731–4757. https://doi.org/10.5194/bg-15-4731-2018" to evaluate the CLM model. File name: diff_lst_albedo_et_monthly_halfdeg.nc
Updated on 28 August, 2017: Added data for annual snow frequency and rainfall at each forest and open land comparison sample location (data for Fig 4). File name: snow_rainfall_005degree.nc
Updated on 21 August, 2017: Uploaded data for the comparison samples of forest minus open land across the globe at 0.05 degree for daytime, nighttime, and daily mean LST, annual albedo, and daily ET. These comparison samples can reproduce most of the results from the paper (Fig 1, part of Fig 3, and Fig 4). They can be aggregated to a 1-degree map or used to derive the latitudinal pattern shown in the paper. File name: delta_lst_albedo_et_005degree.nc
This is the dataset used to create Figure 1 to Figure 3 in Li et al. 2015 (Local cooling and warming effects of forest based on satellite data, Nature Communications). The dataset is saved as netCDF files, named Fig1_NC.nc, Fig2_NC.nc, and Fig3_NC.nc. Figures 1 to 4 are also included in the dataset. When using the data, please cite the original paper "Li, Y. et al. Local cooling and warming effects of forest based on satellite data. Nat. Commun. 6:6603, (2015)" and also cite or acknowledge the dataset on figshare (doi:10.6084/m9.figshare.2445310). If you have any questions, please feel free to contact Yan Li. Email: yanli.geo@gmail.com
This collection comprises data covering meteorology, physical oceanography, transport of water, biogeochemistry, and parameters relevant to the carbon cycle, ocean acidification, the ecosystem, and geophysics. The data are collected from long-term, high-frequency observations at fixed locations in the open ocean. OceanSITES data are stored in netCDF files conforming to the OceanSITES Data Format Reference Manual. OceanSITES is a worldwide system of long-term, open-ocean reference stations measuring dozens of variables and monitoring the full depth of the ocean from air-sea interactions down to the seafloor. It is a network of stations or observatories measuring many aspects of the ocean's surface and water column using, where possible, automated systems with advanced sensors and telecommunications systems, yielding high time resolution, often in real-time, while building a long record.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset provides global daily estimates of Root-Zone Soil Moisture (RZSM) content at 0.25° spatial grid resolution, derived from gap-filled merged satellite observations of 14 passive satellite sensors operating in the microwave domain of the electromagnetic spectrum. Data are provided from January 1991 to December 2023.
This dataset was produced with funding from the European Space Agency (ESA) Climate Change Initiative (CCI) Plus Soil Moisture Project (CCN 3 to ESRIN Contract No: 4000126684/19/I-NB "ESA CCI+ Phase 1 New R&D on CCI ECVS Soil Moisture"). Project website: https://climate.esa.int/en/projects/soil-moisture/. Operational implementation is supported by the Copernicus Climate Change Service implemented by ECMWF through C3S2 312a/313c.
This dataset is used by Hirschi et al. (2025) to assess recent summer drought trends in Switzerland.
Hirschi, M., Michel, D., Schumacher, D. L., Preimesberger, W., and Seneviratne, S. I.: Recent summer soil moisture drying in Switzerland based on measurements from the SwissSMEX network, Earth Syst. Sci. Data Discuss. [preprint], https://doi.org/10.5194/essd-2025-416, in review, 2025.
ESA CCI Soil Moisture is a multi-satellite climate data record that consists of harmonized, daily observations from various microwave satellite remote sensing sensors (Dorigo et al., 2017, 2024; Gruber et al., 2019). This version of the dataset uses the PASSIVE record as input, which contains only observations from passive (radiometer) measurements (scaling reference AMSR-E). The surface observations are gap-filled using a univariate interpolation algorithm (Preimesberger et al., 2025). The gap-filled passive observations serve as input for an exponential-filter-based method to estimate soil moisture in different layers of the root zone (0-200 cm), following the approach of Pasik et al. (2023). The final gap-free root-zone soil moisture estimates based on passive surface input data are provided here at 4 separate depth layers (0-10, 10-40, 40-100, 100-200 cm) over the period 1991-2023.
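The recursive exponential filter commonly used for this kind of surface-to-root-zone propagation can be sketched as follows. This is an illustration of the general technique, not the product's implementation; the characteristic time `t_value` is an assumed example, whereas the actual product uses depth-specific values.

```python
import math

def swi_series(ssm, t_value=10.0, dt=1.0):
    """Recursive exponential filter (soil water index): the root-zone estimate
    relaxes toward the surface series, swi_n = swi_{n-1} + K_n*(ssm_n - swi_{n-1}),
    with gain K_n = K_{n-1} / (K_{n-1} + exp(-dt / t_value)) and K_1 = 1."""
    swi, gain = ssm[0], 1.0
    out = [swi]
    for obs in ssm[1:]:
        gain = gain / (gain + math.exp(-dt / t_value))
        swi = swi + gain * (obs - swi)
        out.append(swi)
    return out
```

Larger `t_value` yields a smoother, more slowly responding root-zone series, which is why deeper layers use longer characteristic times.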
You can use command-line tools such as wget or curl to download (and extract) data for multiple years. The following script downloads and extracts the complete data set to the local directory ~/Downloads on Linux or macOS systems.
#!/bin/bash
# Set download directory and make sure it exists
DOWNLOAD_DIR=~/Downloads
mkdir -p "$DOWNLOAD_DIR"

base_url="https://researchdata.tuwien.ac.at/records/8dda4-xne96/files"

# Loop through years 1991 to 2023 and download & extract data
for year in {1991..2023}; do
    echo "Downloading $year.zip..."
    wget -q -P "$DOWNLOAD_DIR" "$base_url/$year.zip"
    unzip -q -o "$DOWNLOAD_DIR/$year.zip" -d "$DOWNLOAD_DIR"
    rm "$DOWNLOAD_DIR/$year.zip"
done
The dataset provides global daily estimates for the 1991-2023 period at 0.25° (~25 km) horizontal grid resolution. Daily images are grouped into subdirectories by year (YYYY); each subdirectory contains one netCDF image file per day (DD) of each month (MM), on a 2-dimensional (longitude, latitude) grid (CRS: WGS84). File names follow the convention:
ESA_CCI_PASSIVERZSM-YYYYMMDD000000-fv09.1.nc
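As a sketch, the date encoded in this naming convention can be recovered programmatically; the regular expression below is an assumption derived directly from the stated convention, not an official parser.

```python
import re
from datetime import date

def parse_rzsm_filename(fname):
    """Extract the observation date from an ESA CCI PASSIVE RZSM file name."""
    m = re.match(
        r"ESA_CCI_PASSIVERZSM-(\d{4})(\d{2})(\d{2})000000-fv09\.1\.nc$", fname
    )
    if m is None:
        raise ValueError(f"unexpected file name: {fname}")
    year, month, day = map(int, m.groups())
    return date(year, month, day)
```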
Each netCDF file contains 3 coordinate variables (WGS84 longitude, latitude and time stamp), as well as the following data variables:
Additional information for each variable is given in the netCDF attributes.
These data can be read by any software that supports Climate and Forecast (CF) conformant metadata standards for netCDF files, such as:
Please see the ESA CCI Soil Moisture science data records community for more records based on ESA CCI SM.
Open Government Licence 3.0: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/
Vertical profiles of horizontal and vertical wind components, as well as signal-to-noise ratio (SNR) and spectral width measurements, were collected at the NERC Mesosphere-Stratosphere-Troposphere radar facility site, Capel Dewi, near Aberystwyth, Ceredigion, Wales, between 13th February and 4th August 2009 as part of ongoing long-term observations made by the NERC National Centre for Atmospheric Science (NCAS). These data were collected by the NCAS Atmospheric Measurement Facility's (AMF) 1290 MHz Mobile Wind Profiler, owned and operated by the University of Manchester and known as the aber-radar-1290mhz at the time of these observations. The data are available at 15 minute intervals as netCDF files to all registered BADC users under the Open Government Licence.
The dataset contains the following measurements:
- Eastward wind velocity component
- Northward wind velocity component
- Upward air velocity
- Direction the wind is from
- Signal-to-noise ratio
- Spectral width
- Altitude of instrument above the ground
- Longitude of instrument
- Latitude of instrument
The Sensor Geophysical Data Record (SGDR) files contain full-accuracy altimeter data with a high-precision orbit (accuracy ~1.5 cm). The instruments on Jason-1 make direct observations of the following quantities: altimeter range, significant wave height, ocean radar backscatter cross-section (a measure of wind speed), ionospheric electron content (derived by a simple formula), tropospheric water content, mean sea surface, and position relative to the GPS satellite constellation. The SGDR contains all relevant corrections needed to calculate the sea surface height, as well as the 20 Hz waveforms required for retracking. The SGDR is an expert-level product; if you do not require the waveforms, the GDR, GPN or GPR will be better suited to your needs.
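Conceptually, sea surface height is obtained by subtracting the corrected altimeter range from the precise orbit altitude. A minimal sketch follows; the correction names and all numeric values are illustrative placeholders, not actual SGDR field names or measurements.

```python
def sea_surface_height(orbit_altitude, altimeter_range, corrections):
    """SSH = orbit altitude - (range + sum of range corrections), in metres.

    corrections: dict of range corrections (e.g. ionospheric and tropospheric
    path delays); the keys and values here are illustrative only.
    """
    corrected_range = altimeter_range + sum(corrections.values())
    return orbit_altitude - corrected_range

# Illustrative numbers only (metres):
ssh = sea_surface_height(
    orbit_altitude=1_347_200.0,
    altimeter_range=1_347_180.5,
    corrections={"dry_troposphere": 2.3, "wet_troposphere": 0.15, "ionosphere": 0.08},
)
```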
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
This is a collection of two datasets: one sourced from CPM data (bham64_ccpm-4x_12em_psl-sphum4th-temp4th-vort4th_pr.tar.gz) and one sourced from GCM data (bham64_gcm-4x_12em_psl-sphum4th-temp4th-vort4th_pr.tar.gz). Each dataset is made up of climate model variables extracted from the Met Office's storage system, combining many variables over many years. Each consists of 3 NetCDF files (train.nc, test.nc and val.nc), a YML configuration file (ds-config.yml) and a README (similar to this one but tailored to the source of the data). Code used to create the dataset can be found here: https://github.com/henryaddison/mlde-data (specifically the james-submission tag).
The YML file contains the configuration for the creation of the dataset, including the variables, scenario, ensemble members, spatial domain and resolution, and the scheme for splitting the data across the three subsets.
Each NetCDF file contains the same variables, split into the different subsets (train, val and test) based on the time dimension.
Otherwise the NetCDF files share the same dimensions and coordinates for ensemble_member, grid_longitude and grid_latitude.
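A time-based split of this kind can be sketched as below. The split fractions are illustrative assumptions; the actual scheme used for this dataset is recorded in ds-config.yml.

```python
def split_by_time(time_index, train_frac=0.7, val_frac=0.15):
    """Partition a sorted sequence of time stamps into train/val/test slices.

    Fractions are illustrative; the remainder after train and val goes to test.
    """
    n = len(time_index)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    return {
        "train": time_index[:n_train],
        "val": time_index[n_train:n_train + n_val],
        "test": time_index[n_train + n_val:],
    }
```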
UPDATE 2025-03-27: The dataset tars have been renamed to make their source clearer (ccpm for coarsened CPM and gcm for GCM).
The oceanographic time series data collected by U.S. Geological Survey scientists and collaborators are served in an online database at http://stellwagen.er.usgs.gov/index.html. These data were collected as part of research experiments investigating circulation and sediment transport in the coastal ocean. The experiments (projects, research programs) are typically one month to several years long and have been carried out since 1975. New experiments will be conducted, and the data from them will be added to the collection. As of 2016, all but one of the experiments were conducted in waters abutting the U.S. coast; the exception was conducted in the Adriatic Sea. Measurements acquired vary by site and experiment; they usually include current velocity, wave statistics, water temperature, salinity, pressure, turbidity, and light transmission from one or more depths over a time period. The measurements are concentrated near the sea floor but may also include data from the water column. The user interface provides an interactive map, a tabular summary of the experiments, and a separate page for each experiment. Each experiment page has documentation and maps that provide details of what data were collected at each site. Links to related publications with additional information about the research are also provided. The data are stored in Network Common Data Format (netCDF) files using the Equatorial Pacific Information Collection (EPIC) conventions defined by the National Oceanic and Atmospheric Administration (NOAA) Pacific Marine Environmental Laboratory. NetCDF is a general, self-documenting, machine-independent, open source data format created and supported by the University Corporation for Atmospheric Research (UCAR). EPIC is an early set of standards designed to allow researchers from different organizations to share oceanographic data. The files may be downloaded or accessed online using the Open-source Project for a Network Data Access Protocol (OPeNDAP).
The OPeNDAP framework allows users to access data from anywhere on the Internet using a variety of Web services including Thematic Realtime Environmental Distributed Data Services (THREDDS). A subset of the data compliant with the Climate and Forecast convention (CF, currently version 1.6) is also available.
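An OPeNDAP request subsets a variable server-side via a constraint expression appended to the dataset URL. The sketch below builds such a URL using standard DAP2 hyperslab syntax; the dataset path and variable name are hypothetical, not an actual endpoint of this database.

```python
def opendap_constraint_url(base_url, variable, *index_ranges):
    """Build a DAP2 constraint URL selecting a hyperslab of one variable.

    index_ranges: (start, stride, stop) tuples, one per array dimension.
    """
    slices = "".join(f"[{a}:{b}:{c}]" for a, b, c in index_ranges)
    return f"{base_url}?{variable}{slices}"

# Hypothetical dataset path; selects time step 0 over a 10x10 grid subset.
url = opendap_constraint_url(
    "https://example.gov/opendap/experiment/file.nc",
    "temperature", (0, 1, 0), (0, 1, 9), (0, 1, 9),
)
```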