7 datasets found
  1. dwd

    • huggingface.co
    Updated Feb 11, 2024
    Cite
    Jacob (2024). dwd [Dataset]. https://huggingface.co/datasets/jacobbieker/dwd
    Explore at:
    Dataset updated
    Feb 11, 2024
    Authors
    Jacob
    License

    MIT License (https://opensource.org/licenses/MIT)
    License information was derived automatically

    Description

    Dataset Card for DWD Observations

    This dataset is a collection of historical German Weather Service (DWD) weather station observations at 10-minute and hourly resolutions for various parameters. The data has been converted to Zarr using Xarray. The data was gathered using the wonderful wetterdienst package.

      Dataset Details

      Dataset Description

    Curated by: [More Information Needed] Funded by [optional]: [More Information Needed] Shared by [optional]:… See the full description on the dataset page: https://huggingface.co/datasets/jacobbieker/dwd.

  2. era5-land

    • huggingface.co
    Updated Oct 2, 2022
    Cite
    Open Climate Fix (2022). era5-land [Dataset]. https://huggingface.co/datasets/openclimatefix/era5-land
    Explore at:
    Dataset updated
    Oct 2, 2022
    Dataset provided by
    Open Climate Fix Limited
    Authors
    Open Climate Fix
    License

    MIT License (https://opensource.org/licenses/MIT)
    License information was derived automatically

    Description

    This dataset comprises ECMWF ERA5-Land data covering 2014 to October 2022. The data is on a 0.1-degree grid and has fewer variables than the standard ERA5 reanalysis, but at a higher resolution. All the data has been downloaded as NetCDF files from the Copernicus Data Store, converted to Zarr using Xarray, and uploaded here. Each file covers one day and holds 24 timesteps.

  3. Lagrangian trajectories representing surface drift from the Cape Verde...

    • zenodo.org
    zip
    Updated Jul 13, 2022
    Cite
    Willi Rath; Perla Roman (2022). Lagrangian trajectories representing surface drift from the Cape Verde islands [Dataset]. http://doi.org/10.5281/zenodo.6587208
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 13, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Willi Rath; Perla Roman
    License

    Attribution 4.0 International (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Area covered
    Cabo Verde
    Description

    Example trajectories from a biophysical Lagrangian simulation

    These are 25 sets of 50,000 example trajectories from biophysical experiments performed with Parcels.

    The trajectories are a tiny subset of a much bigger collection of trajectories that have been simulated with the aim of learning about the fate of particles drifting away from the Cape Verde islands.

    Note that these trajectories should not be used for biological or physical science; they merely serve as study objects for developing, testing, or benchmarking (statistical) methods and algorithms.

    Details of the experiments

    The trajectories are taken from 25 sets of biophysical simulations which differ in the year they represent. Particles are seeded between mid-August and the start of December of the years 1993 to 2017. They are subject to ocean surface currents simulated by a high-resolution ocean model and to Stokes drift estimated from wave simulation data provided by the Copernicus Marine Service (https://marine.copernicus.eu/).

    Data store

    The data come as a Zarr store inside a ZIP file, "cape_verde_drift_trajectories_1993-2017.zarr.zip", which you need to download and unzip before reading it, e.g., with Xarray's open_zarr function.

    Variables and their meaning

    • "obs" contains the time step since the larva started to exist. Each trajectory covers up to 881 daily positions.

    • "traj" indicates the trajectory ID.

    • "lat" and "lon" contain the horizontal positions in degrees Latitude and Longitude.

    • "temp" contains the ambient temperature in degrees Celsius the simulated larva would have felt.

    • "time" contains time stamps for each position.

    • "z" contains the vertical positions of the simulated larva in meters counted downwards.

    Using the data

    This data set is licensed under a Creative Commons Attribution 4.0 International License.

    If you use the data, we'd love to hear about it at wrath@geomar.de. This is, however, not required.

  4. Data from: Community Earth System Model v2 Large Ensemble (CESM2 LENS)

    • data.ucar.edu
    • oidc.rda.ucar.edu
    • +1 more
    zarr
    Updated Nov 11, 2024
    + more versions
    Cite
    Danabasoglu, Gokhan; Deser, Clara; Rodgers, Keith; Timmermann, Axel (2024). Community Earth System Model v2 Large Ensemble (CESM2 LENS) [Dataset]. https://data.ucar.edu/dataset/community-earth-system-model-v2-large-ensemble-cesm2-lens
    Explore at:
    Available download formats: zarr
    Dataset updated
    Nov 11, 2024
    Dataset provided by
    Research Data Archive at the National Center for Atmospheric Research, Computational and Information Systems Laboratory
    Authors
    Danabasoglu, Gokhan; Deser, Clara; Rodgers, Keith; Timmermann, Axel
    Time period covered
    Jan 1, 1850 - Dec 31, 2014
    Description

    The US National Center for Atmospheric Research partnered with the IBS Center for Climate Physics in South Korea to generate the CESM2 Large Ensemble, which consists of 100 ensemble members at 1-degree spatial resolution covering the period 1850-2100 under CMIP6 historical and SSP370 future radiative forcing scenarios. Data sets from this ensemble were made downloadable via the Climate Data Gateway on June 14, 2021. NCAR has copied a subset (currently ~500 TB) of CESM2 LENS data to Amazon S3 as part of the AWS Public Datasets Program. To optimize for large-scale analytics, we have represented the data as ~275 Zarr stores accessible through the Python Xarray library. Each Zarr store contains a single physical variable for a given model run type and temporal frequency (monthly, daily).

  5. Community Earth System Model Large Ensemble (CESM LENS)

    • registry.opendata.aws
    Updated Oct 3, 2019
    Cite
    National Center for Atmospheric Research (2019). Community Earth System Model Large Ensemble (CESM LENS) [Dataset]. https://registry.opendata.aws/ncar-cesm-lens/
    Explore at:
    Dataset updated
    Oct 3, 2019
    Dataset provided by
    National Center for Atmospheric Research (https://ncar.ucar.edu/)
    Description

    The Community Earth System Model (CESM) Large Ensemble Numerical Simulation (LENS) dataset includes a 40-member ensemble of climate simulations for the period 1920-2100 using historical data (1920-2005) or assuming the RCP8.5 greenhouse gas concentration scenario (2006-2100), as well as longer control runs based on pre-industrial conditions. The data comprise both surface (2D) and volumetric (3D) variables in the atmosphere, ocean, land, and ice domains. The total data volume of the original dataset is ~500 TB, which has traditionally been stored as ~150,000 individual CF/NetCDF files on disk or magnetic tape, made available through the NCAR Climate Data Gateway for download or via web services. NCAR has copied a subset (currently ~70 TB) of CESM LENS data to Amazon S3 as part of the AWS Public Datasets Program. To optimize for large-scale analytics, we have represented the data as ~275 Zarr stores accessible through the Python Xarray library. Each Zarr store contains a single physical variable for a given model run type and temporal frequency (monthly, daily, 6-hourly).

  6. dwd-icon-global

    • huggingface.co
    Updated Oct 20, 2023
    Cite
    Open Climate Fix (2023). dwd-icon-global [Dataset]. http://doi.org/10.57967/hf/0880
    Explore at:
    Dataset updated
    Oct 20, 2023
    Dataset provided by
    Open Climate Fix Limited
    Authors
    Open Climate Fix
    License

    Attribution 4.0 International (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Dataset Card for DWD ICON Global Forecast

    This dataset comprises forecasts from the German Weather Service's (DWD) ICON-Global model from March 2023 to the present, with all variables included. Each forecast runs up to 4 days into the future, and the model is run 4 times per day. This data is an archive of the publicly available data at https://opendata.dwd.de/weather/nwp/, converted to Zarr format with Xarray. No other processing of the data is performed.

      Dataset… See the full description on the dataset page: https://huggingface.co/datasets/openclimatefix/dwd-icon-global.
    
  7. Estimates of Global Coastal Losses Under Multiple Sea Level Rise Scenarios

    • data.niaid.nih.gov
    Updated Apr 3, 2024
    + more versions
    Cite
    Hamidi, Ali (2024). Estimates of Global Coastal Losses Under Multiple Sea Level Rise Scenarios [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_6014085
    Explore at:
    Dataset updated
    Apr 3, 2024
    Dataset provided by
    Houser, Trevor
    Delgado, Michael
    Bolliger, Ian
    Hsiang, Solomon
    Choi, Jun Ho
    Kopp, Robert E.
    Allen, Daniel
    Depsky, Nicholas
    Greenstone, Michael
    Hamidi, Ali
    License

    Attribution 4.0 International (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    Results from the Python Coastal Impacts and Adaptation Model (pyCIAM), the inputs and source code necessary to replicate these outputs, and the results presented in Depsky et al. 2023.

    All zipped Zarr stores can be downloaded and accessed locally, or can be accessed directly via code similar to the following (where url_of_file_in_record is the URL of a file in this record):

    from fsspec.implementations.zip import ZipFileSystem
    import xarray as xr

    ds = xr.open_zarr(ZipFileSystem(url_of_file_in_record).get_mapper())

    File Inventory

    Products

    pyCIAM_outputs.zarr.zip: Outputs of the pyCIAM model, using the SLIIDERS dataset to define socioeconomic and extreme sea level characteristics of coastal regions and the 17th, 50th, and 83rd quantiles of local sea level rise as projected by various modeling frameworks (LocalizeSL and FACTS) and for multiple emissions scenarios and ice sheet models.

    pyCIAM_outputs_{case}.nc: A NetCDF version of pyCIAM_outputs, in which the files are divided by adaptation "case" to reduce file size.

    diaz2016_outputs.zarr.zip: A replication of the results from Diaz 2016, the model upon which pyCIAM was built, using a configuration identical to that of the original model.

    suboptimal_capital_by_movefactor.zarr.zip: An analysis of the observed present-day allocation of capital compared to a "rational" allocation, as a function of the magnitude of non-market costs of relocation assumed in the model. See Depsky et al. 2023 for further details.

    Inputs

    ar5-msl-rel-2005-quantiles.zarr.zip: Quantiles of projected local sea level rise as projected from the LocalizeSL model, using a variety of temperature scenarios and ice sheet models developed in Kopp 2014, Bamber 2019, DeConto 2021, IPCC SROCC. The results contained in pyCIAM_outputs.zarr.zip cover a broader (and newer) range of SLR projections from a more recent projection framework (FACTS); however, these data are more easily obtained from the appropriate Zenodo records and thus are not hosted in this one.

    diaz2016_inputs_raw.zarr.zip: The coastal inputs used in Diaz 2016, obtained from GitHub and formatted for use in the Python-based pyCIAM. These are based on the Dynamic Integrated Vulnerability Assessment (DIVA) dataset.

    surge-lookup-seg(_adm).zarr.zip: Pre-computed lookup tables estimating average annual losses from extreme sea levels due to mortality and capital stock damage. This is an intermediate output of pyCIAM and is not necessary to replicate the model results. However, it is more time-consuming to produce than the rest of the model, so it is provided for users who may wish to start from the pre-computed dataset. Two versions are provided: the first contains estimates for each unique intersection of ~50 km coastal segment and state/province-level administrative unit (admin-1), derived from the characteristics in SLIIDERS. The second is estimated on a version of SLIIDERS collapsed over administrative units to vary only over coastal segments. Both are used in the process of running pyCIAM.

    ypk_2000_2100.zarr.zip: An intermediate output in creating SLIIDERS that contains country-level projections of GDP, capital stock, and population, based on the Shared Socioeconomic Pathways (SSPs). This is only used in normalizing costs estimated in pyCIAM by country and global GDP to report in Depsky et al. 2023. It is not used in the execution of pyCIAM but is provided to replicate results reported in the manuscript.

    Source Code

    pyCIAM.zip: Contains the python-CIAM package as well as a notebook-based workflow to replicate the results presented in Depsky et al. 2023. It also contains two master shell scripts (run_example.sh and run_full_replication.sh) to assist in executing a small sample of the pyCIAM model or in fully executing the workflow of Depsky et al. 2023, respectively. This code is consistent with release 1.2.0 in the pyCIAM GitHub repository and is available as version 1.2.0 of the python-CIAM package on PyPI.

    Version history:

    1.2

    Point data-acquisition.ipynb to updated Zenodo deposit that fixes the dtype of subsets variable in diaz2016_inputs_raw.zarr.zip to be bool rather than int8

    Variable name bugfix in data-acquisition.ipynb

    Add netcdf versions of SLIIDERS and the pyCIAM results to upload-zenodo.ipynb

    Update results in Zenodo record to use SLIIDERS v1.2

    1.1.1

    Bugfix to inputs/diaz2016_inputs_raw.zarr.zip to make the subsets variable bool instead of int8.

    1.1.0

    Version associated with publication of Depsky et al., 2023

