7 datasets found
  1. (HS 2) Automate Workflows using Jupyter notebook to create Large Extent Spatial Datasets

    • search.dataone.org
    • hydroshare.org
    Updated Oct 19, 2024
    + more versions
    Cite
    Young-Don Choi (2024). (HS 2) Automate Workflows using Jupyter notebook to create Large Extent Spatial Datasets [Dataset]. http://doi.org/10.4211/hs.a52df87347ef47c388d9633925cde9ad
    Dataset updated
    Oct 19, 2024
    Dataset provided by
    HydroShare
    Authors
    Young-Don Choi
    Description

    We implemented automated workflows using Jupyter notebooks for each state. The GIS processing, crucial for merging, extracting, and projecting GeoTIFF data, was performed using ArcPy, a Python package for geographic data analysis, conversion, and management within ArcGIS (Toms, 2015). After generating state-scale LES (large extent spatial) datasets in GeoTIFF format, we used the xarray and rioxarray Python packages to convert the GeoTIFF files to NetCDF. Xarray is a Python package for working with multi-dimensional arrays, and rioxarray is the rasterio-based xarray extension; rasterio is a Python library for reading and writing GeoTIFF and other raster formats. Xarray facilitated data manipulation and the addition of metadata to the NetCDF files, while rioxarray was used to save the GeoTIFF data as NetCDF. These procedures resulted in three HydroShare resources (HS 3, HS 4, and HS 5) for sharing state-scale LES datasets. Notably, due to licensing constraints with ArcGIS Pro, commercial GIS software, the Jupyter notebook development was undertaken on Windows.
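    As an illustration of the conversion step described above, the following minimal sketch uses rioxarray to read a GeoTIFF, attach metadata with xarray, and write NetCDF; the file names, variable name, and attribute values are placeholders rather than values from this resource.

    # Illustrative sketch only; paths, variable name, and attributes are placeholders.
    import rioxarray

    # Open a state-scale GeoTIFF as an xarray DataArray (rioxarray keeps the CRS and transform).
    da = rioxarray.open_rasterio("state_les_dataset.tif", masked=True)

    # Use xarray to attach descriptive metadata before export.
    da = da.rename("les_variable")
    da.attrs["long_name"] = "state-scale LES variable"
    da.attrs["source"] = "merged and projected state GeoTIFF"

    # Convert to a Dataset and write NetCDF (the spatial reference is carried along by rioxarray).
    da.to_dataset().to_netcdf("state_les_dataset.nc")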

  2. Calculated Leached Nitrogen from Septic Systems in Wisconsin, 1850-2010

    • gimi9.com
    Updated Apr 19, 2024
    + more versions
    Cite
    (2024). Calculated Leached Nitrogen from Septic Systems in Wisconsin, 1850-2010 | gimi9.com [Dataset]. https://gimi9.com/dataset/data-gov_calculated-leached-nitrogen-from-septic-systems-in-wisconsin-1850-2010/
    Dataset updated
    Apr 19, 2024
    Area covered
    Wisconsin
    Description

    This data release contains a netCDF file of decadal estimates of nitrate leached from septic systems (kilograms per hectare per year, or kg/ha/yr) in the state of Wisconsin from 1850 to 2010, as well as the Python code and supporting files used to create the netCDF file. The netCDF file is used as an input to a Nitrate Decision Support Tool for the State of Wisconsin (GW-NDST; Juckem and others, 2024). The dataset was constructed starting with 1990 census records, which included responses about households using septic systems for waste disposal. The fraction of population using septic systems in 1990 was aggregated at the county scale and applied backward in time for each decade from 1850 to 1980. For decades from 1990 to 2010, the fraction of population using septic systems was computed at the finer-resolution census block-group scale. Each decadal estimate of the fraction of population using septic systems was then multiplied by 4.13 kilograms per person per year of leached nitrate to estimate the per-area load of nitrate below the root zone. The data release includes a Python notebook used to process the input datasets included in the data release, shapefiles created (or modified) using the Python notebook, and the final netCDF file.
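    As a worked illustration of the estimation step described above, the following sketch applies the 4.13 kg per person per year rate to made-up values for a single census unit; the numbers are hypothetical and not taken from the data release.

    # Hypothetical values for one census unit; only the 4.13 kg/person/yr rate comes from the description above.
    NITRATE_PER_PERSON_KG_YR = 4.13   # leached nitrate per person per year

    population = 12_000               # people in the census unit (hypothetical)
    septic_fraction = 0.35            # fraction of the population using septic systems (hypothetical)
    area_ha = 25_000.0                # area of the census unit in hectares (hypothetical)

    people_on_septic = population * septic_fraction
    load_kg_per_ha_yr = people_on_septic * NITRATE_PER_PERSON_KG_YR / area_ha
    print(f"Estimated leached nitrate: {load_kg_per_ha_yr:.3f} kg/ha/yr")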

  3. ERA-NUTS: meteorological time-series based on C3S ERA5 for European regions (1980-2021)

    • explore.openaire.eu
    • data.niaid.nih.gov
    • +1more
    Updated Feb 2, 2022
    + more versions
    Cite
    M. M. De Felice; K. K. Kavvadias (2022). ERA-NUTS: meteorological time-series based on C3S ERA5 for European regions (1980-2021) [Dataset]. http://doi.org/10.5281/zenodo.5947354
    Dataset updated
    Feb 2, 2022
    Authors
    M. M. De Felice; K. K. Kavvadias
    Description

    ERA-NUTS (1980-2021)

    This dataset contains a set of time-series of meteorological variables based on the Copernicus Climate Change Service (C3S) ERA5 reanalysis. The data files can be downloaded from here, while notebooks and other files can be found in the associated GitHub repository. The data has been generated with the aim of providing hourly time-series of the meteorological variables commonly used for power system modelling and, more generally, for studies on energy systems. An example of the analysis that can be performed with ERA-NUTS is shown in this video. Important: this dataset is still a work in progress; more analysis and variables will be added in the near future. If you spot an error or something strange in the data, please send an email or open an Issue in the associated GitHub repository.

    ## Data

    The time-series have hourly/daily/monthly frequency and are aggregated following the NUTS 2016 classification. NUTS (Nomenclature of Territorial Units for Statistics) is a European Union standard for referencing the subdivisions of countries (member states, candidate countries and EFTA countries). This dataset contains NUTS0/1/2 time-series for the following variables obtained from the ERA5 reanalysis data (in brackets, the name of the variable on the Copernicus Data Store and its unit of measure):

    - t2m: 2-meter temperature (2m_temperature, Celsius degrees)
    - ssrd: Surface solar radiation (surface_solar_radiation_downwards, Watt per square meter)
    - ssrdc: Surface solar radiation clear-sky (surface_solar_radiation_downward_clear_sky, Watt per square meter)
    - ro: Runoff (runoff, millimeters)

    There is also a set of derived variables:

    - ws10: Wind speed at 10 meters (derived from 10m_u_component_of_wind and 10m_v_component_of_wind, meters per second)
    - ws100: Wind speed at 100 meters (derived from 100m_u_component_of_wind and 100m_v_component_of_wind, meters per second)
    - CS: Clear-sky index (the ratio between the solar radiation and the clear-sky solar radiation)
    - HDD/CDD: Heating/Cooling Degree Days (derived from 2-meter temperature following the EUROSTAT definition)

    For each variable there are 367,440 hourly samples (from 01-01-1980 00:00:00 to 31-12-2021 23:00:00) for 34/115/309 regions (NUTS 0/1/2). The data is provided in two formats:

    - NetCDF version 4 (all the variables hourly, CDD/HDD daily). NOTE: the variables are stored as int16 type using a scale_factor to minimise the size of the files.
    - Comma Separated Values ("single index" format for all the variables and time frequencies, and "stacked" only for daily and monthly). All the CSV files are stored in a zipped file for each variable.

    ## Methodology

    The time-series have been generated using the following workflow:

    1. The NetCDF files are downloaded from the Copernicus Data Store from the "ERA5 hourly data on single levels from 1979 to present" dataset.
    2. The data is read in R with the climate4R packages and aggregated using the function get_ts_from_shp from panas. All the variables are aggregated at the NUTS boundaries using the average, except for the runoff, which is the sum of all the grid points within the regional/national borders.
    3. The derived variables (wind speed, CDD/HDD, clear-sky index) are computed and all the CSV files are generated using R.
    4. The NetCDF files are created using xarray in Python 3.8.

    ## Example notebooks

    In the folder notebooks on the associated GitHub repository there are two Jupyter notebooks which show how to deal effectively with the NetCDF data in xarray and how to visualise it in several ways using matplotlib or the enlopy package. There are currently two notebooks:

    - exploring-ERA-NUTS: shows how to open the NetCDF files (with Dask) and how to manipulate and visualise them.
    - ERA-NUTS-explore-with-widget: explores the datasets interactively with Jupyter and ipywidgets.

    The notebook exploring-ERA-NUTS is also available rendered as HTML.

    ## Additional files

    In the folder additional files on the associated GitHub repository there is a map showing the spatial resolution of the ERA5 reanalysis and a CSV file specifying the number of grid points within each NUTS0/1/2 region.

    ## License

    This dataset is released under the CC-BY-4.0 license.
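    The following minimal sketch shows how such a file might be opened and aggregated with xarray; the file name, the coordinate name "region", and the NUTS code are assumptions for illustration, not values checked against this dataset.

    import xarray as xr

    # xarray applies the int16 scale_factor automatically when decoding, so values
    # come back in physical units (pass chunks=... to read lazily with Dask).
    ds = xr.open_dataset("era-nuts-t2m-nuts2-hourly.nc")

    t2m = ds["t2m"]  # 2-meter temperature, Celsius degrees

    # Monthly mean temperature for a single region (coordinate name and code are hypothetical).
    monthly = t2m.sel(region="ITC4").resample(time="1MS").mean()
    print(monthly)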

  4. GEMINI output used to develop volumetric reconstruction technique for EISCAT 3D

    • zenodo.org
    bin, nc
    Updated Jan 21, 2024
    Cite
    Jone Peter Reistad; Matthew Zettergren (2024). GEMINI output used to develop volumetric reconstruction technique for EISCAT 3D [Dataset]. http://doi.org/10.5281/zenodo.10535762
    Available download formats: bin, nc
    Dataset updated
    Jan 21, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jone Peter Reistad; Matthew Zettergren
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset is used as a "ground truth" for investigating the performance of a volumetric reconstruction technique for electric current densities, intended to be applied to the EISCAT 3D radar system. The technique is outlined in a manuscript in preparation, to be referred to here once submitted. The volumetric reconstruction code can be found here: https://github.com/jpreistad/e3dsecs

    This dataset contains three files:

    1) Dataset file 'gemini_dataset.nc'. This is a dump from the end of a GEMINI model run driven by a pair of upward/downward field-aligned currents (FACs) above the region around the EISCAT 3D facility. Details of the GEMINI model can be found here: https://doi.org/10.5281/zenodo.3528915 . This is a NetCDF file, intended to be opened with xarray in Python:

    # Open the GEMINI output dump as an xarray Dataset
    import xarray

    dataset = xarray.open_dataset('gemini_dataset.nc')

    2) Grid file 'gemini_grid.h5'. This file is needed to get information about the grid that the values from GEMINI are represented in. The E3DSECS library (https://github.com/jpreistad/e3dsecs) has the necessary code to open this file and put it into the dictionary structure used in that package.

    3) The GEMINI simulation config file 'config.nml' used to produce the simulation. This could be used to reproduce the full simulation of the GEMINI model, which is freely available at https://github.com/gemini3d

  5. Constructing visualization tools and training resources to assess climate impacts on the Channel Islands National Marine Sanctuary NetCDF files

    • search.dataone.org
    • data.niaid.nih.gov
    • +1more
    Updated Jun 20, 2024
    Cite
    Patricia Park; Olivia Holt; Diana Navarro (2024). Constructing visualization tools and training resources to assess climate impacts on the channel islands national marine sanctuary NetCDF files [Dataset]. http://doi.org/10.5061/dryad.x0k6djht9
    Dataset updated
    Jun 20, 2024
    Dataset provided by
    Dryad Digital Repository
    Authors
    Patricia Park; Olivia Holt; Diana Navarro
    Time period covered
    Jun 5, 2024
    Area covered
    Channel Islands National Marine Sanctuary
    Description

    The Channel Islands National Marine Sanctuary (CINMS) comprises 1,470 square miles surrounding the Northern Channel Islands: Anacapa, Santa Cruz, Santa Rosa, San Miguel, and Santa Barbara, protecting various species and habitats. However, these sensitive habitats are highly susceptible to climate-driven ‘shock’ events, which are associated with extreme values of temperature, pH, or ocean nutrient levels. A particularly devastating example was seen in 2014-16, when extreme temperatures and changes in nutrient conditions off the California coast led to large-scale die-offs of marine organisms. Global climate models are the best tool available to predict how these shocks may respond to climate change. To better understand the drivers and statistics of climate-driven ecosystem shocks, a ‘large ensemble’ of simulations run with multiple climate models will be used. The objective of this project is to develop a Python-based web application to visualize ecologically significant climate variables near th...

    Data was accessed through AWS and then subsetted to the point of interest, and a NetCDF file was downloaded for the purposes of the web application. More information can be found in the GitHub repository (https://github.com/Channelislanders/toolkit). It should be noted that all data found here is just for the purpose of the web application.
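    As a rough sketch of the point-of-interest subsetting mentioned above (not the project's actual code, which lives in the linked repository), the following assumes a generic CESM-style NetCDF file and hypothetical coordinates:

    import xarray as xr

    # File name and coordinates are placeholders for illustration only.
    ds = xr.open_dataset("cesm1_variable.nc")

    # Nearest grid cell to a point near the sanctuary (~34 N, 120 W); CESM longitudes often run 0-360.
    point = ds.sel(lat=34.0, lon=240.0, method="nearest")

    # Save the much smaller point time-series for the web application to load quickly.
    point.to_netcdf("cesm1_point_subset.nc")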

    # GENERAL INFORMATION

    This dataset contains the files that accompany the website created for this project. A subsetted version of the CESM 1 dataset was downloaded so the website can update instantly.

    1. Title of the Project

    Constructing Visualization Tools and Training Resources to Assess Climate Impacts on the Channel Islands National Marine Sanctuary

    2. Author Information

    Graduate students at the Bren School of Environmental Science & Management in the Master of Environmental Data Science program, 2023-2024.

    A. Principal Investigators Contact Information

    Names: Olivia Holt, Diana Navarro, and Patty Park

    Institution: Bren School at the University of California, Santa Barbara

    Address: Bren Hall, 2400 University of California, Santa Barbara, CA 93117

    Emails: olholt@bren.ucsb.edu, dmnavarro@bren.ucsb.edu, p_park@bren.ucsb.edu

    B. Associate or Co-investigator Contact Informat...

  6. Self Lensing simulations of WD-WD pairs

    • zenodo.org
    • data.niaid.nih.gov
    nc
    Updated Sep 14, 2023
    + more versions
    Cite
    Guy Nir (2023). Self Lensing simulations of WD-WD pairs [Dataset]. http://doi.org/10.5281/zenodo.8340555
    Available download formats: nc
    Dataset updated
    Sep 14, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Guy Nir
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    These are NetCDF files created using Python/xarray. They contain the simulation results obtained by running the self_lens package (https://github.com/guynir42/self_lens) with a few surveys (ZTF, TESS, LSST, DECAM, CURIOS, CURIOS_ARRAY, LAST) over simulated binaries containing two white dwarfs (WDs).

    Each file contains the results for the number of detections and the effective volume for one survey, over a large parameter space of WD-WD binaries. For each binary we simulate the self-lensing flare and estimate the ability of the survey to observe that flare at different distances of the system from Earth.
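    A minimal sketch of inspecting one of these per-survey files with xarray is shown below; the file name and the variable name are guesses for illustration and may not match the files in this record.

    import xarray as xr

    ds = xr.open_dataset("ztf_simulation.nc")
    print(ds)  # lists the parameter-space dimensions and the data variables

    # Sum one data variable over the whole WD-WD parameter grid, if present
    # (the name "effective_volume" is assumed here, not taken from the files).
    if "effective_volume" in ds:
        print(float(ds["effective_volume"].sum()))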

    These datasets are needed to make the plots for an upcoming paper (Nir & Bloom, in prep). In the self_lens package, run test_produce_plots.py to pull down these files to a local folder and use them to make the plots.

    An accompanying dataset includes the same files for WDs in binaries with neutron stars and black holes (BHs).

  7. Data from: A Deep Learning-Based Hybrid Model of Global Terrestrial Evaporation

    • explore.openaire.eu
    • data.europa.eu
    Updated Aug 19, 2021
    + more versions
    Cite
    Akash Koppa; Dominik Rains; Petra Hulsman; Diego G. Miralles (2021). A Deep Learning-Based Hybrid Model of Global Terrestrial Evaporation [Dataset]. http://doi.org/10.5281/zenodo.5220753
    Dataset updated
    Aug 19, 2021
    Authors
    Akash Koppa; Dominik Rains; Petra Hulsman; Diego G. Miralles
    Description

    This repository contains the codes and datasets used in the research article "A Deep Learning-Based Hybrid Model of Global Terrestrial Evaporation". The repository contains the following files:

    1) Codes - scripts used for training the deep learning models used in the study and for creating the figures in the article.
    2) Input - all the processed input used for training the deep learning models and the datasets used for creating the figures in the article.
    3) Output - the final deep learning models and the outputs (evaporation and transpiration stress factor) from the hybrid model developed in the study.

    Formats: All scripts are written in Python. The datasets are in HDF5 and NetCDF formats.
