100+ datasets found
  1. Introduction to R Scripting with ArcGIS

    • edu.hub.arcgis.com
    Updated Jan 18, 2025
    Cite
    Education and Research (2025). Introduction to R Scripting with ArcGIS [Dataset]. https://edu.hub.arcgis.com/documents/baec6865ffbc4c1c869a594b9cad8bc0
    Explore at:
    Dataset updated
    Jan 18, 2025
    Dataset authored and provided by
    Education and Research
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0), https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    This resource was created by Esri Canada Education and Research. To browse our full collection of higher-education learning resources, please visit https://hed.esri.ca/resourcefinder/. This resource consists of four tutorials that deal with integrating the statistical programming language R with ArcGIS for Desktop. Several concepts are covered, including configuring ArcGIS with R, writing basic R scripts, writing R scripts that work with ArcGIS data, and constructing R tools for use within ArcGIS Pro. It is recommended that the tutorials be completed in sequential order. Each of the four tutorials (as well as a version of this document) can be viewed directly from your web browser by following the links below. However, you must obtain a complete copy of the tutorial files by downloading the latest release (or by cloning the tutorial repository on GitHub) if you wish to follow the tutorials interactively using ArcGIS and R software, along with pre-configured sample data. To download the tutorial documents and datasets, click the Open button to the top right. This will automatically download a ZIP file containing all files and data required. You can also clone the tutorial documents and datasets from the GitHub repo: https://github.com/highered-esricanada/r-arcgis-tutorials.git

    Software & Solutions Used: ArcGIS Pro 3.4; internet browser (e.g., Mozilla Firefox, Google Chrome, Safari); R Statistical Computing Language, version 4.3.3; R-ArcGIS Bindings, version 1.0.1.311; RStudio Desktop, version 2024.09.0+375
    Time to Complete: 2.5 h (excludes installation time)
    File Size: 115 MB
    Date Created: November 2017
    Last Updated: December 2024

  2. Working with the R-ArcGIS Bridge

    • edu.hub.arcgis.com
    Updated Dec 15, 2017
    Cite
    Education and Research (2017). Working with the R-ArcGIS Bridge [Dataset]. https://edu.hub.arcgis.com/documents/a7a03b88879b4d2ba461e0288646a198
    Explore at:
    Dataset updated
    Dec 15, 2017
    Dataset authored and provided by
    Education and Research
    Description

    A complete copy of the source files and sample data used during this workshop, arranged into a step-by-step tutorial series, can be obtained from the repository page on GitHub: https://esricanada-ce.github.io/r-arcgis-tutorials/

  3. terraceDL: A geomorphology deep learning dataset of agricultural terraces in...

    • figshare.com
    bin
    Updated Mar 22, 2023
    Cite
    Aaron Maxwell (2023). terraceDL: A geomorphology deep learning dataset of agricultural terraces in Iowa, USA [Dataset]. http://doi.org/10.6084/m9.figshare.22320373.v2
    Explore at:
    Available download formats: bin
    Dataset updated
    Mar 22, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Aaron Maxwell
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Iowa, United States
    Description

    scripts.zip

    arcgisTools.atbx:
    terrainDerivatives: make terrain derivatives from a digital terrain model (Band 1 = TPI (50 m radius circle), Band 2 = square root of slope, Band 3 = TPI (annulus), Band 4 = hillshade, Band 5 = multidirectional hillshades, Band 6 = slopeshade).
    rasterizeFeatures: convert vector polygons to raster masks (1 = feature, 0 = background).

    makeChips.R: R function to break terrain derivatives and masks into image chips of a defined size.
    makeTerrainDerivatives.R: R function to generate 6-band terrain derivatives from digital terrain data (same as the ArcGIS Pro tool).
    merge_logs.R: R script to merge training logs into a single file.
    predictToExtents.ipynb: Python notebook that uses a trained model to predict to new data.
    trainExperiments.ipynb: Python notebook used to train semantic segmentation models using PyTorch and the Segmentation Models package.
    assessmentExperiments.ipynb: Python code to generate assessment metrics using PyTorch and the torchmetrics library.
    graphs_results.R: R code to make graphs with ggplot2 to summarize results.
    makeChipsList.R: R code to generate lists of chips in a directory.
    makeMasks.R: R function to make raster masks from vector data (same as the rasterizeFeatures ArcGIS Pro tool).
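
    For readers who want the gist of the chip-generation step without running the R code, the same idea can be sketched in Python with NumPy (the function name and chip size below are illustrative, not the repository's actual API):

```python
import numpy as np

def make_chips(raster, chip_size=128):
    """Split a 2D (or multi-band HxWxC) array into non-overlapping
    square chips, dropping partial chips at the edges."""
    h, w = raster.shape[:2]
    chips = []
    for row in range(0, h - chip_size + 1, chip_size):
        for col in range(0, w - chip_size + 1, chip_size):
            chips.append(raster[row:row + chip_size, col:col + chip_size])
    return chips

# A 300x300 single-band raster yields four 128x128 chips (2 rows x 2 cols).
dtm = np.random.rand(300, 300)
chips = make_chips(dtm, 128)
```

    Overlapping chips and edge padding, which training pipelines often add, are omitted for brevity.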

    terraceDL.zip

    dems: LiDAR DTM data partitioned into training, testing, and validation datasets based on HUC8 watershed boundaries. Original DTM data were provided by the Iowa BMP mapping project: https://www.gis.iastate.edu/BMPs.
    extents: extents of the training, testing, and validation areas as defined by HUC8 watershed boundaries.
    vectors: vector features representing agricultural terraces, partitioned into separate training, testing, and validation datasets. Original digitized features were provided by the Iowa BMP mapping project: https://www.gis.iastate.edu/BMPs.

  4. Geospatial Deep Learning Seminar Online Course

    • data.amerigeoss.org
    html
    Updated Oct 18, 2024
    + more versions
    Cite
    AmericaView (2024). Geospatial Deep Learning Seminar Online Course [Dataset]. https://data.amerigeoss.org/dataset/geospatial-deep-learning-seminar-online-course
    Explore at:
    Available download formats: html
    Dataset updated
    Oct 18, 2024
    Dataset provided by
    AmericaView
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This seminar is an applied study of deep learning methods for extracting information from geospatial data, such as aerial imagery, multispectral imagery, digital terrain data, and other digital cartographic representations. We first provide an introduction and conceptualization of artificial neural networks (ANNs). Next, we explore appropriate loss and assessment metrics for different use cases, followed by the tensor data model, which is central to applying deep learning methods. Convolutional neural networks (CNNs) are then conceptualized with scene classification use cases. Lastly, we explore semantic segmentation, object detection, and instance segmentation. The primary focus of this course is semantic segmentation for pixel-level classification.

    The associated GitHub repo provides a series of applied examples. We hope to continue to add examples as methods and technologies further develop. These examples make use of a variety of datasets (e.g., SAT-6, topoDL, Inria, LandCover.ai, vfillDL, and wvlcDL). Please see the repo for links to the data and associated papers. All examples have associated videos that walk through the process, which are also linked in the repo. A variety of deep learning architectures are explored, including UNet, UNet++, DeepLabv3+, and Mask R-CNN. Currently, two examples use ArcGIS Pro and require no coding. The remaining five examples require coding and make use of PyTorch, Python, and R within the RStudio IDE. It is assumed that you have prior knowledge of coding in the Python and R environments. If you do not have experience coding, please take a look at our Open-Source GIScience and Open-Source Spatial Analytics (R) courses, which explore coding in Python and R, respectively.

    After completing this seminar you will be able to:

    1. explain how ANNs work including weights, bias, activation, and optimization.
    2. describe and explain different loss and assessment metrics and determine appropriate use cases.
    3. use the tensor data model to represent data as input for deep learning.
    4. explain how CNNs work including convolutional operations/layers, kernel size, stride, padding, max pooling, activation, and batch normalization.
    5. use PyTorch, Python, and R to prepare data, produce and assess scene classification models, and infer to new data.
    6. explain common semantic segmentation architectures and how these methods allow for pixel-level classification and how they are different from traditional CNNs.
    7. use PyTorch, Python, and R (or ArcGIS Pro) to prepare data, produce and assess semantic segmentation models, and infer to new data.
    8. explain how object and instance segmentation are different from traditional CNNs and semantic segmentation and how they can be used to generate bounding boxes and feature masks for each instance of a class.
    9. use ArcGIS Pro to perform object detection (to obtain bounding boxes) and instance segmentation (to obtain pixel-level instance masks).
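
    As a concrete illustration of the pixel-level assessment metrics covered in the seminar, intersection-over-union (IoU) for a binary segmentation mask can be computed in a few lines of NumPy (a minimal sketch, not code from the course materials):

```python
import numpy as np

def binary_iou(pred, target):
    """Intersection-over-union for binary masks (1 = class, 0 = background)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return intersection / union if union > 0 else 1.0

pred = np.array([[1, 1], [0, 0]])
target = np.array([[1, 0], [0, 0]])
print(binary_iou(pred, target))  # intersection 1, union 2 -> 0.5
```

    Libraries such as torchmetrics (used in the repo's notebooks) provide multi-class and batched versions of this metric.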
  5. 02 - D=R x T - Esri GeoInquiries™ collection for Mathematics

    • hub.arcgis.com
    Updated May 2, 2017
    Cite
    Esri GIS Education (2017). 02 - D=R x T - Esri GeoInquiries™ collection for Mathematics [Dataset]. https://hub.arcgis.com/documents/592099bc3a8e48e492960efb38937d40
    Explore at:
    Dataset updated
    May 2, 2017
    Dataset provided by
    Esri (http://esri.com/)
    Authors
    Esri GIS Education
    Description

    Use an aerial photograph to determine the distance around a track, and then calculate rate and time for each lap and the race as a whole.

    THE GEOINQUIRIES™ COLLECTION FOR MATHEMATICS: http://www.esri.com/geoinquiries

    The GeoInquiry™ collection for Mathematics contains 15 free, standards-based activities that correspond to and extend spatial concepts found in course textbooks frequently used in introductory algebra or geometry classes. The activities use a common inquiry-based instructional model, require only 15 minutes to deliver, and are device/laptop agnostic. Each activity includes an ArcGIS Online map but requires no login or installation. The activities harmonize with the Common Core math national curriculum standards. Activities include:

    · Rates & Proportions: A lost beach
    · D=R x T
    · Linear rate of change: Steady growth
    · How much rain? Linear equations
    · Rates of population change
    · Distance and midpoint
    · The coordinate plane
    · Euclidean vs Non-Euclidean
    · Area and perimeter at the mall
    · Measuring crop circles
    · Area of complex figures
    · Similar triangles
    · Perpendicular bisectors
    · Centers of triangles
    · Volume of pyramids

    Teachers, GeoMentors, and school administrators can learn more at http://www.esri.com/geoinquiries.

  6. Grid Garage ArcGIS Toolbox

    • researchdata.edu.au
    • data.nsw.gov.au
    Updated Sep 6, 2018
    + more versions
    Cite
    data.nsw.gov.au (2018). Grid Garage ArcGIS Toolbox [Dataset]. https://researchdata.edu.au/grid-garage-arcgis-toolbox/1342780
    Explore at:
    Dataset updated
    Sep 6, 2018
    Dataset provided by
    data.nsw.gov.au
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Description

    The Grid Garage Toolbox is designed to help you undertake the Geographic Information System (GIS) tasks required to process GIS data (geodata) into a standard, spatially aligned format. This format is required by most grid- or raster-based spatial modelling tools, such as the Multi-criteria Analysis Shell for Spatial Decision Support (MCAS-S). Grid Garage contains 36 tools designed to save you time by batch processing repetitive GIS tasks, as well as diagnosing problems with data and capturing a record of processing steps and any errors encountered.

    Grid Garage provides tools that use a list-based approach to batch processing, where both inputs and outputs are specified in tables to enable selective batch processing and detailed result reporting. In many cases the tools simply extend the functionality of standard ArcGIS tools, providing some or all of the inputs required by these tools via the input table to enable batch processing on a per-item basis. This approach differs slightly from normal batch processing in ArcGIS: instead of manually selecting single items or a folder on which to apply a tool or model, you provide a table listing target datasets. In summary, Grid Garage allows you to:

    * List, describe and manage very large volumes of geodata.
    * Batch process repetitive GIS tasks such as managing (renaming, describing etc.) or processing (clipping, resampling, reprojecting etc.) many geodata inputs, such as time-series geodata derived from satellite imagery or climate models.
    * Record any errors when batch processing and diagnose errors by interrogating the input geodata that failed.
    * Develop your own models in ArcGIS ModelBuilder that automate any GIS workflow utilising one or more of the Grid Garage tools, which can process an unlimited number of inputs.
    * Automate the process of generating MCAS-S TIP metadata files for any number of input raster datasets.

    The Grid Garage is intended for use by anyone with an understanding of GIS principles and an intermediate to advanced level of GIS skills. Using the Grid Garage tools in ArcGIS ModelBuilder requires skills in the use of the ArcGIS ModelBuilder tool.

    Download Instructions: Create a new folder on your computer or network, then download and unzip the zip file from the GitHub Release page for each of the items in the 'Data and Resources' section below. There is a folder in each zip file that contains all the files. See the Grid Garage User Guide for instructions on how to install and use the Grid Garage Toolbox with the sample data provided.
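
    The list-based batch pattern described above can be sketched in plain Python (the table columns and operations below are hypothetical; Grid Garage itself runs as an ArcGIS toolbox):

```python
import csv
import io

# Hypothetical input table: one row per target dataset, mirroring the
# list-based approach where inputs and outputs are specified in tables.
input_table = io.StringIO(
    "dataset,operation,output\n"
    "dtm_2018.tif,resample,dtm_2018_30m.tif\n"
    "rain_jan.tif,clip,rain_jan_aoi.tif\n"
)

results = []
for row in csv.DictReader(input_table):
    try:
        # A real tool would call the matching geoprocessing function here.
        results.append((row["dataset"], row["operation"], "ok"))
    except Exception as err:
        # Record the failure and continue: per-item error capture lets the
        # batch finish and the failed inputs be diagnosed afterwards.
        results.append((row["dataset"], row["operation"], f"error: {err}"))
```

    The per-item try/except is the key design choice: one bad dataset is logged rather than aborting the whole batch.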

  7. Geospatial Data from the Alpine Treeline Warming Experiment (ATWE) on Niwot...

    • search.dataone.org
    • data.ess-dive.lbl.gov
    • +1more
    Updated Jul 7, 2021
    Cite
    Fabian Zuest; Cristina Castanha; Nicole Lau; Lara M. Kueppers (2021). Geospatial Data from the Alpine Treeline Warming Experiment (ATWE) on Niwot Ridge, Colorado, USA [Dataset]. http://doi.org/10.15485/1804896
    Explore at:
    Dataset updated
    Jul 7, 2021
    Dataset provided by
    ESS-DIVE
    Authors
    Fabian Zuest; Cristina Castanha; Nicole Lau; Lara M. Kueppers
    Time period covered
    Jan 1, 2008 - Jan 1, 2012
    Area covered
    Description

    This is a collection of all GPS- and computer-generated geospatial data specific to the Alpine Treeline Warming Experiment (ATWE), located on Niwot Ridge, Colorado, USA. The experiment ran between 2008 and 2016, and consisted of three sites spread across an elevation gradient. Geospatial data for all three experimental sites and cone/seed collection locations are included in this package.

    Geospatial files include cone collection, experimental site, seed trap, and other GPS location/terrain data. File types include ESRI shapefiles, ESRI grid files (Arc/Info binary grids), TIFFs (.tif), and keyhole markup language (.kml) files. Trimble-imported data include plain text files (.txt), Trimble COR files, and Trimble SSF (Standard Storage Format) files. Microsoft Excel (.xlsx) and comma-separated values (.csv) files corresponding to the attribute tables of many files within this package are also included. A complete list of files can be found in the "Data File Organization" section of the included Data User's Guide. Maps are also included in this data package for reference and use. These maps are separated into two categories: 2021 maps, and legacy maps made in 2010. Each 2021 map has one copy in portable network graphics (.png) format and another in .pdf format. All legacy maps are in .pdf format. The .png image files can be opened with any compatible program, such as Preview (Mac OS) or Photos (Windows). All GIS files were imported into geopackages (.gpkg) using QGIS, and double-checked for compatibility and data/attribute integrity using ESRI ArcGIS Pro. Note that files packaged within geopackages will open in ArcGIS Pro with "main." preceding each file name, and an extra column named "geom" defining geometry type in the attribute table. The contents of each geospatial file remain intact, unless otherwise stated in "niwot_geospatial_data_list_07012021.pdf/.xlsx". This list of files can be found as an .xlsx and a .pdf in this archive. As geopackages are an open-source file format, files within them (TIFF, shapefiles, ESRI grid or "Arc/Info binary") can be read using QGIS, ArcGIS Pro, and other geospatial software. Text and .csv files can be read using TextEdit, Notepad, or any simple text-editing software; .csv files can also be opened using Microsoft Excel and R. .kml files can be opened using Google Maps or Google Earth, and Trimble files are most compatible with Trimble's GPS Pathfinder Office software. .xlsx files can be opened using Microsoft Excel. PDFs can be opened using Adobe Acrobat Reader and other compatible programs. A selection of original shapefiles within this archive were generated using ArcMap with associated FGDC-standardized metadata (XML file format). We are including these original files because they contain metadata only accessible using ESRI programs at this time, and so that the relationship between shapefiles and XML files is maintained. Individual XML files can be opened (without a GIS-specific program) using TextEdit or Notepad. Since ESRI's compatibility with FGDC metadata has changed since the generation of these files, many shapefiles will require upgrading to be compatible with ESRI's latest versions of geospatial software. These details are also noted in the "niwot_geospatial_data_list_07012021" file.

  8. Global map of tree density

    • figshare.com
    zip
    Updated May 31, 2023
    Cite
    Crowther, T. W.; Glick, H. B.; Covey, K. R.; Bettigole, C.; Maynard, D. S.; Thomas, S. M.; Smith, J. R.; Hintler, G.; Duguid, M. C.; Amatulli, G.; Tuanmu, M. N.; Jetz, W.; Salas, C.; Stam, C.; Piotto, D.; Tavani, R.; Green, S.; Bruce, G.; Williams, S. J.; Wiser, S. K.; Huber, M. O.; Hengeveld, G. M.; Nabuurs, G. J.; Tikhonova, E.; Borchardt, P.; Li, C. F.; Powrie, L. W.; Fischer, M.; Hemp, A.; Homeier, J.; Cho, P.; Vibrans, A. C.; Umunay, P. M.; Piao, S. L.; Rowe, C. W.; Ashton, M. S.; Crane, P. R.; Bradford, M. A. (2023). Global map of tree density [Dataset]. http://doi.org/10.6084/m9.figshare.3179986.v2
    Explore at:
    Available download formats: zip
    Dataset updated
    May 31, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Crowther, T. W.; Glick, H. B.; Covey, K. R.; Bettigole, C.; Maynard, D. S.; Thomas, S. M.; Smith, J. R.; Hintler, G.; Duguid, M. C.; Amatulli, G.; Tuanmu, M. N.; Jetz, W.; Salas, C.; Stam, C.; Piotto, D.; Tavani, R.; Green, S.; Bruce, G.; Williams, S. J.; Wiser, S. K.; Huber, M. O.; Hengeveld, G. M.; Nabuurs, G. J.; Tikhonova, E.; Borchardt, P.; Li, C. F.; Powrie, L. W.; Fischer, M.; Hemp, A.; Homeier, J.; Cho, P.; Vibrans, A. C.; Umunay, P. M.; Piao, S. L.; Rowe, C. W.; Ashton, M. S.; Crane, P. R.; Bradford, M. A.
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Crowther_Nature_Files.zip: This description pertains to the original download. Details on revised (newer) versions of the datasets are listed below. When more than one version of a file exists in Figshare, the original DOI will take users to the latest version, though each version technically has its own DOI.

    Two global maps (raster files) of tree density. These maps highlight how the number of trees varies across the world. One map was generated using biome-level models of tree density, and applied at the biome scale. The other map was generated using ecoregion-level models of tree density, and applied at the ecoregion scale. For this reason, transitions between biomes or between ecoregions may be unrealistically harsh, but large-scale estimates are robust (see Crowther et al 2015 and Glick et al 2016). At the outset, this study was intended to generate reliable estimates at broad spatial scales, which inherently comes at the cost of fine-scale precision. For this reason, country-scale (or larger) estimates are generally more robust than individual pixel-level estimates. Additionally, due to data limitations, estimates for Mangroves and Tropical coniferous forest (as identified by WWF and TNC) were generated using models constructed from Tropical moist broadleaf forest data and Temperate coniferous forest data, respectively. Because we used ecological analogy, the estimates for these two biomes should be considered less reliable than those of other biomes. These two maps initially appeared in Crowther et al (2015), with the biome map being featured more prominently. Explicit publication of the data is associated with Glick et al (2016). As they are produced, updated versions of these datasets, as well as alternative formats, will be made available under Additional Versions (see below).

    Methods: We collected over 420,000 ground-sources estimates of tree density from around the world. We then constructed linear regression models using vegetative, climatic, topographic, and anthropogenic variables to produce forest tree density estimates for all locations globally. All modeling was done in R. Mapping was done using R and ArcGIS 10.1.
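
    As a rough illustration of the modeling step (the authors fit linear regression models in R; the sketch below uses NumPy with synthetic data, and the predictors and coefficients are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
# Synthetic stand-ins for the vegetative, climatic, topographic, and
# anthropogenic predictors used in the study.
X = rng.normal(size=(n, 4))
true_coefs = np.array([3.0, -1.5, 0.8, 0.2])
density = X @ true_coefs + 50.0 + rng.normal(scale=0.5, size=n)

# Ordinary least squares with an intercept column.
design = np.column_stack([np.ones(n), X])
coefs, *_ = np.linalg.lstsq(design, density, rcond=None)
# coefs[0] recovers the intercept (~50); coefs[1:] the predictor effects.
```

    Fitted models like this were then applied to gridded predictor layers to map tree density at every location.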

    Viewing Instructions: Load the files into an appropriate geographic information system (GIS). For the original download (ArcGIS geodatabase files), load the files into ArcGIS to view or export the data to other formats. Because these datasets are large and have a unique coordinate system that is not read by many GIS, we suggest loading them into an ArcGIS dataframe whose coordinate system matches that of the data (see File Format). For GeoTiff files (see Additional Versions), load them into any compatible GIS or image management program.

    Comments: The original download provides a zipped folder that contains (1) an ArcGIS File Geodatabase (.gdb) containing one raster file for each of the two global models of tree density, one based on biomes and one based on ecoregions; (2) a layer file (.lyr) for each of the global models with the symbology used for each respective model in Crowther et al (2015); and (3) an ArcGIS Map Document (.mxd) that contains the layers and symbology for each map in the paper. The data is delivered in the Goode homolosine interrupted projected coordinate system that was used to compute biome, ecoregion, and global estimates of the number and density of trees presented in Crowther et al (2015). To obtain maps like those presented in the official publication, raster files will need to be reprojected to the Eckert III projected coordinate system. Details on subsequent revisions and alternative file formats are listed below under Additional Versions.

    Additional Versions: Crowther_Nature_Files_Revision_01.zip contains tree density predictions for small islands that are not included in the data available in the original dataset. These predictions were not taken into consideration in production of maps and figures presented in Crowther et al (2015), with the exception of the values presented in Supplemental Table 2. The file structure follows that of the original data and includes both biome- and ecoregion-level models.

    Crowther_Nature_Files_Revision_01_WGS84_GeoTiff.zip contains Revision_01 of the biome-level model, but stored in WGS84 and GeoTiff format. This file was produced by reprojecting the original Goode homolosine files to WGS84 using nearest neighbor resampling in ArcMap. All areal computations presented in the manuscript were computed using the Goode homolosine projection. This means that comparable computations made with projected versions of this WGS84 data are likely to differ (substantially at greater latitudes) as a product of the resampling. Included in this .zip file are the primary .tif and its visualization support files.

    References:

    Crowther, T. W., Glick, H. B., Covey, K. R., Bettigole, C., Maynard, D. S., Thomas, S. M., Smith, J. R., Hintler, G., Duguid, M. C., Amatulli, G., Tuanmu, M. N., Jetz, W., Salas, C., Stam, C., Piotto, D., Tavani, R., Green, S., Bruce, G., Williams, S. J., Wiser, S. K., Huber, M. O., Hengeveld, G. M., Nabuurs, G. J., Tikhonova, E., Borchardt, P., Li, C. F., Powrie, L. W., Fischer, M., Hemp, A., Homeier, J., Cho, P., Vibrans, A. C., Umunay, P. M., Piao, S. L., Rowe, C. W., Ashton, M. S., Crane, P. R., and Bradford, M. A. 2015. Mapping tree density at a global scale. Nature, 525(7568): 201-205. DOI: http://doi.org/10.1038/nature14967

    Glick, H. B., Bettigole, C. B., Maynard, D. S., Covey, K. R., Smith, J. R., and Crowther, T. W. 2016. Spatially explicit models of global tree density. Scientific Data, 3(160069). DOI: http://doi.org/10.1038/sdata.2016.69

  9. Input data for openSTARS.

    • plos.figshare.com
    xls
    Updated Jun 1, 2023
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Mira Kattwinkel; Eduard Szöcs; Erin Peterson; Ralf B. Schäfer (2023). Input data for openSTARS. [Dataset]. http://doi.org/10.1371/journal.pone.0239237.t001
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Mira Kattwinkel; Eduard Szöcs; Erin Peterson; Ralf B. Schäfer
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Input data for openSTARS.

  10. Data from: D = R x T

    • hub.arcgis.com
    • geoinquiries-education.hub.arcgis.com
    Updated Apr 18, 2017
    Cite
    Esri GIS Education (2017). D = R x T [Dataset]. https://hub.arcgis.com/maps/Education::d-r-x-t/about
    Explore at:
    Dataset updated
    Apr 18, 2017
    Dataset provided by
    Esri (http://esri.com/)
    Authors
    Esri GIS Education
    Area covered
    Description

    Use an aerial photograph to determine the distance around a track, and then calculate rate and time for each lap and the race as a whole.
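
    The activity's underlying arithmetic (D = R x T, rearranged as needed) might look like this in Python, with illustrative numbers:

```python
# Distance = Rate x Time. Rearranged to find rate from a measured lap.
track_length_m = 400.0   # distance around the track, e.g. from the aerial photo
lap_time_s = 80.0        # measured time for one lap

rate_m_per_s = track_length_m / lap_time_s       # R = D / T -> 5.0 m/s
laps = 4
race_time_s = laps * track_length_m / rate_m_per_s  # T = D / R -> 320.0 s
```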

  11. Racially or Ethnically Concentrated Areas of Poverty (R/ECAPs)

    • catalog.data.gov
    • hudgis-hud.opendata.arcgis.com
    Updated Mar 1, 2024
    + more versions
    Cite
    U.S. Department of Housing and Urban Development (2024). Racially or Ethnically Concentrated Areas of Poverty (R/ECAPs) [Dataset]. https://catalog.data.gov/dataset/racially-or-ethnically-concentrated-areas-of-poverty-r-ecaps
    Explore at:
    Dataset updated
    Mar 1, 2024
    Dataset provided by
    United States Department of Housing and Urban Development (http://www.hud.gov/)
    Description

    To assist communities in identifying racially/ethnically-concentrated areas of poverty (R/ECAPs), HUD has developed a census tract-based definition of R/ECAPs. The definition involves a racial/ethnic concentration threshold and a poverty test. The racial/ethnic concentration threshold is straightforward: R/ECAPs must have a non-white population of 50 percent or more. Regarding the poverty threshold, Wilson (1980) defines neighborhoods of extreme poverty as census tracts with 40 percent or more of individuals living at or below the poverty line.
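
    The two tests can be expressed as a small predicate (a sketch of the definition as summarized here; HUD's full methodology includes details not covered in this description):

```python
def is_recap(nonwhite_share, poverty_rate):
    """Apply the two R/ECAP tests described above: a racial/ethnic
    concentration threshold (non-white population of 50 percent or more)
    and a poverty threshold (40 percent or more at or below the poverty
    line). Both arguments are fractions in [0, 1]."""
    return nonwhite_share >= 0.50 and poverty_rate >= 0.40

print(is_recap(0.62, 0.45))  # True: passes both tests
print(is_recap(0.62, 0.20))  # False: fails the poverty test
```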

  12. vfillDL: A geomorphology deep learning dataset of valley fill faces...

    • figshare.com
    bin
    Updated Mar 22, 2023
    Cite
    Aaron Maxwell (2023). vfillDL: A geomorphology deep learning dataset of valley fill faces resulting from mountaintop removal coal mining (southern West Virginia, eastern Kentucky, and southwestern Virginia, USA) [Dataset]. http://doi.org/10.6084/m9.figshare.22318522.v2
    Explore at:
    Available download formats: bin
    Dataset updated
    Mar 22, 2023
    Dataset provided by
    figshare
    Authors
    Aaron Maxwell
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Southwest Virginia, Southern West Virginia, West Virginia, United States
    Description

    scripts.zip

    arcgisTools.atbx:
    terrainDerivatives: make terrain derivatives from a digital terrain model (Band 1 = TPI (50 m radius circle), Band 2 = square root of slope, Band 3 = TPI (annulus), Band 4 = hillshade, Band 5 = multidirectional hillshades, Band 6 = slopeshade).
    rasterizeFeatures: convert vector polygons to raster masks (1 = feature, 0 = background).

    makeChips.R: R function to break terrain derivatives and masks into image chips of a defined size.
    makeTerrainDerivatives.R: R function to generate 6-band terrain derivatives from digital terrain data (same as the ArcGIS Pro tool).
    merge_logs.R: R script to merge training logs into a single file.
    predictToExtents.ipynb: Python notebook that uses a trained model to predict to new data.
    trainExperiments.ipynb: Python notebook used to train semantic segmentation models using PyTorch and the Segmentation Models package.
    assessmentExperiments.ipynb: Python code to generate assessment metrics using PyTorch and the torchmetrics library.
    graphs_results.R: R code to make graphs with ggplot2 to summarize results.
    makeChipsList.R: R code to generate lists of chips in a directory.
    makeMasks.R: R function to make raster masks from vector data (same as the rasterizeFeatures ArcGIS Pro tool).

    vfillDL.zip

    dems: LiDAR DTM data partitioned into training, three testing, and two validation datasets. Original DTM data were obtained from 3DEP (https://www.usgs.gov/3d-elevation-program) and the WV GIS Technical Center (https://wvgis.wvu.edu/).
    extents: extents of the training, testing, and validation areas. These extents were defined by the researchers.
    vectors: vector features representing valley fills, partitioned into separate training, testing, and validation datasets. Extents were created by the researchers.

  13. Standardized Precipitation Index (SPI) Recent Conditions

    • cacgeoportal.com
    • resilience.climate.gov
    • +9more
    Updated Aug 16, 2022
    + more versions
    Cite
    Esri (2022). Standardized Precipitation Index (SPI) Recent Conditions [Dataset]. https://www.cacgeoportal.com/maps/8f5deec9956e4a8cb1f13dfd8c0232db
    Explore at:
    Dataset updated
    Aug 16, 2022
    Dataset authored and provided by
    Esri (http://esri.com/)
    Area covered
    Description

    Droughts are naturally occurring events in which dry conditions persist over time. Droughts are complex to characterize because they depend on water and energy balances at different temporal and spatial scales. The Standardized Precipitation Index (SPI) is used to analyze meteorological droughts. SPI estimates the deviation of precipitation from the long-term probability function at different time scales (e.g. 1, 3, 6, 9, or 12 months). SPI only uses monthly precipitation as an input, which can be helpful for characterizing meteorological droughts. Other variables (e.g. temperature or evapotranspiration) should be included in the characterization of other types of droughts (e.g. agricultural droughts). This layer shows SPI at different temporal periods, calculated using the SPEI library in R and precipitation data from the CHIRPS dataset.
    Sources:
    Climate Hazards Center InfraRed Precipitation with Station data (CHIRPS)
    SPEI R library
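    The standardization idea behind SPI can be sketched as follows. This is a simplified, rank-based stand-in for the gamma-distribution fit that the SPEI R package performs, and the precipitation record is made up:

```python
from statistics import NormalDist

def spi(precip_record, value):
    """Approximate SPI for one month's precipitation total against a
    historical record: empirical CDF (Weibull plotting position) mapped
    to a standard-normal quantile. The SPEI R package fits a gamma
    distribution instead; this rank-based sketch only illustrates the
    standardization step."""
    rank = sum(p <= value for p in precip_record)
    prob = rank / (len(precip_record) + 1)  # plotting position avoids 0 and 1
    return NormalDist().inv_cdf(prob)

record = [30, 45, 60, 52, 38, 70, 55, 48, 41, 65, 58, 33]  # mm/month, made up
print(round(spi(record, 30), 2))  # driest month on record -> strongly negative
print(round(spi(record, 70), 2))  # wettest month on record -> strongly positive
```

    Negative SPI values indicate drier-than-normal conditions; values below about -1.5 are commonly read as severe meteorological drought.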

  14. DataSheet_1_R/UAStools::plotshpcreate: Create Multi-Polygon Shapefiles for...

    • frontiersin.figshare.com
    zip
    Updated Jun 1, 2023
    Cite
    Steven L. Anderson; Seth C. Murray (2023). DataSheet_1_R/UAStools::plotshpcreate: Create Multi-Polygon Shapefiles for Extraction of Research Plot Scale Agriculture Remote Sensing Data.zip [Dataset]. http://doi.org/10.3389/fpls.2020.511768.s001
    Explore at:
    zip
    Available download formats
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Frontiers
    Authors
    Steven L. Anderson; Seth C. Murray
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Agricultural researchers are embracing remote sensing tools to phenotype and monitor agricultural crops. Specifically, large quantities of data are now being collected on small-plot research studies using Unoccupied Aerial Systems (UAS, aka drones), ground systems, or other technologies, but data processing and analysis lag behind. One major contributor to current data processing bottlenecks has been the lack of publicly available software tools tailored toward remote sensing of small plots and usable by researchers inexperienced in remote sensing. To address these needs we created plot shapefile maker (R/UAStools::plotshpcreate): an open-source R function which rapidly creates ESRI polygon shapefiles matching the desired dimensions of individual agricultural research plot areas of interest and associates plot-specific information. Plotshpcreate was developed to utilize inputs containing experimental design, field orientation, and plot dimensions for easily creating a multi-polygon shapefile of an entire small-plot experiment. Output shapefiles are based on the user's input geolocation of the research field, ensuring accurate overlay of polygons, often without manual user adjustment. The output shapefile is useful in GIS software to extract plot-level data tracing back to the unique IDs of the experimental plots. Plotshpcreate is available on GitHub (https://github.com/andersst91/UAStools).
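    The core idea (laying out rectangular plot polygons on a grid from an origin coordinate and plot dimensions) can be sketched as below. The function name, arguments, and layout rule are illustrative only, not the R package's API:

```python
# Hypothetical sketch of the layout step behind a plot-shapefile tool:
# generate one closed rectangular ring per plot in a rows-by-columns grid.
def plot_polygons(origin_x, origin_y, plot_w, plot_h, n_rows, n_cols):
    polys = []
    for r in range(n_rows):
        for c in range(n_cols):
            x0 = origin_x + c * plot_w   # column offset
            y0 = origin_y + r * plot_h   # row offset
            polys.append({
                "plot_id": f"R{r + 1}C{c + 1}",  # illustrative naming scheme
                "ring": [(x0, y0), (x0 + plot_w, y0),
                         (x0 + plot_w, y0 + plot_h),
                         (x0, y0 + plot_h), (x0, y0)],  # closed ring
            })
    return polys

# 2 rows x 4 columns of 3 m x 6 m plots anchored at a projected coordinate
grid = plot_polygons(500000.0, 4000000.0, 3.0, 6.0, 2, 4)
print(len(grid), grid[0]["plot_id"])
```

    Each ring could then be written out as a polygon feature keyed by its plot ID, which is what makes plot-level extraction of remote sensing data traceable back to the experimental design.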

  15. Tall, heterogenous forests improve prey capture, delivery to nestlings, and...

    • data.niaid.nih.gov
    • datadryad.org
    • +1more
    zip
    Updated Dec 12, 2022
    Cite
    Zachary Wilkinson; H. Anu Kramer; Gavin Jones; Ceeanna Zulla; Kate McGinn; Josh Barry; Sarah Sawyer; Richard Tanner; R. J. Gutiérrez; John Keane; M. Zachariah Peery (2022). Tall, heterogenous forests improve prey capture, delivery to nestlings, and reproductive success for Spotted Owls in southern California [Dataset]. http://doi.org/10.5061/dryad.h70rxwdnq
    Explore at:
    zip
    Available download formats
    Dataset updated
    Dec 12, 2022
    Dataset provided by
    University of Minnesota
    University of Wisconsin–Madison
    Rocky Mountain Research Station
    US Forest Service
    Tanner environmental services
    Authors
    Zachary Wilkinson; H. Anu Kramer; Gavin Jones; Ceeanna Zulla; Kate McGinn; Josh Barry; Sarah Sawyer; Richard Tanner; R. J. Gutiérrez; John Keane; M. Zachariah Peery
    License

    CC0 1.0 (https://spdx.org/licenses/CC0-1.0.html)

    Area covered
    California, Southern California
    Description

    Predator-prey interactions can be profoundly influenced by vegetation conditions, particularly when predator and prey prefer different habitats. Although such interactions have proven challenging to study for small and cryptic predators, recent methodological advances substantially improve opportunities for understanding how vegetation influences prey acquisition and strengthen conservation planning for this group. The California Spotted Owl (Strix occidentalis occidentalis) is well-known as an old-forest species of conservation concern, but whose primary prey in many regions – woodrats (Neotoma spp.) – occurs in a broad range of vegetation conditions. Here, we used high-resolution GPS tracking coupled with nest video monitoring to test the hypothesis that prey capture rates vary as a function of vegetation structure and heterogeneity, with emergent, reproductive consequences for Spotted Owls in Southern California. Foraging owls were more successful capturing prey, including woodrats, in taller multilayered forests, in areas with higher heterogeneity in vegetation types, and near forest-chaparral edges. Consistent with these findings, Spotted Owls delivered prey items more frequently to nests in territories with greater heterogeneity in vegetation types and delivered prey biomass at a higher rate in territories with more forest-chaparral edge. Spotted Owls had higher reproductive success in territories with higher mean canopy cover, taller trees, and more shrubby vegetation. Collectively, our results provide additional and compelling evidence that a mosaic of large tree forests with complex canopy and shrubby vegetation increases access to prey with potential reproductive benefits to Spotted Owls in landscapes where woodrats are a primary prey item. We suggest that forest management activities that enhance forest structure and vegetation heterogeneity could help curb declining Spotted Owl populations while promoting resilient ecosystems in some regions. 
    Methods: See README document.
    Naming conventions:
    • RSF or prey refers to the prey capture analysis
    • delivery in a file name refers to the delivery rate analysis
    • repro in a file name refers to the reproductive success analysis

    Setup:
    • Files with vegetation data should work with minimal alteration (you will need to specify the working directory) with the associated R code for each analysis.
    • Shapefiles were made in ArcGIS Pro, but they can be opened with any GIS software such as QGIS.

    Locational data files

    NOTE: LOCATIONAL DATA ARE SHIFTED AND ROTATED FROM THE ORIGINAL due to the sensitive nature of this species. The locational_data includes:
    • All_2021_owls_shifted
      • Point file showing all GPS tag locations for the prey capture analysis
      • Attributes include:
        • TERRITORY ID: numerical identifier for each bird
        • Year: year GPS tag was recorded
        • Month: month GPS tag was recorded
        • Day: day GPS tag was recorded
        • Hour: hour GPS tag was recorded
        • Minute: minute GPS tag was recorded
    • All_linked_polygons_shifted
      • Polygon file showing capture polygons for the prey capture analysis
      • Attributes include:
        • Territory ID: numerical identifier for each bird
        • Polygon id: numerical identifier for each capture polygon for each bird
        • Shape area: area of each polygon
    • SBNF_camera_nests_shifted
      • Point file showing Spotted Owl nests for the prey capture analysis
      • Attributes include:
        • Territory id: numerical identifier for each bird
    • C95_KDE_2021_socal_shifted
      • Polygon file of owls' 95% kernel density estimates for the prey delivery rate analysis
      • Attributes include:
        • Id: numerical identifier for each territory (bird)
        • Area: area of each polygon
    • San_bernardino_territory_centers
      • Point file showing territory centers for historical SBNF territories, shifted, for the reproductive success analysis
      • Attributes include:
        • Repro Territory id: unique identifier for each territory in the broader set of territories

    Besides the shifted locational data, we have included: for the resource selection function, vegetation data; for the delivery analysis, an overview of prey deliveries by territory and the vegetation data used; and for the reproductive analysis, vegetation data as well as an overview of reproductive success. These are labeled as follows:

    Files for the prey capture analysis

    Socal_RSF_data.txt

    • Description: text file with vegetation data paired with capture locations, both the buffered polygons used in the prey capture analysis and the unbuffered ones, which were not used (pair with the Socal_rsf_code R script)
    • Format: .txt
    • Dimensions: 2641 x 35

    • Variables:
      • ORIG_fid: completely unique identifier for each row
      • unique_id: unique identifier for each capture polygon (shared between a buffered capture location and its unbuffered pair)
      • territory_id: unique numerical identifier of territory
      • Polygon_id: within-territory unique prey capture polygon id
      • buff: binary, buffered or unbuffered (1 = buffered, 0 = unbuffered)
      • used: binary, used = 1, available = 0
      • prey_type: prey species associated with polygon (unkn: unknown, flsq: flying squirrel, wora: woodrat, umou: mouse, pogo: pocket gopher, grsq: grey squirrel, ubrd: unknown bird, umol: unknown mole, uvol: unknown vole)
      • area_sqm: area of polygon in square meters
      • CanCov_2020_buff: average canopy cover in polygon
      • CanHeight_2020_buff: average canopy height in polygon
      • Canlayer_2020_buff: average number of canopy layers in polygon
      • Understory_density_2020_buff: average brushy vegetation density in polygon
      • pix_COUNT: count of pixels in polygon (not needed for analysis)
      • p_chaparral: percent of polygon comprised of chaparral habitat
      • p_conifer: percent of polygon comprised of conifer habitat
      • p_hardwood: percent of polygon comprised of hardwood habitat
      • p_other: percent of polygon comprised of other habitat types
      • Calveg_cap_CHt_gt10_CC_30to70_intersect_buff: percent of polygon comprised of trees taller than 10 m with 30-70 percent canopy cover (used to check data)
      • Calveg_cap_CHt_gt10_CCgt70_intersect_buff: percent of polygon comprised of trees taller than 10 m with greater than 70 percent canopy cover (used to check data)
      • Calveg_cap_CHt_lt10_intersect_buff: percent of polygon comprised of trees less than 10 m (used to check data)
      • p_sm_conifer: percent of polygon comprised of conifer trees less than 10 m (used to calculate diversity)
      • p_lrg_conifer_sc: percent of polygon comprised of conifer forests greater than 10 m tall with sparse canopy (used to calculate diversity)
      • p_large_conifer_dc: percent of polygon comprised of conifer forests greater than 10 m tall with dense canopy (used to calculate diversity)
      • p_sm_hard: percent of polygon comprised of hardwood trees less than 10 m (used to calculate diversity)
      • p_lrg_hard_sc: percent of polygon comprised of hardwood forests greater than 10 m with sparse canopy (used to calculate diversity)
      • p_lrg_hard_dc: percent of polygon comprised of hardwood forests greater than 10 m with dense canopy (used to calculate diversity)
      • p_forests_gt10_verysparse_CC: percent of polygon comprised of trees less than 10 m with very sparse canopies (used to calculate diversity)
      • primary_edge: total distance in meters of primary edge in a polygon
      • normalized_by_area_primary_edge: total distance in m of primary edge in a polygon divided by the area of the polygon
      • secondary_edge: total distance in meters of secondary edge in a polygon
      • normalized_by_area_secondary_edge: total distance in m of secondary edge in a polygon divided by the area of the polygon
      • coarse_diversity: Shannon diversity in each polygon (see methods)
      • fine_diversity: Shannon diversity in each polygon (see methods)
      • nest_distance: distance from polygon center to nest for each polygon in meters

    For the Delivery analysis

    Note: for information on determining average prey biomass, see methods as well as Zulla et al. 2022 for flying squirrel and woodrat masses: Zulla CJ, Jones GM, Kramer HA, Keane JJ, Roberts KN, Dotters BP, Sawyer SC, Whitmore SA, Berigan WJ, Kelly KG, Gutiérrez RJ, Peery MZ. Forest heterogeneity outweighs movement costs by enhancing hunting success and fitness in spotted owls. doi:10.21203/rs.3.rs-1370884/v1. PPR:PPR470028.

    prey_deliveries_byterritory.csv
    • Description: overview file of prey delivered to each nest
    • Format: .csv
    • Dimensions: 332 x 8

    • Variables:
      • SITE: unique numerical identifier for each territory
      • DATE: date prey was delivered (in UTC)
      • CAMERA TIME: time in UTC prey was delivered
      • VIDEO TIME: time on video prey was delivered; unrelated to real time, just the original file
      • PREY ITEM: prey species delivered to nest (unkn: unknown, uncr: unknown if delivery (removed from eventual analysis due to

  16. Major Streets and Routes - Open Data

    • gisdata.tucsonaz.gov
    • data-cotgis.opendata.arcgis.com
    Updated Aug 2, 2018
    Cite
    City of Tucson (2018). Major Streets and Routes - Open Data [Dataset]. https://gisdata.tucsonaz.gov/items/c6d21082e6d248f0b7db0ff4f6f0ed8e
    Explore at:
    Dataset updated
    Aug 2, 2018
    Dataset authored and provided by
    City of Tucson
    Area covered
    Description

    The MS&R Plan identifies the general location and size of existing and proposed freeways, arterial and collector streets, future rights-of-way, setback requirements, typical intersections and cross sections, and gateway and scenic routes. The City’s Department of Transportation and the Planning and Development Services Department (PDSD) implement the MS&R Plan. The MS&R Plan is considered a Land Use Plan as defined in the Unified Development Code (UDC) Section 3.6, and, therefore, is subject to amendment in accordance with the standard Land Use Plan and Adoption and Amendment Procedures. The MS&R right-of-way lines are used in determining the setback for development through the MS&R Overlay provisions of the UDC. As stated in the current MS&R Plan, page 4, “The purpose of the Major Streets and Routes Plan is to facilitate future street widening, to inform the public which streets are the main thoroughfares, so that land use decisions can be based accordingly, and to reduce the disruption of existing uses on a property. By stipulating the required right-of-way, new development can be located so as to prepare for planned street improvements without demolition of buildings or loss of necessary parking.”PurposeThe major purposes of the Major Streets and Routes Plan are to identify street classifications, the width of public rights-of-way, to designate special routes, and to guide land use decisions. General Plan policies stipulate that planning and developing new transportation facilities be accomplished by identifying rights-of-way in the Major Streets and Routes Plan. 
    The policies also aim to encourage bicycle and pedestrian travel, "minimize disruption of the environment," and "coordinate land use patterns with transportation plans" by using the street classification as a guide to land use decisions.
    Dataset Classification: Level 0 - Open
    Known Uses: This layer is intended to be used in the Open Data portal and not for regular use in ArcGIS Online and ArcGIS Enterprise.
    Known Errors: Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
    Data Contact: Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
    Update Frequency: As needed

  17. Digital Geomorphic-GIS Map of Gulf Islands National Seashore (5-meter...

    • catalog.data.gov
    • datasets.ai
    Updated Jun 5, 2024
    Cite
    National Park Service (2024). Digital Geomorphic-GIS Map of Gulf Islands National Seashore (5-meter accuracy and 1-foot resolution 2006-2007 mapping), Mississippi and Florida (NPS, GRD, GRI, GUIS, GUIS_geomorphology digital map) adapted from U.S. Geological Survey Open File Report maps by Morton and Rogers (2009) and Morton and Montgomery (2010) [Dataset]. https://catalog.data.gov/dataset/digital-geomorphic-gis-map-of-gulf-islands-national-seashore-5-meter-accuracy-and-1-foot-r
    Explore at:
    Dataset updated
    Jun 5, 2024
    Dataset provided by
    National Park Service (http://www.nps.gov/)
    Area covered
    Guisguis Port Sariaya, Quezon
    Description

    The Digital Geomorphic-GIS Map of Gulf Islands National Seashore (5-meter accuracy and 1-foot resolution 2006-2007 mapping), Mississippi and Florida is composed of GIS data layers and GIS tables, and is available in the following GRI-supported GIS data formats: 1.) a 10.1 file geodatabase (guis_geomorphology.gdb), 2.) an Open Geospatial Consortium (OGC) geopackage, and 3.) a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in the data layers presented and in access to GRI ancillary table information. The file geodatabase format is supported with a 1.) ArcGIS Pro map file (.mapx) file (guis_geomorphology.mapx) and individual Pro layer (.lyrx) files (for each GIS data layer), as well as with a 2.) 10.1 ArcMap (.mxd) map document (guis_geomorphology.mxd) and individual 10.1 layer (.lyr) files (for each GIS data layer). The OGC geopackage is supported with a QGIS project (.qgz) file. Upon request, the GIS data is also available in ESRI 10.1 shapefile format. Contact Stephanie O'Meara (see contact information below) to acquire the GIS data in these GIS data formats. In addition to the GIS data and supporting GIS files, three additional files comprise a GRI digital geologic-GIS dataset or map: 1.) a GIS readme file (guis_geology_gis_readme.pdf), 2.) the GRI ancillary map information document (.pdf) file (guis_geomorphology.pdf), which contains geologic unit descriptions, as well as other ancillary map information and graphics from the source map(s) used by the GRI in the production of the GRI digital geologic-GIS data for the park, and 3.) a user-friendly FAQ PDF version of the metadata (guis_geomorphology_metadata_faq.pdf). Please read the guis_geology_gis_readme.pdf for information pertaining to the proper extraction of the GIS data and other map files. Google Earth software is available for free at: https://www.google.com/earth/versions/. QGIS software is available for free at: https://www.qgis.org/en/site/. 
    Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). For a complete listing of GRI products visit the GRI publications webpage: https://www.nps.gov/subjects/geology/geologic-resources-inventory-products.htm. For more information about the Geologic Resources Inventory Program visit the GRI webpage: https://www.nps.gov/subjects/geology/gri.htm. At the bottom of that webpage is a "Contact Us" link if you need additional information. You may also directly contact the program coordinator, Jason Kenworthy (jason_kenworthy@nps.gov). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (guis_geomorphology_metadata.txt or guis_geomorphology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:26,000 and United States National Map Accuracy Standards, features are within (horizontally) 13.2 meters or 43.3 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS, QGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3. (available at: https://www.nps.gov/articles/gri-geodatabase-model.htm).
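    The 13.2 m / 43.3 ft figure follows from the United States National Map Accuracy Standards, which for scales smaller than 1:20,000 require well-defined points to fall within 1/50 inch of true position at map scale; a quick check of that arithmetic for a 1:26,000 source map:

```python
scale = 26000            # 1:26,000 source map scale
tolerance_in = 1 / 50    # NMAS horizontal tolerance (inches at map scale)

ground_inches = scale * tolerance_in       # tolerance projected to the ground
ground_feet = ground_inches / 12
ground_meters = ground_feet * 0.3048

print(round(ground_feet, 1), "ft /", round(ground_meters, 1), "m")  # → 43.3 ft / 13.2 m
```

    This matches the accuracy statement in the metadata, so the stated tolerance is simply the NMAS 1/50-inch rule scaled to ground units.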

  18. Public GIS files for mapping carbonate springs

    • beta.hydroshare.org
    • hydroshare.org
    • +1more
    zip
    Updated Aug 19, 2024
    Cite
    Laura Toran; Michael Jones (2024). Public GIS files for mapping carbonate springs [Dataset]. https://beta.hydroshare.org/resource/07ebf29817dc423aae09de01741c167e/
    Explore at:
    zip (5.1 MB)
    Available download formats
    Dataset updated
    Aug 19, 2024
    Dataset provided by
    HydroShare
    Authors
    Laura Toran; Michael Jones
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Description

    This abstract contains links to public ArcGIS maps that include locations of carbonate springs and some of their characteristics. Information for accessing and navigating through the maps is included in a PowerPoint presentation IN THE FILE UPLOAD SECTION BELOW. Three separate data sets are included in the maps:

    1. Geochemistry data from the US Water Quality Portal (WQP), which compiles geochemistry data from the USGS and other federal agencies.
    2. Discharge data from WoKaS, a worldwide spring discharge dataset (Olarinoye et al., 2020).
    3. Regional karst data from selected US state agencies.

    Several base maps are included in the links. The US carbonate map describes and categorizes carbonates (e.g., depth from surface, overlying geology/ice, climate). The carbonate springs map categorizes springs as urban, specifically within 1000 ft of a road, or rural. The basis for this categorization was that the heat island effect defines urban as within 1000 ft of a road. There are other methods for defining urban versus rural to consider. Map links and details of the information they contain are listed below.
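    The urban/rural split described above (urban = within 1000 ft of a road) amounts to a nearest-distance test; a hedged sketch with made-up planar coordinates (a real workflow would use projected coordinates or geodesic distances in a GIS):

```python
import math

URBAN_THRESHOLD_FT = 1000.0

def classify_spring(spring_xy, road_points_xy):
    """Label a spring 'urban' if any road vertex lies within 1000 ft.
    Planar distance; coordinates are assumed to already be in feet."""
    nearest = min(math.dist(spring_xy, r) for r in road_points_xy)
    return "urban" if nearest <= URBAN_THRESHOLD_FT else "rural"

# illustrative road vertices and spring locations (not real data)
roads = [(0.0, 0.0), (5000.0, 0.0)]
print(classify_spring((300.0, 400.0), roads))    # 500 ft from the first vertex
print(classify_spring((3000.0, 4000.0), roads))  # well beyond 1000 ft of any vertex
```

    In practice this is a buffer-and-intersect operation on the road layer rather than a vertex loop, but the classification logic is the same.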

    Map set 1: The WQP map provides three mapping options separated by the parameters available at each spring site. These maps summarize discrete water quality samples, but not data logger availability. Information at each spring provides links for where users can explore further data.

    Option 1: WQP data with urban and rural springs labeled, with highlight of springs with or without NWIS data https://www.arcgis.com/home/item.html?id=2ce914ec01f14c20b58146f5d9702d8a

    Option 2: WQP data by major ions and a few other solutes https://www.arcgis.com/home/item.html?id=5a114d2ce24c473ca07ef9625cd834b8

    Option 3: WQP data by various carbon species https://www.arcgis.com/home/item.html?id=ae406f1bdcd14f78881905c5e0915b96

    Map 2: The worldwide carbonate map in the WoKaS data set (citation below) includes a description of carbonate purity and distribution of urban and rural springs, for which discharge data are available: https://www.arcgis.com/apps/mapviewer/index.html?webmap=5ab43fdb2b784acf8bef85b61d0ebcbe.

    Reference: Olarinoye, T., Gleeson, T., Marx, V., Seeger, S., Adinehvand, R., Allocca, V., Andreo, B., Apaéstegui, J., Apolit, C., Arfib, B. and Auler, A., 2020. Global karst springs hydrograph dataset for research and management of the world’s fastest-flowing groundwater. Scientific Data, 7(1), pp.1-9.

    Map 3: Karst and spring data from selected states: This map includes sites that members of the RCN have suggested to our group.

    https://uageos.maps.arcgis.com/apps/mapviewer/index.html?webmap=28ed22a14bb749e2b22ece82bf8a8177

    This data set is incomplete (as of October 13, 2022 it includes Florida and Missouri). We are looking for more information. You can share data links to additional data by typing them into the hydroshare page created for our group. Then new sites will periodically be added to the map: https://www.hydroshare.org/resource/0cf10e9808fa4c5b9e6a7852323e6b11/

    Acknowledgements: These maps were created by Michael Jones, University of Arkansas and Shishir Sarker, University of Kentucky with help from Laura Toran and Francesco Navarro, Temple University.

    TIPS FOR NAVIGATING THE MAPS ARE IN THE POWERPOINT DOCUMENT IN THE FILE UPLOAD SECTION BELOW.

  19. Image Footprints with Time Attributes

    • margig-edt.hub.arcgis.com
    • national-government.esrij.com
    • +15more
    Updated May 12, 2019
    Cite
    Esri European National Government Team (2019). Image Footprints with Time Attributes [Dataset]. https://margig-edt.hub.arcgis.com/datasets/image-footprints-with-time-attributes
    Explore at:
    Dataset updated
    May 12, 2019
    Dataset authored and provided by
    Esri European National Government Team
    Area covered
    Description

    Map Information
    This nowCOAST time-enabled map service provides maps of experimental lightning strike density data from the NOAA/National Weather Service/NCEP's Ocean Prediction Center (OPC) which emulate (simulate) data from the future NOAA GOES-R Global Lightning Mapper (GLM). The purpose of this experimental product is to provide mariners and others with enhanced "awareness of developing and transitory thunderstorm activity, to give users the ability to determine whether a cloud system is producing lightning and if that activity is increasing or decreasing..." Lightning Strike Density, as opposed to display of individual strikes, highlights the location of lightning cores and trends of increasing and decreasing activity. The maps depict the density of lightning strikes during a 15 minute time period at an 8 km x 8 km spatial resolution. The lightning strike density maps cover the geographic area from 25 degrees South to 80 degrees North latitude and from 110 degrees East to 0 degrees West longitude. The map units are number of strikes per square km per minute multiplied by a scaling factor of 10^3. The strike density is color coded using a color scheme which allows the data to be easily seen when overlaid on GOES imagery and to distinguish values at low density values. The maps are updated on nowCOAST approximately every 15 minutes. The latest data depicted on the maps are approximately 12 minutes old (or older). The OPC lightning strike density product is still experimental and may not always be available. Given the spatial resolution and latency of the data, the data should NOT be used to activate your lightning safety plans. Always follow the safety rule: when you first hear thunder or see lightning in your area, activate your emergency plan. If outdoors, immediately seek shelter in a substantial building or a fully enclosed metal vehicle such as a car, truck or a van. Do not resume activities until 30 minutes after the last observed lightning or thunder. 
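    Given the stated map units (strikes per square km per minute, scaled by 10^3), recovering the physical strike rate from a map pixel value is a single division; a small sketch with a made-up pixel value:

```python
SCALE = 1e3  # map units are strikes/km^2/min multiplied by 10^3

def strike_density(pixel_value):
    """Convert a scaled map pixel value back to strikes per sq km per minute."""
    return pixel_value / SCALE

# a hypothetical pixel value of 25 map units; the product accumulates
# over a 15-minute window, so the per-window total is rate * 15
rate = strike_density(25)
print(rate, "strikes/km^2/min ->", rate * 15, "strikes/km^2 per 15 min")
```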
    For more detailed information about the update schedule for the lightning strike density data maps on nowCOAST, please see: http://new.nowcoast.noaa.gov/help/#section=updateschedule
    Background Information
    The source for the data is OPC's gridded lightning strike density data on an 8 x 8 km grid. The gridded data emulate the spatial resolution of the future Global Lightning Mapper (GLM) instrument to be flown on the NOAA GOES-R series of geostationary satellites, with the first satellite scheduled for launch in early 2016. The gridded data is based on data from Vaisala's ground-based U.S. National Lightning Detection Network (NLDN) and its global lightning detection network referred to as the Global Lightning Dataset (GLD360). These networks are capable of detecting cloud-to-ground strokes, cloud-to-ground flash information and survey level cloud lightning information. According to the National Lightning Safety Institute, NLDN uses radio frequency detectors in the spectrum 1.0 kHz through 400 kHz to measure energy discharges from lightning as well as approximate distance and direction. According to Vaisala, the GLD360 network is capable of a detection efficiency greater than 70% over most of the Northern Hemisphere with a median location accuracy of 5 km or better. OPC's experimental gridded data are coarser than the original source data from Vaisala's networks. The 15-minute gridded source data are updated at OPC every 15 minutes at 10 minutes past the valid time. The lightning strike density product from NWS/NCEP/OPC is considered a derived product or Level 5 product ("NOAA-generated products using lightning data as input but not displaying the contractor transmitted/provided lightning data") and is appropriate for public distribution.
    Time Information

    This map is time-enabled, meaning that each individual layer contains time-varying data and can be utilized by clients capable of making map requests that include a time component.

    This particular service can be queried with or without the use of a time component. If the time parameter is specified in a request, the data or imagery most relevant to the provided time value, if any, will be returned. If the time parameter is not specified in a request, the latest data or imagery valid for the present system time will be returned to the client. If the time parameter is not specified and no data or imagery is available for the present time, no data will be returned.

    In addition to ArcGIS Server REST access, time-enabled OGC WMS 1.3.0 access is also provided by this service.

    Due to software limitations, the time extent of the service and map layers displayed below does not provide the most up-to-date start and end times of available data. Instead, users have three options for determining the latest time information about the service:

    Issue a returnUpdates=true request for an individual layer or for the service itself, which will return the current start and end times of available data, in epoch time format (milliseconds since 00:00 January 1, 1970). To see an example, click on the "Return Updates" link at the bottom of this page under "Supported Operations". Refer to the ArcGIS REST API Map Service Documentation for more information.

    Issue an Identify (ArcGIS REST) or GetFeatureInfo (WMS) request against the proper layer corresponding with the target dataset. For raster data, this would be the "Image Footprints with Time Attributes" layer in the same group as the target "Image" layer being displayed. For vector (point, line, or polygon) data, the target layer can be queried directly. In either case, the attributes returned for the matching raster(s) or vector feature(s) will include the following:

    validtime: Valid timestamp.

    starttime: Display start time.

    endtime: Display end time.

    reftime: Reference time (sometimes referred to as issuance time, cycle time, or initialization time).

    projmins: Number of minutes from reference time to valid time.

    desigreftime: Designated reference time; used as a common reference time for all items when individual reference times do not match.

    desigprojmins: Number of minutes from designated reference time to valid time.

    Query the nowCOAST LayerInfo web service, which has been created to provide additional information about each data layer in a service, including a list of all available "time stops" (i.e. "valid times"), individual timestamps, or the valid time of a layer's latest available data (i.e. "Product Time"). For more information about the LayerInfo web service, including examples of various types of requests, refer to the nowCOAST help documentation at: http://new.nowcoast.noaa.gov/help/#section=layerinfo

    References

    Kithil, 2015: Overview of Lightning Detection Equipment. National Lightning Safety Institute, Louisville, CO. (Available from http://www.lightningsafety.com/nsli_ihm/detectors.html)

    NASA and NOAA, 2014: Geostationary Lightning Mapper (GLM). (Available at http://www.goes-r.gov/spacesegment/glm.html)

    NWS, 2013: Experimental Lightning Strike Density Product Description Document. NOAA/NWS/NCEP/Ocean Prediction Center, College Park, MD. (Available at http://www.opc.ncep.noaa.gov/lightning/lightning_pdd.php and http://products.weather.gov/PDD/Experimental%20Lightning%20Strike%20Density%20Product%2020130913.pdf)

    NOAA Knows Lightning. NWS, Silver Spring, MD. (Available at http://www.lightningsafety.noaa.gov/resources/lightning3_050714.pdf)

    Siebers, A., 2013: Soliciting Comments until June 3, 2014 on an Experimental Lightning Strike Density product (Offshore Waters). Public Information Notice, NOAA/NWS Headquarters, Washington, DC. (Available at http://www.nws.noaa.gov/om/notification/pns13lightning_strike_density.htm)

  20.

    State

    • coronavirus-resources.esri.com
    • arc-gis-hub-home-arcgishub.hub.arcgis.com
    Updated Mar 25, 2020
    Cite
    Esri (2020). State [Dataset]. https://coronavirus-resources.esri.com/datasets/esri::state-63
    Explore at:
    Dataset updated
    Mar 25, 2020
    Dataset authored and provided by
    Esri
    Area covered
    Description

    The County Health Rankings, a collaboration between the Robert Wood Johnson Foundation and the University of Wisconsin Population Health Institute, measure the health of nearly all counties in the nation and rank them within states. This feature layer contains 2020 County Health Rankings data at the nation, state, and county levels. The Rankings are compiled using county-level measures from a variety of national and state data sources. Some example measures are: adult smoking, physical inactivity, flu vaccinations, child poverty, and driving alone to work. To see a full list of variables, as well as their definitions and descriptions, explore the Fields information by clicking the Data tab here in the Item Details. These measures are standardized and combined using scientifically informed weights.

    "By ranking the health of nearly every county in the nation, County Health Rankings & Roadmaps (CHR&R) illustrates how where we live affects how well and how long we live. CHR&R also shows what each of us can do to create healthier places to live, learn, work, and play – for everyone."

    New features of the 2020 Rankings data compared to previous versions include: more race/ethnicity categories, including Asian/Pacific Islander and American Indian/Alaska Native; reliability flags that mark an estimate as unreliable; and five new variables (math scores, reading scores, juvenile arrests, suicides, and traffic volume).

    Data Processing Notes: The data were downloaded in March 2020. Slight modifications were made to the source data as follows. The string " raw value" was removed from field labels/aliases so that auto-generated legends and pop-ups show only the measure's name, and strings such as "(%)", "rate", or "per 100,000" were added depending on the type of measure. Percentage and prevalence fields were multiplied by 100 to make them easier to work with in the map. For demographic variables only, the word "numerator" was removed and the word "population" was added where appropriate. Fields dropped from the analytic data file: year; all fields ending in "_cihigh" and "_cilow"; and any variables not listed in the sources and years documentation. The analytic data file was then merged with state-specific ranking files so that all county rankings and subrankings are included in this layer.
