Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0)
https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
This resource was created by Esri Canada Education and Research. To browse our full collection of higher-education learning resources, please visit https://hed.esri.ca/resourcefinder/.

This resource consists of four tutorials that deal with integrating the statistical programming language R with ArcGIS for Desktop. Several concepts are covered, including configuring ArcGIS with R, writing basic R scripts, writing R scripts that work with ArcGIS data, and constructing R tools for use within ArcGIS Pro. It is recommended that the tutorials are completed in sequential order. Each of the four tutorials (as well as a version of this document) can be viewed directly from your web browser by following the links below. However, you must obtain a complete copy of the tutorial files by downloading the latest release (or by cloning the tutorial repository on GitHub) if you wish to follow the tutorials interactively using ArcGIS and R software, along with pre-configured sample data.

To download the tutorial documents and datasets, click the Open button at the top right. This will automatically download a ZIP file containing all required files and data. You can also clone the tutorial documents and datasets from the GitHub repo: https://github.com/highered-esricanada/r-arcgis-tutorials.git

Software & Solutions Used:
- ArcGIS Pro 3.4
- Internet browser (e.g., Mozilla Firefox, Google Chrome, Safari)
- R Statistical Computing Language – version 4.3.3
- R-ArcGIS Bindings – version 1.0.1.311
- RStudio Desktop – version 2024.09.0+375

Time to Complete: 2.5 h (excludes installation time)
File Size: 115 MB
Date Created: November 2017
Last Updated: December 2024
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This data repository hosts datasets that are used for students to practice spatial operations introduced in R-as-GIS lectures and workshops.
The Digital Geomorphic-GIS Map of Gulf Islands National Seashore (5-meter accuracy and 1-foot resolution 2006-2007 mapping), Mississippi and Florida is composed of GIS data layers and GIS tables, and is available in the following GRI-supported GIS data formats: 1.) a 10.1 file geodatabase (guis_geomorphology.gdb), 2.) an Open Geospatial Consortium (OGC) geopackage, and 3.) a 2.2 KMZ/KML file for use in Google Earth; note, however, that this format version of the map is limited in the data layers presented and in access to GRI ancillary table information. The file geodatabase format is supported with 1.) an ArcGIS Pro map file (.mapx) (guis_geomorphology.mapx) and individual Pro layer (.lyrx) files (for each GIS data layer), as well as with 2.) a 10.1 ArcMap (.mxd) map document (guis_geomorphology.mxd) and individual 10.1 layer (.lyr) files (for each GIS data layer). The OGC geopackage is supported with a QGIS project (.qgz) file. Upon request, the GIS data is also available in ESRI 10.1 shapefile format. Contact Stephanie O'Meara (see contact information below) to acquire the GIS data in these GIS data formats. In addition to the GIS data and supporting GIS files, three additional files comprise a GRI digital geologic-GIS dataset or map: 1.) a GIS readme file (guis_geology_gis_readme.pdf); 2.) the GRI ancillary map information document (guis_geomorphology.pdf), which contains geologic unit descriptions, as well as other ancillary map information and graphics from the source map(s) used by the GRI in the production of the GRI digital geologic-GIS data for the park; and 3.) a user-friendly FAQ PDF version of the metadata (guis_geomorphology_metadata_faq.pdf). Please read guis_geology_gis_readme.pdf for information pertaining to the proper extraction of the GIS data and other map files. Google Earth software is available for free at: https://www.google.com/earth/versions/. QGIS software is available for free at: https://www.qgis.org/en/site/.
Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). For a complete listing of GRI products visit the GRI publications webpage: https://www.nps.gov/subjects/geology/geologic-resources-inventory-products.htm. For more information about the Geologic Resources Inventory Program visit the GRI webpage: https://www.nps.gov/subjects/geology/gri.htm. At the bottom of that webpage is a "Contact Us" link if you need additional information. You may also directly contact the program coordinator, Jason Kenworthy (jason_kenworthy@nps.gov). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (guis_geomorphology_metadata.txt or guis_geomorphology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:26,000 and United States National Map Accuracy Standards, features are within (horizontally) 13.2 meters or 43.3 feet of their actual location as presented by this dataset. Users should thus not assume that the location of features is exactly where they are portrayed in Google Earth, ArcGIS, QGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3 (available at: https://www.nps.gov/articles/gri-geodatabase-model.htm).
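The stated tolerance can be sanity-checked against the standard itself. As a rough illustration only (assuming the NMAS rule of 1/50 inch of horizontal error at map scale, which applies to maps published at scales smaller than 1:20,000):

```python
# Hypothetical check of the stated NMAS horizontal tolerance for a
# 1:26,000 source map. The 1/50 inch rule is from the United States
# National Map Accuracy Standards for scales smaller than 1:20,000.
INCH_M = 0.0254   # meters per inch
FOOT_M = 0.3048   # meters per foot

def nmas_tolerance_m(scale_denominator: float) -> float:
    """Horizontal tolerance in meters for a map at 1:scale_denominator."""
    return scale_denominator * INCH_M / 50.0

tol_m = nmas_tolerance_m(26_000)
tol_ft = tol_m / FOOT_M
print(round(tol_m, 1), round(tol_ft, 1))  # 13.2 43.3
```

The result matches the 13.2 m / 43.3 ft figure quoted in the metadata above.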
Earth Data Analysis Center (EDAC) at The University of New Mexico (UNM) develops, manages, and enhances the New Mexico Resource Geographic Information System (RGIS) Program and Clearinghouse. Nationally, NM RGIS is among the largest of state-based programs for digital geospatial data and information and continues to add to its data offerings, services, and technology.
The RGIS Program mission is to develop and expand geographic information and use of GIS technology, creating a comprehensive GIS resource for state and local governments, educational institutions, nonprofit organizations, and private businesses; to promote geospatial information and GIS technology as primary analytical tools for decision makers and researchers; and to provide a central Clearinghouse to avoid duplication and improve information transfer efficiency.
As a repository for digital geospatial data acquired from local and national public agencies and data created expressly for RGIS, the clearinghouse serves as a major hub in New Mexico’s network for inter-agency and intergovernmental coordination, data sharing, information transfer, and electronic communication. Data sets available for download include political and administrative boundaries, place names and locations, census data (current and historical), 30 years of digital orthophotography, 80 years of historic aerial photography, satellite imagery, elevation data, transportation data, wildfire boundaries and natural resource data.
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Input data for openSTARS.
A complete copy of the source files and sample data used during this workshop, arranged into a step-by-step tutorial series, can be obtained from the repository page on GitHub: https://esricanada-ce.github.io/r-arcgis-tutorials/
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Crowther_Nature_Files.zip This description pertains to the original download. Details on revised (newer) versions of the datasets are listed below. When more than one version of a file exists in Figshare, the original DOI will take users to the latest version, though each version technically has its own DOI. -- Two global maps (raster files) of tree density. These maps highlight how the number of trees varies across the world. One map was generated using biome-level models of tree density, and applied at the biome scale. The other map was generated using ecoregion-level models of tree density, and applied at the ecoregion scale. For this reason, transitions between biomes or between ecoregions may be unrealistically harsh, but large-scale estimates are robust (see Crowther et al 2015 and Glick et al 2016). At the outset, this study was intended to generate reliable estimates at broad spatial scales, which inherently comes at the cost of fine-scale precision. For this reason, country-scale (or larger) estimates are generally more robust than individual pixel-level estimates. Additionally, due to data limitations, estimates for Mangroves and Tropical coniferous forest (as identified by WWF and TNC) were generated using models constructed from Tropical moist broadleaf forest data and Temperate coniferous forest data, respectively. Because we used ecological analogy, the estimates for these two biomes should be considered less reliable than those of other biomes. These two maps initially appeared in Crowther et al (2015), with the biome map being featured more prominently. Explicit publication of the data is associated with Glick et al (2016). As they are produced, updated versions of these datasets, as well as alternative formats, will be made available under Additional Versions (see below).
Methods: We collected over 420,000 ground-sourced estimates of tree density from around the world. We then constructed linear regression models using vegetative, climatic, topographic, and anthropogenic variables to produce forest tree density estimates for all locations globally. All modeling was done in R. Mapping was done using R and ArcGIS 10.1.
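The actual regression modeling was done in R; as a purely illustrative sketch of the same idea (hypothetical covariates and coefficients, ordinary least squares via NumPy rather than the authors' R workflow):

```python
import numpy as np

# Illustrative sketch only: fit a linear model of tree density from
# environmental covariates, in the spirit of the regression models
# described above. Covariate names and coefficients are hypothetical.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    np.ones(n),                  # intercept
    rng.uniform(0, 30, n),       # e.g., mean annual temperature
    rng.uniform(100, 3000, n),   # e.g., annual precipitation
])
true_beta = np.array([50.0, -2.0, 0.3])
y = X @ true_beta                # noise-free for demonstration

# Ordinary least squares: recover the coefficients from (X, y)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))
```

With noise-free synthetic data, the fitted coefficients reproduce the generating ones exactly.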
Viewing Instructions: Load the files into an appropriate geographic information system (GIS). For the original download (ArcGIS geodatabase files), load the files into ArcGIS to view or export the data to other formats. Because these datasets are large and have a unique coordinate system that is not read by many GIS, we suggest loading them into an ArcGIS dataframe whose coordinate system matches that of the data (see File Format). For GeoTiff files (see Additional Versions), load them into any compatible GIS or image management program.
Comments: The original download provides a zipped folder that contains (1) an ArcGIS File Geodatabase (.gdb) containing one raster file for each of the two global models of tree density – one based on biomes and one based on ecoregions; (2) a layer file (.lyr) for each of the global models with the symbology used for each respective model in Crowther et al (2015); and (3) an ArcGIS Map Document (.mxd) that contains the layers and symbology for each map in the paper. The data is delivered in the Goode homolosine interrupted projected coordinate system that was used to compute biome, ecoregion, and global estimates of the number and density of trees presented in Crowther et al (2015). To obtain maps like those presented in the official publication, raster files will need to be reprojected to the Eckert III projected coordinate system. Details on subsequent revisions and alternative file formats are listed below under Additional Versions.
Additional Versions: Crowther_Nature_Files_Revision_01.zip contains tree density predictions for small islands that are not included in the data available in the original dataset. These predictions were not taken into consideration in production of maps and figures presented in Crowther et al (2015), with the exception of the values presented in Supplemental Table 2. The file structure follows that of the original data and includes both biome- and ecoregion-level models.
Crowther_Nature_Files_Revision_01_WGS84_GeoTiff.zip contains Revision_01 of the biome-level model, but stored in WGS84 and GeoTiff format. This file was produced by reprojecting the original Goode homolosine files to WGS84 using nearest neighbor resampling in ArcMap. All areal computations presented in the manuscript were computed using the Goode homolosine projection. This means that comparable computations made with projected versions of this WGS84 data are likely to differ (substantially at greater latitudes) as a product of the resampling. Included in this .zip file are the primary .tif and its visualization support files.
References:
Crowther, T. W., Glick, H. B., Covey, K. R., Bettigole, C., Maynard, D. S., Thomas, S. M., Smith, J. R., Hintler, G., Duguid, M. C., Amatulli, G., Tuanmu, M. N., Jetz, W., Salas, C., Stam, C., Piotto, D., Tavani, R., Green, S., Bruce, G., Williams, S. J., Wiser, S. K., Huber, M. O., Hengeveld, G. M., Nabuurs, G. J., Tikhonova, E., Borchardt, P., Li, C. F., Powrie, L. W., Fischer, M., Hemp, A., Homeier, J., Cho, P., Vibrans, A. C., Umunay, P. M., Piao, S. L., Rowe, C. W., Ashton, M. S., Crane, P. R., and Bradford, M. A. 2015. Mapping tree density at a global scale. Nature, 525(7568): 201-205. doi:10.1038/nature14967
Glick, H. B., Bettigole, C. B., Maynard, D. S., Covey, K. R., Smith, J. R., and Crowther, T. W. 2016. Spatially explicit models of global tree density. Scientific Data, 3(160069). doi:10.1038/sdata.2016.69
This dataset reflects reported incidents of crime (with the exception of murders where data exists for each victim) that occurred in the City of Chicago from 2001 to present, minus the most recent seven days. Data is extracted from the Chicago Police Department's CLEAR (Citizen Law Enforcement Analysis and Reporting) system. In order to protect the privacy of crime victims, addresses are shown at the block level only and specific locations are not identified. Should you have questions about this dataset, you may contact the Research & Development Division of the Chicago Police Department at 312.745.6071 or RandD@chicagopolice.org. Disclaimer: These crimes may be based upon preliminary information supplied to the Police Department by the reporting parties that have not been verified. The preliminary crime classifications may be changed at a later date based upon additional investigation and there is always the possibility of mechanical or human error. Therefore, the Chicago Police Department does not guarantee (either expressed or implied) the accuracy, completeness, timeliness, or correct sequencing of the information and the information should not be used for comparison purposes over time. The Chicago Police Department will not be responsible for any error or omission, or for the use of, or the results obtained from the use of this information. All data visualizations on maps should be considered approximate and attempts to derive specific addresses are strictly prohibited. The Chicago Police Department is not responsible for the content of any off-site pages that are referenced by or that reference this web page other than an official City of Chicago or Chicago Police Department web page. The user specifically acknowledges that the Chicago Police Department is not responsible for any defamatory, offensive, misleading, or illegal conduct of other users, links, or third parties and that the risk of injury from the foregoing rests entirely with the user. 
The unauthorized use of the words "Chicago Police Department," "Chicago Police," or any colorable imitation of these words or the unauthorized use of the Chicago Police Department logo is unlawful. This web page does not, in any way, authorize such use. Data is updated daily Tuesday through Sunday. The dataset contains more than 65,000 records/rows of data and cannot be viewed in full in Microsoft Excel. Therefore, when downloading the file, select CSV from the Export menu. Open the file in an ASCII text editor, such as Wordpad, to view and search. To access a list of Chicago Police Department - Illinois Uniform Crime Reporting (IUCR) codes, go to http://data.cityofchicago.org/Public-Safety/Chicago-Police-Department-Illinois-Uniform-Crime-R/c7ck-438e
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This software version has been superseded: please note a more recent version of the MCAS-S software is now available. See the ABARES website for details.

MCAS-S version 3.2

The Multi-Criteria Analysis Shell for Spatial Decision Support (MCAS-S) is a tool to view and combine mapped information. MCAS-S can inform spatial decision making and help with stakeholder engagement and communication. MCAS-S is powerful and easy to use. GIS (geographic information system) programming is not required, removing the usual technical obstacles to non-GIS users.

MCAS-S projects are:
• transparent - you can see all the inputs used to meet an objective and how these are combined
• flexible - you can use MCAS-S to compare options and explore trade-offs. You can use your own input data
• fast - you can immediately see changes to your objective when any input or combination method changes.

The new version 3.2 has:
• improved performance
• a user guide incorporated into the software
• live links to metadata
• more options for processing and analysing time series data
• simpler options for labelling and classifying data inputs.

MCAS-S 3.2 is made freely available with the support of the MCAS-S development partners: ABARES, the NSW Office of Environment and Heritage, Barry Consulting, the Australian Collaborative Land Use and Management Program, the National Environmental Research Program Landscapes and Policy Hub at University of Tasmania and the Terrestrial Ecosystems Research Network.
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
A continuous, multiscale stream steepness raster dataset produced for the INVAFISH project (Norges forskningsråd 243910) with the following script: connectivity.py
It covers Norway, Sweden and Finland. The stream network was derived with the GRASS GIS r.stream.extract module from a 10 m digital elevation model (DEM). Slope was calculated with the r.slope.direction module at 10, 30, 50, 70, 110, and 150 m steps following the direction of the stream network.
The resolution of the raster data follows the pixels of the underlying 10 m DEM. Raster values represent slope in degrees * 100; for example, a value of 732 corresponds to a slope of 7.32 degrees. Negative slope values indicate artifacts in the underlying DEM and occur where the r.stream.extract module had to hydrologically enforce overland flow through pits or over ridges.
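A minimal sketch of decoding these stored values, assuming the convention described above (stored value = degrees * 100, negative values flag DEM artifacts):

```python
# Decode a stored stream-steepness raster value into degrees of slope.
# Convention (from the dataset description): value = degrees * 100;
# negative values mark hydrologic-enforcement artifacts in the DEM.
def decode_slope(value: int):
    """Return slope in degrees, or None for artifact (negative) cells."""
    if value < 0:
        return None
    return value / 100.0

print(decode_slope(732))   # 7.32
print(decode_slope(-15))   # None (DEM artifact)
```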
Data format is LZW-compressed GeoTiff in EPSG: 25833 coordinate system.
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The Grid Garage Toolbox is designed to help you undertake the Geographic Information System (GIS) tasks required to process GIS data (geodata) into a standard, spatially aligned format. This format is required by most grid or raster spatial modelling tools, such as the Multi-Criteria Analysis Shell for Spatial Decision Support (MCAS-S). Grid Garage contains 36 tools designed to save you time by batch processing repetitive GIS tasks, as well as diagnosing problems with data and capturing a record of processing steps and any errors encountered.

Grid Garage provides tools that use a list-based approach to batch processing, where both inputs and outputs are specified in tables to enable selective batch processing and detailed result reporting. In many cases the tools simply extend the functionality of standard ArcGIS tools, providing some or all of the inputs required by these tools via the input table to enable batch processing on a 'per item' basis. This approach differs slightly from normal batch processing in ArcGIS: instead of manually selecting single items or a folder on which to apply a tool or model, you provide a table listing target datasets. In summary, Grid Garage allows you to:

* List, describe and manage very large volumes of geodata.
* Batch process repetitive GIS tasks, such as managing (renaming, describing etc.) or processing (clipping, resampling, reprojecting etc.) many geodata inputs, such as time-series geodata derived from satellite imagery or climate models.
* Record any errors when batch processing and diagnose errors by interrogating the input geodata that failed.
* Develop your own models in ArcGIS ModelBuilder that allow you to automate any GIS workflow utilising one or more of the Grid Garage tools, which can process an unlimited number of inputs.
* Automate the process of generating MCAS-S TIP metadata files for any number of input raster datasets.

The Grid Garage is intended for use by anyone with an understanding of GIS principles and an intermediate to advanced level of GIS skills. Using the Grid Garage tools in ArcGIS ModelBuilder requires skills in the use of the ArcGIS ModelBuilder tool.

Download Instructions: Create a new folder on your computer or network, then download and unzip the zip file from the GitHub Release page for each of the items in the 'Data and Resources' section below. There is a folder in each zip file that contains all the files. See the Grid Garage User Guide for instructions on how to install and use the Grid Garage Toolbox with the sample data provided.
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This seminar is an applied study of deep learning methods for extracting information from geospatial data, such as aerial imagery, multispectral imagery, digital terrain data, and other digital cartographic representations. We first provide an introduction and conceptualization of artificial neural networks (ANNs). Next, we explore appropriate loss and assessment metrics for different use cases, followed by the tensor data model, which is central to applying deep learning methods. Convolutional neural networks (CNNs) are then conceptualized with scene classification use cases. Lastly, we explore semantic segmentation, object detection, and instance segmentation. The primary focus of this course is semantic segmentation for pixel-level classification. The associated GitHub repo provides a series of applied examples. We hope to continue to add examples as methods and technologies further develop. These examples make use of a variety of datasets (e.g., SAT-6, topoDL, Inria, LandCover.ai, vfillDL, and wvlcDL). Please see the repo for links to the data and associated papers. All examples have associated videos that walk through the process, which are also linked to the repo. A variety of deep learning architectures are explored, including UNet, UNet++, DeepLabv3+, and Mask R-CNN. Currently, two examples use ArcGIS Pro and require no coding. The remaining five examples require coding and make use of PyTorch, Python, and R within the RStudio IDE. It is assumed that you have prior knowledge of coding in the Python and R environments. If you do not have experience coding, please take a look at our Open-Source GIScience and Open-Source Spatial Analytics (R) courses, which explore coding in Python and R, respectively. After completing this seminar you will be able to:

- explain how ANNs work, including weights, bias, activation, and optimization.
- describe and explain different loss and assessment metrics and determine appropriate use cases.
- use the tensor data model to represent data as input for deep learning.
- explain how CNNs work, including convolutional operations/layers, kernel size, stride, padding, max pooling, activation, and batch normalization.
- use PyTorch, Python, and R to prepare data, produce and assess scene classification models, and infer to new data.
- explain common semantic segmentation architectures, how these methods allow for pixel-level classification, and how they differ from traditional CNNs.
- use PyTorch, Python, and R (or ArcGIS Pro) to prepare data, produce and assess semantic segmentation models, and infer to new data.
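As a small illustration of the tensor data model mentioned above (shapes are hypothetical, and NumPy stands in for the PyTorch tensors the course actually uses):

```python
import numpy as np

# A single 3-band image chip read from disk often arrives in
# (height, width, channels) order; PyTorch expects (channels, height,
# width), and a mini-batch adds a leading batch dimension:
# (batch, channels, height, width).
chip_hwc = np.zeros((128, 128, 3), dtype=np.float32)
chip_chw = np.transpose(chip_hwc, (2, 0, 1))   # HWC -> CHW
batch = np.stack([chip_chw] * 8)               # mini-batch of 8 chips

print(chip_chw.shape, batch.shape)  # (3, 128, 128) (8, 3, 128, 128)
```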
The MS&R Plan identifies the general location and size of existing and proposed freeways, arterial and collector streets, future rights-of-way, setback requirements, typical intersections and cross sections, and gateway and scenic routes. The City’s Department of Transportation and the Planning and Development Services Department (PDSD) implement the MS&R Plan. The MS&R Plan is considered a Land Use Plan as defined in the Unified Development Code (UDC) Section 3.6, and, therefore, is subject to amendment in accordance with the standard Land Use Plan and Adoption and Amendment Procedures. The MS&R right-of-way lines are used in determining the setback for development through the MS&R Overlay provisions of the UDC. As stated in the current MS&R Plan, page 4, “The purpose of the Major Streets and Routes Plan is to facilitate future street widening, to inform the public which streets are the main thoroughfares, so that land use decisions can be based accordingly, and to reduce the disruption of existing uses on a property. By stipulating the required right-of-way, new development can be located so as to prepare for planned street improvements without demolition of buildings or loss of necessary parking.”PurposeThe major purposes of the Major Streets and Routes Plan are to identify street classifications, the width of public rights-of-way, to designate special routes, and to guide land use decisions. General Plan policies stipulate that planning and developing new transportation facilities be accomplished by identifying rights-of-way in the Major Streets and Routes Plan. 
The policies also aim to encourage bicycle and pedestrian travel, "minimize disruption of the environment," and "coordinate land use patterns with transportation plans" by using the street classification as a guide to land use decisions.

Dataset Classification: Level 0 - Open
Known Uses: This layer is intended to be used in the Open Data portal and not for regular use in ArcGIS Online and ArcGIS Enterprise.
Known Errors: (not provided)
Data Contact: (not provided)
Update Frequency: As needed
Historical ownership data of GIS by Delaney Dennis R
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
scripts.zip
arcgisTools.atbx:
- terrainDerivatives: make terrain derivatives from a digital terrain model (Band 1 = TPI (50 m radius circle), Band 2 = square root of slope, Band 3 = TPI (annulus), Band 4 = hillshade, Band 5 = multidirectional hillshades, Band 6 = slopeshade).
- rasterizeFeatures: convert vector polygons to raster masks (1 = feature, 0 = background).
- makeChips.R: R function to break terrain derivatives and chips into image chips of a defined size.
- makeTerrainDerivatives.R: R function to generate 6-band terrain derivatives from digital terrain data (same as the ArcGIS Pro tool).
- merge_logs.R: R script to merge training logs into a single file.
- predictToExtents.ipynb: Python notebook to use a trained model to predict to new data.
- trainExperiments.ipynb: Python notebook used to train semantic segmentation models using PyTorch and the Segmentation Models package.
- assessmentExperiments.ipynb: Python code to generate assessment metrics using PyTorch and the torchmetrics library.
- graphs_results.R: R code to make graphs with ggplot2 to summarize results.
- makeChipsList.R: R code to generate lists of chips in a directory.
- makeMasks.R: R function to make raster masks from vector data (same as the rasterizeFeatures ArcGIS Pro tool).
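The chipping idea behind makeChips.R can be sketched in Python as follows; the function name, array shapes, and chip size here are our own illustration, not code from the repository:

```python
import numpy as np

# Cut a raster array and its mask into non-overlapping square chips of
# a fixed size, roughly analogous to the makeChips.R workflow.
def make_chips(image: np.ndarray, mask: np.ndarray, size: int):
    """Yield (image_chip, mask_chip) pairs covering the raster."""
    h, w = image.shape[:2]
    for r in range(0, h - size + 1, size):
        for c in range(0, w - size + 1, size):
            yield image[r:r+size, c:c+size], mask[r:r+size, c:c+size]

img = np.zeros((512, 512, 6), dtype=np.float32)  # 6-band terrain derivatives
msk = np.zeros((512, 512), dtype=np.uint8)       # 1 = feature, 0 = background
chips = list(make_chips(img, msk, 128))
print(len(chips))  # 16 (a 4 x 4 grid of 128-pixel chips)
```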
terraceDL.zip
- dems: LiDAR DTM data partitioned into training, testing, and validation datasets based on HUC8 watershed boundaries. Original DTM data were provided by the Iowa BMP mapping project: https://www.gis.iastate.edu/BMPs.
- extents: extents of the training, testing, and validation areas as defined by HUC8 watershed boundaries.
- vectors: vector features representing agricultural terraces, partitioned into separate training, testing, and validation datasets. Original digitized features were provided by the Iowa BMP mapping project: https://www.gis.iastate.edu/BMPs.
Hydrologic models are growing in complexity: spatial representations, model coupling, process representations, software structure, etc. New and emerging datasets are also growing, supporting even more detailed modeling use cases. This complexity is contributing to a reproducibility crisis in hydrologic modeling and analysis. We argue that moving hydrologic modeling to the cloud can help address this crisis. We created two notebooks: 1.) the first demonstrates the process of collecting and manipulating GIS and time-series data using GRASS GIS, Python and R to create RHESSys model input; 2.) the second demonstrates the process of model compilation, simulation, and visualization.
The first notebook includes:
The second notebook includes:
Attribution 4.0 (CC BY 4.0)
https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Historical holdings data showing quarterly positions, market values, shares held, and portfolio percentages for GIS held by Delaney Dennis R from Q1 2016 to Q1 2025
CC0 1.0 Universal Public Domain Dedication
https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Summary of topics to be covered in an ideal workshop as identified by workshop applicants in the workshop call for participation. We incorporated as many as possible that also fit our scope.
Attribution 3.0 (CC BY 3.0)
https://creativecommons.org/licenses/by/3.0/
License information was derived automatically
The SR43-44 dataset is a topographic dataset detailing features within Princess Elizabeth Land, more specifically along the Ingrid Christensen Coast. The area includes Prydz Bay and the Amery Ice Shelf. The database contains all natural features; attributes are held for line, point and polygon features. The dataset conforms to the Australian Antarctic Spatial model. The dataset was originally produced as a base to supply data for the second edition hard copy map series. It was updated in 2001/02 with the integration of data from the AAT Coastline 2001 dataset.
This dataset comprises a series of geotiff grids of modelled solar radiation (Wh m-2 day-1) for a portion of the Western Antarctic Peninsula. The grids were generated using the r.sun module in GRASS GIS. In addition to a geotiff grid representing the average daily global horizontal irradiance for an entire year, the dataset also includes geotiffs containing daily values of direct beam irradiance, diffuse irradiance, ground reflected irradiance, and global (total) irradiance (all in Wh m-2 day-1), as well as insolation time (hours). This dataset was created in support of projects ANT-1744550, -1744570, -1744584, and -1744602.
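As a hedged sketch of how the component grids relate (in r.sun, global horizontal irradiance is the sum of the direct beam, diffuse, and ground-reflected components), with tiny arrays standing in for the GeoTIFF grids:

```python
import numpy as np

# Toy 1 x 2 grids standing in for the daily component GeoTIFFs
# (values in Wh m-2 day-1; numbers are invented for illustration).
direct = np.array([[4200.0, 3900.0]])     # direct beam irradiance
diffuse = np.array([[1100.0, 1300.0]])    # diffuse irradiance
reflected = np.array([[150.0, 120.0]])    # ground reflected irradiance

# Global (total) horizontal irradiance is the cell-wise sum.
global_irr = direct + diffuse + reflected
print(global_irr)  # [[5450. 5320.]]
```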