This dataset provides a shapefile of outlines of the 68 lakes for which temperature was modeled as part of this study. The format is a shapefile for all lakes combined (.shp, .shx, .dbf, and .prj files). This dataset is part of a larger data release of lake temperature model inputs and outputs for 68 lakes in the U.S. states of Minnesota and Wisconsin (http://dx.doi.org/10.5066/P9AQPIVD).
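As a minimal sketch of working with such a release in Python (the file name below is an assumption; use whatever name the download provides), the combined outlines can be loaded with GeoPandas:

```python
# Minimal sketch: load the combined lake-outline shapefile with GeoPandas.
# "lake_outlines.shp" is a placeholder name; the .shx/.dbf/.prj sidecar files
# are read automatically as long as they sit next to the .shp file.
import geopandas as gpd

lakes = gpd.read_file("lake_outlines.shp")
print(len(lakes), "lake polygons")   # expected to be 68 for this release
print(lakes.crs)                     # coordinate reference system from the .prj file
```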
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In order to use the standard color legend for Romanian soil type maps in the ESRI ArcMap-10 electronic format, a dataset consisting of a shapefile set (.dbf, .shp, .shx, .sbn, and .sbx files), four different .lyr files, and three different .style files has been prepared (ESRI, 2016). The shapefile set is not a “real” georeferenced layer/coverage; it is designed only to handle all the instances of soil types from the standard legend. This legend contains 67 standard items: 63 proper colors (different color hues, each generally with 2-4 degrees of lightness and/or chroma, plus four shades of grey and white), and four hatching patterns on white background (ESRI, 2016). The “color difference ΔE*ab” between any two legend colors, calculated with the perceptually uniform CIELAB color model, is greater than 10 units, thus ensuring acceptably distinguishable colors in the legend. The 67 standard items are assigned to 60 main soils existing in Romania, four main nonsoils, and three special cases of unsurveyed land. The soils are specified in terms of the current Romanian system of soil taxonomy, SRTS-2012+, and of the international soil classification system WRB-2014. The four different .lyr files presented here are: legend_soilcode_srts_wrb.lyr, legend_soilcode_wrb.lyr, legend_colourcode_srts_wrb.lyr, and legend_colourcode_wrb.lyr. The first two are built using the ‘Soil_codes’ field as the value field and, as labels (explanation texts), the ‘Soil_name’ field (the soil type according to the SRTS/WRB classification) and the ‘WRB’ field (the soil type according to the WRB classification), respectively; the last two .lyr files are built using the ‘colour_code’ field (storing the color codes) as the value field and, as labels, the soil name in the SRTS and WRB classifications and in the WRB classification, respectively. In order to exemplify how the legend is displayed, two .jpg files are also presented: legend_soil_srts_wrb.jpg and legend_colour_wrb.jpg. The first displays the legend (symbols and labels) according to the SRTS classification order, the second according to the WRB classification. The three different .style files presented here are: soil_symbols.style, wrb_codes.style, and colour_codes.style. They use as symbol names the soil acronym in the SRTS classification, the soil acronym in the WRB classification, and the color code, respectively.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In order to use the Romanian color standard for soil type map legends, a dataset of ESRI ArcMap-10 files, consisting of a shapefile set (.dbf, .shp, .shx, .sbn, and .sbx files), four different .lyr files, and three different .style files (https://desktop.arcgis.com/en/arcmap/10.3/map/ : saving-layers-and-layer-packages, about-creating-new-symbols, what-are-symbols-and-styles-), has been prepared. The shapefile set is not a “real” georeferenced layer/coverage; it is designed only to handle all the instances of soil types from the standard legend.
This legend contains 67 standard items: 63 proper colors (different color hues, each generally with 2-4 degrees of lightness and/or chroma, plus four shades of grey and white), and four hatching patterns on white background. The “color difference ΔE*ab” between any two legend colors, calculated with the perceptually uniform CIELAB color model, is greater than 10 units, thus ensuring acceptably distinguishable colors in the legend. The 67 standard items are assigned to 60 main soils existing in Romania, four main nonsoils, and three special cases of unsurveyed land. The soils are specified in terms of the current Romanian system of soil taxonomy, SRTS-2012+, and of the international system WRB-2014.
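For reference, the color-difference criterion above is the CIE76 formula, ΔE*ab = sqrt((ΔL*)² + (Δa*)² + (Δb*)²). A small sketch follows, using hypothetical sample colors rather than values taken from the legend:

```python
# CIE76 color difference between two CIELAB colors; legend colors are deemed
# acceptably distinguishable when the difference exceeds 10 units.
import math

def delta_e_ab(lab1, lab2):
    """Return sqrt((dL*)^2 + (da*)^2 + (db*)^2) for two (L*, a*, b*) triples."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

color_a = (62.0, 18.0, 35.0)   # hypothetical legend color
color_b = (55.0, 10.0, 30.0)   # hypothetical legend color

print(delta_e_ab(color_a, color_b) > 10)   # True -> acceptably distinguishable
```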
The four different .lyr files presented here are: legend_soilcode_srts_wrb.lyr, legend_soilcode_wrb.lyr, legend_colorcode_srts_wrb.lyr, and legend_colorcode_wrb.lyr. The first two are built using the “Soil_codes” field as the value field and, as labels (explanation texts), the “Soil_name” field (the soil type according to the SRTS/WRB classification) and the “WRB” field (the soil type according to the WRB classification), respectively; the last two .lyr files are built using the “color_code” field (storing the color codes) as the value field and, as labels, the soil name in the SRTS and WRB classifications and in the WRB classification, respectively.
In order to exemplify how the legend is displayed, two .jpg files are also presented: legend_soil_srts_wrb.jpg and legend_color_wrb.jpg. The first displays the legend (symbols and labels) according to the SRTS classification order, the second according to the WRB classification.
The three different .style files presented here are: soil_symbols.style, wrb_codes.style, and color_codes.style. They use as symbol names the soil acronym in the SRTS classification, the soil acronym in the WRB classification, and the color code, respectively.
The presented file set may be used to directly implement the Romanian color standard in digital soil type map legends, or may be adjusted/modified to other specific requirements.
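As one possible way to apply the legend in a scripted workflow (a sketch only: it assumes ArcMap 10.x with arcpy, a soil polygon shapefile that carries a matching ‘Soil_codes’ field, and illustrative file paths):

```python
# Sketch: transfer the standard legend symbology from one of the provided
# .lyr files onto a soil-type polygon layer (paths are placeholders).
import arcpy

soil_fc = r"C:\data\soil_map.shp"                        # hypothetical soil map
legend_lyr = r"C:\legend\legend_soilcode_srts_wrb.lyr"   # .lyr from this dataset

arcpy.MakeFeatureLayer_management(soil_fc, "soil_lyr")
arcpy.ApplySymbologyFromLayer_management("soil_lyr", legend_lyr)

# Save the symbolized layer for reuse in other map documents.
arcpy.SaveToLayerFile_management("soil_lyr", r"C:\data\soil_map_symbolized.lyr")
```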
GIS Grid Files For The Sampling (NCEI Accession 0169400 and NCEI Accession 0169401):
DRTO_Grid.dbf | 2017-05-12 10:32 | 2.2M
DRTO_Grid.prj | 2017-05-12 10:29 | 424
DRTO_Grid.sbn | 2017-05-12 10:29 | 291K
DRTO_Grid.sbx | 2017-05-12 10:29 | 17K
DRTO_Grid.shp | 2017-05-12 10:29 | 4.0M
DRTO_Grid.shp.xml | 2017-12-18 11:07 | 21K
DRTO_Grid.shx | 2017-05-12 10:29 | 243K
FlaKeys_Grid.dbf | 2017-05-17 09:19 | 16M
FlaKeys_Grid.prj | 2017-05-17 09:01 | 424
FlaKeys_Grid.sbn | 2017-05-17 09:01 | 666K
FlaKeys_Grid.sbx | 2017-05-17 09:01 | 10K
FlaKeys_Grid.shp | 2017-05-17 09:18 | 8.8M
FlaKeys_Grid.shp.xml | 2017-12-18 11:07 | 9.1K
FlaKeys_Grid.shx | 2017-05-17 09:18 | 528K
SEFCRI_Grid_100m.dbf | 2017-05-17 13:08 | 2.1M
SEFCRI_Grid_100m.prj | 2017-05-17 13:08 | 424
SEFCRI_Grid_100m.sbn | 2017-05-17 13:08 | 224K
SEFCRI_Grid_100m.sbx | 2017-05-17 13:08 | 5.5K
SEFCRI_Grid_100m.shp | 2017-05-17 13:08 | 3.1M
SEFCRI_Grid_100m.shp.xml | 2017-12-18 11:07 | 8.4K
SEFCRI_Grid_100m.shx | 2017-05-17 13:08 | 188K
The Ontario government generates and maintains thousands of datasets. Since 2012, we have shared data with Ontarians via a data catalogue. Open data is data that is shared with the public. Click here to learn more about open data and why Ontario releases it. Ontario’s Open Data Directive states that all data must be open, unless there is good reason for it to remain confidential. Ontario’s Chief Digital and Data Officer also has the authority to make certain datasets available publicly. Datasets listed in the catalogue that are not open will have one of the following labels:

If you want to use data you find in the catalogue, that data must have a licence – a set of rules that describes how you can use it. A licence: Most of the data available in the catalogue is released under Ontario’s Open Government Licence. However, each dataset may be shared with the public under other kinds of licences or no licence at all. If a dataset doesn’t have a licence, you don’t have the right to use the data. If you have questions about how you can use a specific dataset, please contact us.

The Ontario Data Catalogue endeavours to publish open data in a machine-readable format. For machine-readable datasets, you can simply retrieve the file you need using the file URL. The Ontario Data Catalogue is built on CKAN, which means the catalogue has the following features you can use when building applications. APIs (application programming interfaces) let software applications communicate directly with each other. If you are using the catalogue in a software application, you might want to extract data from the catalogue through the catalogue API. Note: all Datastore API requests to the Ontario Data Catalogue must be made server-side. The catalogue's collection of dataset metadata (and dataset files) is searchable through the CKAN API. The Ontario Data Catalogue has more than just CKAN's documented search fields; you can also search these custom fields. You can also use the CKAN API to retrieve metadata about a particular dataset and check for updated files. Read the complete documentation for CKAN's API. Some of the open data in the Ontario Data Catalogue is available through the Datastore API. You can also search and access the machine-readable open data that is available in the catalogue. How to use the API feature: read the complete documentation for CKAN's Datastore API.

The Ontario Data Catalogue contains a record for each dataset that the Government of Ontario possesses. Some of these datasets will be available to you as open data. Others will not be available to you. This is because the Government of Ontario is unable to share data that would break the law or put someone's safety at risk. You can search for a dataset with a word that might describe a dataset or topic. Use words like “taxes” or “hospital locations” to discover what datasets the catalogue contains. You can search for a dataset from 3 spots on the catalogue: the homepage, the dataset search page, or the menu bar available across the catalogue. On the dataset search page, you can also filter your search results. You can select filters on the left-hand side of the page to limit your search for datasets with your favourite file format, datasets that are updated weekly, datasets released by a particular organization, or datasets that are released under a specific licence. Go to the dataset search page to see the filters that are available to make your search easier.
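As a minimal server-side sketch of the CKAN API calls described above (the catalogue host and the search term are assumptions for illustration; consult the CKAN documentation linked in the catalogue for the full set of parameters):

```python
# Sketch: query the catalogue's CKAN API from a server-side script.
import requests

BASE = "https://data.ontario.ca"   # assumed catalogue host

# Full-text search over dataset metadata (CKAN package_search action).
resp = requests.get(f"{BASE}/api/3/action/package_search",
                    params={"q": "hospital locations", "rows": 5})
resp.raise_for_status()
for package in resp.json()["result"]["results"]:
    print(package["title"])

# For datasets exposed through the Datastore API, rows can be fetched by
# resource id (placeholder id shown; replace with a real one).
rows = requests.get(f"{BASE}/api/3/action/datastore_search",
                    params={"resource_id": "REPLACE-WITH-RESOURCE-ID",
                            "limit": 10}).json()
```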
You can also do a quick search by selecting one of the catalogue’s categories on the homepage. These categories can help you see the types of data we have on key topic areas. When you find the dataset you are looking for, click on it to go to the dataset record. Each dataset record will tell you whether the data is available, and, if so, tell you about the data available. An open dataset might contain several data files. These files might represent different periods of time, different sub-sets of the dataset, different regions, language translations, or other breakdowns. You can select a file and either download it or preview it. Make sure to read the licence agreement to make sure you have permission to use it the way you want. Read more about previewing data. A non-open dataset may not be available for many reasons. Read more about non-open data. Read more about restricted data. Data that is non-open may still be subject to freedom of information requests.

The catalogue has tools that enable all users to visualize the data in the catalogue without leaving the catalogue – no additional software needed. Have a look at our walk-through of how to make a chart in the catalogue.

Get automatic notifications when datasets are updated. You can choose to get notifications for individual datasets, an organization’s datasets or the full catalogue. You don’t have to provide any personal information – just subscribe to our feeds using any feed reader you like, using the corresponding notification web addresses. Copy those addresses and paste them into your reader. Your feed reader will let you know when the catalogue has been updated.

The catalogue provides open data in several file formats (e.g., spreadsheets, geospatial data, etc.). Learn about each format and how you can access and use the data each file contains.

A file that has a list of items and values separated by commas without formatting (e.g. colours, italics, etc.) or extra visual features. This format provides just the data that you would display in a table. XLSX (Excel) files may be converted to CSV so they can be opened in a text editor. How to access the data: Open with any spreadsheet software application (e.g., Open Office Calc, Microsoft Excel) or text editor. Note: This format is considered machine-readable; it can be easily processed and used by a computer. Files that have visual formatting (e.g. bolded headers and colour-coded rows) can be hard for machines to understand; these elements make a file more human-readable and less machine-readable.

A file that provides information without formatted text or extra visual features that may not follow a pattern of separated values like a CSV. How to access the data: Open with any word processor or text editor available on your device (e.g., Microsoft Word, Notepad).

A spreadsheet file that may also include charts, graphs, and formatting. How to access the data: Open with a spreadsheet software application that supports this format (e.g., Open Office Calc, Microsoft Excel). Data can be converted to a CSV for a non-proprietary format of the same data without formatted text or extra visual features.

A shapefile provides geographic information that can be used to create a map or perform geospatial analysis based on location, points/lines and other data about the shape and features of the area. It includes required files (.shp, .shx, .dbf) and might include corresponding files (e.g., .prj). How to access the data: Open with a geographic information system (GIS) software program (e.g., QGIS).
A package of files and folders. The package can contain any number of different file types. How to access the data: Open with an unzipping software application (e.g., WinZIP, 7Zip). Note: If a ZIP file contains .shp, .shx, and .dbf file types, it is an ArcGIS ZIP: a package of shapefiles which provide information to create maps or perform geospatial analysis that can be opened with ArcGIS (a geographic information system software program).

A file that provides information related to a geographic area (e.g., phone number, address, average rainfall, number of owl sightings in 2011, etc.) and its geospatial location (i.e., points/lines). How to access the data: Open using a GIS software application to create a map or do geospatial analysis. It can also be opened with a text editor to view raw information. Note: This format is machine-readable, and it can be easily processed and used by a computer. Human-readable data (including visual formatting) is easy for users to read and understand.

A text-based format for sharing data in a machine-readable way that can store data with more unconventional structures such as complex lists. How to access the data: Open with any text editor (e.g., Notepad) or access through a browser. Note: This format is machine-readable, and it can be easily processed and used by a computer. Human-readable data (including visual formatting) is easy for users to read and understand.

A text-based format to store and organize data in a machine-readable way that can store data with more unconventional structures (not just data organized in tables). How to access the data: Open with any text editor (e.g., Notepad). Note: This format is machine-readable, and it can be easily processed and used by a computer. Human-readable data (including visual formatting) is easy for users to read and understand.

A file that provides information related to an area (e.g., phone number, address, average rainfall, number of owl sightings in 2011, etc.) and its geospatial location (i.e., points/lines). How to access the data: Open with a geospatial software application that supports the KML format (e.g., Google Earth). Note: This format is machine-readable, and it can be easily processed and used by a computer. Human-readable data (including visual formatting) is easy for users to read and understand.

This format contains files with data from tables used for statistical analysis and data visualization of Statistics Canada census data. How to access the data: Open with the Beyond 20/20 application.

A database which links and combines data from different files or applications (including HTML, XML, Excel, etc.). The database file can be converted to a CSV/TXT to make the data machine-readable, but human-readable formatting will be lost. How to access the data: Open with Microsoft Office Access (a database management system used to develop application software).

A file that keeps the original layout and
This dataset provides shapefile outlines of the 7,150 lakes for which temperature was modeled as part of this study. The format is a shapefile for all lakes combined (.shp, .shx, .dbf, and .prj files). A CSV file of lake metadata is also included. This dataset is part of a larger data release of lake temperature model inputs and outputs for 7,150 lakes in the U.S. states of Minnesota and Wisconsin (http://dx.doi.org/10.5066/P9CA6XP8).
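A short sketch of attaching the metadata CSV to the outlines (file names and the join column are assumptions; check the release files for the actual identifier field):

```python
# Sketch: join lake metadata onto the outline polygons by a shared lake id.
import geopandas as gpd
import pandas as pd

lakes = gpd.read_file("lake_outlines.shp")            # placeholder file name
meta = pd.read_csv("lake_metadata.csv")               # placeholder file name
lakes = lakes.merge(meta, on="site_id", how="left")   # "site_id" is hypothetical
```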
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Update information can be found within the layer’s attributes and in a table on the Utah Parcel Data webpage under LIR Parcels. In Spring of 2016, the Land Information Records work group, an informal committee organized by the Governor’s Office of Management and Budget’s State Planning Coordinator, produced recommendations for expanding the sharing of GIS-based parcel information. Participants in the LIR work group included representatives from county, regional, and state government, including the Utah Association of Counties (County Assessors and County Recorders), Wasatch Front Regional Council, Mountainland and Bear River AOGs, Utah League of Cities and Towns, UDOT, DNR, AGRC, the Division of Emergency Management, Blue Stakes, economic developers, and academic researchers. The LIR work group’s recommendations set the stage for voluntary sharing of additional objective/quantitative parcel GIS data, primarily around tax assessment-related information. Specifically, the recommendations document establishes objectives, principles (including the role of local and state government), data content items, expected users, and a general process for data aggregation and publishing. An important realization made by the group was that ‘parcel data’ or ‘parcel record’ products have a different meaning to different users and data stewards. The LIR group focused, specifically, on defining a data sharing recommendation around a tax year parcel GIS data product, aligned with the finalization of the property tax roll by County Assessors on May 22nd of each year. The LIR recommendations do not impact the periodic sharing of basic parcel GIS data (boundary, ID, address) from the County Recorders to AGRC per 63F-1-506 (3.b.vi). Both the tax year parcel and the basic parcel GIS layers are designed for general purpose uses, and are not substitutes for researching and obtaining the most current, legal land records information on file in County records. This document, below, proposes a schedule, guidelines, and process for assembling county parcel and assessment data into an annual, statewide tax parcel GIS layer. gis.utah.gov/data/sgid-cadastre/ It is hoped that this new expanded parcel GIS layer will be put to immediate use supporting the best possible outcomes in public safety, economic development, transportation, planning, and the provision of public services. Another aim of the work group was to improve the usability of the data, through development of content guidelines and consistent metadata documentation, and the efficiency with which the data sharing is distributed.

GIS Layer Boundary Geometry:
GIS Format Data Files: Ideally, Tax Year Parcel data should be provided in a shapefile (please include the .shp, .shx, .dbf, .prj, and .xml component files) or file geodatabase format. An empty shapefile and file geodatabase schema are available for download at: At the request of a county, AGRC will provide technical assistance to counties to extract, transform, and load parcel and assessment information into the GIS layer format.

Geographic Coverage: Tax year parcel polygons should cover the area of each county for which assessment information is created and digital parcels are available. Full coverage may not be available yet for each county. The county may provide parcels that have been adjusted to remove gaps and overlaps for administrative tax purposes or parcels that retain these expected discrepancies that take their source from the legally described boundary or the process of digital conversion.
The diversity of topological approaches will be noted in the metadata.

One Tax Parcel Record Per Unique Tax Notice: Some counties produce an annual tax year parcel GIS layer with one parcel polygon per tax notice. In some cases, adjacent parcel polygons that compose a single taxed property must be merged into a single polygon. This is the goal for the statewide layer but may not be possible in all counties. AGRC will provide technical support to counties, where needed, to merge GIS parcel boundaries into the best format to match with the annual assessment information.

Standard Coordinate System: Parcels will be loaded into Utah’s statewide coordinate system, Universal Transverse Mercator coordinates (NAD83, Zone 12 North). However, boundaries stored in other industry standard coordinate systems will be accepted if they are both defined within the data file(s) and documented in the metadata (see below).

Descriptive Attributes:
Database Field/Column Definitions: The table below indicates the field names and definitions for attributes requested for each Tax Parcel Polygon record.

FIELD NAME | FIELD TYPE, LENGTH | DESCRIPTION | EXAMPLE
SHAPE (expected) | Geometry, n/a | The boundary of an individual parcel or merged parcels that corresponds with a single county tax notice | ex. polygon boundary in UTM NAD83 Zone 12 N or other industry standard coordinates including state plane systems
COUNTY_NAME | Text, 20 | County name including spaces | ex. BOX ELDER
COUNTY_ID (expected) | Text, 2 | County ID number | ex. Beaver = 1, Box Elder = 2, Cache = 3, ..., Weber = 29
ASSESSOR_SRC (expected) | Text, 100 | Website URL, will be to the County Assessor in most all cases | ex. webercounty.org/assessor
BOUNDARY_SRC (expected) | Text, 100 | Website URL, will be to the County Recorder in most all cases | ex. webercounty.org/recorder
DISCLAIMER (added by State) | Text, 50 | Disclaimer URL | ex. gis.utah.gov...
CURRENT_ASOF (expected) | Date | Parcels current as of date | ex. 01/01/2016
PARCEL_ID (expected) | Text, 50 | County designated unique ID number for individual parcels | ex. 15034520070000
PARCEL_ADD (expected, where available) | Text, 100 | Parcel’s street address location, usually the address at recordation | ex. 810 S 900 E #304 (example for a condo)
TAXEXEMPT_TYPE (expected) | Text, 100 | Primary category of granted tax exemption | ex. None, Religious, Government, Agriculture, Conservation Easement, Other Open Space, Other
TAX_DISTRICT (expected, where applicable) | Text, 10 | The coding the county uses to identify a unique combination of property tax levying entities | ex. 17A
TOTAL_MKT_VALUE (expected) | Decimal | Total market value of the parcel's land, structures, and other improvements as determined by the Assessor for the most current tax year | ex. 332000
LAND_MKT_VALUE (expected) | Decimal | The market value of the parcel's land as determined by the Assessor for the most current tax year | ex. 80600
PARCEL_ACRES (expected) | Decimal | Parcel size in acres | ex. 20.360
PROP_CLASS (expected) | Text, 100 | Residential, Commercial, Industrial, Mixed, Agricultural, Vacant, Open Space, Other | ex. Residential
PRIMARY_RES (expected) | Text, 1 | Is the property a primary residence: 'Y'(es), 'N'(o), or 'U'(nknown) | ex. Y
HOUSING_CNT (expected, where applicable) | Text, 10 | Number of housing units, can be a single number or a range like '5-10' | ex. 1
SUBDIV_NAME (optional) | Text, 100 | Subdivision name if applicable | ex. Highland Manor Subdivision
BLDG_SQFT (expected, where applicable) | Integer | Square footage of primary bldg(s) | ex. 2816
BLDG_SQFT_INFO (expected, where applicable) | Text, 100 | Note for how building square footage is counted by the County | ex. Only finished above and below grade areas are counted.
FLOORS_CNT (expected, where applicable) | Decimal | Number of floors as reported in county records | ex. 2
FLOORS_INFO (expected, where applicable) | Text, 100 | Note for how floors are counted by the County | ex. Only above grade floors are counted
BUILT_YR (expected, where applicable) | Short | Estimated year of initial construction of primary buildings | ex. 1968
EFFBUILT_YR (optional, where applicable) | Short | The 'effective year built' of primary buildings that factors in updates after construction | ex. 1980
CONST_MATERIAL (optional, where applicable) | Text, 100 | Construction material types; values for this field are expected to vary greatly by county | ex. Wood Frame, Brick, etc.

Contact: Sean Fernandez, Cadastral Manager (email: sfernandez@utah.gov; office phone: 801-209-9359)
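As a rough illustration of the field schema requested above (not the official empty schema that AGRC provides for download), a shapefile with a subset of the fields could be created with Fiona; note that DBF field names are limited to 10 characters, so longer names are shortened here:

```python
# Sketch: create an empty tax-parcel shapefile with a subset of the requested
# fields, in Utah's statewide coordinate system (NAD83 / UTM zone 12N).
import fiona
from fiona.crs import from_epsg

schema = {
    "geometry": "Polygon",
    "properties": {
        "COUNTY_NAM": "str:20",    # COUNTY_NAME, shortened to 10 characters
        "COUNTY_ID":  "str:2",
        "PARCEL_ID":  "str:50",
        "PARCEL_ADD": "str:100",
        "TOTAL_MKT":  "float",     # TOTAL_MKT_VALUE
        "LAND_MKT":   "float",     # LAND_MKT_VALUE
        "PARCEL_ACR": "float",     # PARCEL_ACRES
        "PROP_CLASS": "str:100",
        "BUILT_YR":   "int",
    },
}

with fiona.open("tax_year_parcels.shp", "w", driver="ESRI Shapefile",
                schema=schema, crs=from_epsg(26912)) as dst:
    pass  # parcel records would be written here
```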
The Randolph Glacier Inventory (RGI) is a globally complete inventory of glacier outlines. It is supplemental to the database compiled by the Global Land Ice Measurements from Space initiative (GLIMS). While GLIMS is a multi-temporal database with an extensive set of attributes, the RGI is intended to be a snapshot of the world’s glaciers as they were near the beginning of the 21st century (although in fact its range of dates is still substantial). Production of the RGI was motivated by the preparation of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5).
Version 1.0 of the RGI was released in February 2012. It included a considerable number of unsubdivided ice bodies, which we refer to as glacier complexes, and a considerable number of nominal glaciers, which are glaciers for which only a location and an area are known; they are represented by circles of the appropriate area at the given location. Version 6.0, released in July 2017, has improved coverage of the conterminous US (regions 02-05 and 02-06), Scandinavia (region 08) and Iran (region 12-2). In Scandinavia several hundred smaller glaciers have been added and most glaciers now have exact dates. The flag attributes RGIFlag and GlacType were reorganized. Surging codes have been added from Sevestre and Benn (2015).
For version 1.0, we visualized the data in a geographic information system by overlaying outlines on modern satellite imagery, and assessed their quality relative to other available products. In several regions the outlines already in GLIMS were used for the RGI. Data from the World Glacier Inventory (WGI, http://nsidc.org/data/docs/noaa/g01130_glacier_inventory/; WGI, 1989) and the related WGI-XF (http://people.trentu.ca/~gcogley/glaciology; Cogley, 2009) were used for some nominal glaciers, mainly in the Pyrenees and in northern Asia. Where no other data were available we relied on data from the Digital Chart of the World (Danko, 1992).
The RGI is provided as shapefiles containing the outlines of glaciers in geographic coordinates (longitude and latitude, in degrees) which are referenced to the WGS84 datum. Data are organized by first-order region. For each region there is one shapefile (.SHP with accompanying .DBF, .PRJ and .SHX files) containing all glaciers and one ancillary .CSV file containing all hypsometric data. The attribute (.DBF) and hypsometric files contain one record per glacier. Each object in the RGI conforms to the data-model conventions of ESRI ArcGIS shapefiles. That is, each object consists of an outline encompassing the glacier, followed immediately by outlines representing all of its nunataks (ice-free areas enclosed by the glacier). In each object successive vertices are ordered such that glacier ice is on the right. This data model is not the same as the current GLIMS data model, in which nunataks are independent objects. The outlines of the RGI regions are provided as two shapefiles, one for first-order and one for second-order regions. A summary file containing glacier counts, glacierized area and a hypsometric list for each first-order and each second-order region is also provided. The 0.5°×0.5° grid is provided as a plain-text .DAT file in which zonal records of blank-separated glacierized areas in km² are ordered from north to south. Information about RGI glaciers that are present in the mass balance tables of the WGMS database Fluctuations of Glaciers is provided as an ancillary .CSV file. The 19 regional attribute (.DBF) files are also provided in .CSV format.

References
RGI Consortium (2017). Randolph Glacier Inventory (RGI) – A Dataset of Global Glacier Outlines: Version 6.0. Technical Report, Global Land Ice Measurements from Space, Boulder, Colorado, USA. Digital Media. DOI: https://doi.org/10.7265/N5-RGI-60
Pfeffer, W. T., Arendt, A. A., Bliss, A., Bolch, T., Cogley, J. G., Gardner, A. S., Hagen, J.-O., Hock, R., Kaser, G., Kienholz, C., Miles, E. S., Moholdt, G., Mölg, N., Paul, F., Radić, V., Rastner, P., Raup, B. H., Rich, J., Sharp, M. J., and Glasser, N. (2014). The Randolph Glacier Inventory: a globally complete inventory of glaciers. Journal of Glaciology, 60(221), 537-552. https://www.cambridge.org/core/services/aop-cambridge-core/content/view/730D4CC76E0E3EC1832FA3F4D90691CE/S002214300020600Xa.pdf/randolph_glacier_inventory_a_globally_complete_inventory_of_glaciers.pdf
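As a minimal reading sketch for the 0.5°×0.5° grid file described above (the file name is a placeholder, and the assumed layout of 360 rows by 720 columns simply follows from the 0.5-degree spacing; check the release notes for the actual arrangement):

```python
# Sketch: read the plain-text grid of glacierized areas (km^2) into an array.
import numpy as np

grid = np.loadtxt("glacier_area_grid.dat")   # placeholder file name
print(grid.shape)                            # expected (360, 720) for a 0.5-degree global grid
print("total glacierized area (km2):", grid.sum())
```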
Mule deer populations continue to decline across much of the western United States due to loss of habitat, starvation, and severe climate patterns, such as drought. In order to track the home range size and ecological preferences of mule deer, an important species for culture, economy, and ecosystems, the New Mexico Bureau of Land Management Taos Field Office captured mule deer, attached collars to them, and released them into Rio Grande del Norte National Monument. Collected from 2015-2017, each unique entry is one deer during one year, for a total of 23 entries. The point data was then intersected with vegetation data in the area, and the density of points was determined through Kernel Density Estimation (KDE). Reclassified BLM Vegetation Treatment data was used for zonal statistics on the KDE data and offered insights into mule deer response to treatments. This project was conducted as a joint project between the NMBLM TFO, the Fort Collins USGS Science Center, and Kent State University’s Biogeography & Landscape Dynamics lab. This dataset includes all spatial data files (CPG, DBF, XLSX, PRJ, SBN, SBX, SHP, and SHX) for the comprehensive location fix shapefile, the convex hulls, the reclassified LANDFIRE EVT raster, the analysis area, the reclassified BLM Vegetation Treatment groups, the Kernel Density Estimation result, and the hill shade and state boundary data.
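The kernel-density step can be sketched in Python as follows (the original analysis presumably used GIS tooling; the file name and settings below are illustrative assumptions, not the project's actual workflow):

```python
# Sketch: estimate relative point density from the collar location fixes.
import geopandas as gpd
import numpy as np
from scipy.stats import gaussian_kde

fixes = gpd.read_file("deer_location_fixes.shp")    # placeholder file name
xy = np.vstack([fixes.geometry.x, fixes.geometry.y])

kde = gaussian_kde(xy)    # bandwidth defaults to Scott's rule
density = kde(xy)         # relative density at each location fix
```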
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Description
This dataset consists of two vector files which show the change in the building stock of the City of DaNang retrieved from satellite image analysis. Buildings were first identified from a Pléiades satellite image from 24.10.2015 and classified into 9 categories in a semi-automatic workflow described by Warth et al. (2019) and Vetter-Gindele et al. (2019).
In a second step, these buildings were inspected for changes by visual interpretation, based on a second Pléiades satellite image acquired on 13.08.2017. Changes were classified into 5 categories and aggregated by administrative wards (first dataset: adm) and by a 250 m hexagon grid (second dataset: hex).
The full workflow for generating this dataset, including a detailed description of its contents and a discussion of its potential use, is published in Braun et al. (2020): Changes in the Building Stock of Da Nang between 2015 and 2017
Contents
Both datasets (adm and hex) are stored as ESRI shapefiles which can be used in common Geographic Information Systems (GIS) and consist of the following parts:
shp: polygon geometries (geometries of the administrative boundaries and hexagons)
dbf: attribute table (containing the number of buildings per class for 2015 and 2017 and the underlying changes, e.g. the number of new buildings, the number of demolished buildings, etc.)
shx: index file combining the geometries with the attributes
cpg: encoding of the attributes (UTF-8)
prj: spatial reference of the datasets (UTM zone 49 North, EPSG:32649) for ArcGIS
qpj: spatial reference of the datasets (UTM zone 49 North, EPSG:32649) for QGIS
lyr: symbology suggestion for the polygons (predefined is the number of local type shophouses in 2017) for ArcGIS
qml: symbology suggestion for the polygons (predefined is the number of new buildings between 2015 and 2017) for QGIS
Citation and documentation
To cite this dataset, please refer to the publication
Braun, A.; Warth, G.; Bachofer, F.; Quynh Bui, T.T.; Tran, H.; Hochschild, V. (2020): Changes in the Building Stock of Da Nang between 2015 and 2017. Data, 5, 42. doi:10.3390/data5020042
This article contains a detailed description of the dataset, the defined building type classes and the types of changes which were analyzed. Furthermore, the article makes recommendations on the use of the datasets and discusses potential error sources.
These data are a collection of benthic habitat data from studies conducted in the coastal Long Island Sound, NY region, provided as GIS shapefiles (.shp, .dbf, .shx, and .prj files) with associated Federal Geographic Data Committee (FGDC) metadata. Generalized browse graphics were generated by the NODC and are included with the data. Individual subdirectories include data as follows: the 2002 Long Island South Shore Estuary Benthic Habitat Polygon Data Set; 1995 benthic grab, sediment grab, and sediment profile image GIS point data files from inland harbor bays (Jamaica Bay); and 1994-1995 benthic grab, sediment grab, and sediment profile image GIS point data files from lower inland harbor bays.