CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Polygon layer representing United States counties with name attributes.

About Natural Earth

Natural Earth is a convenient resource for creating custom maps. Unlike other map data intended for analysis or detailed government mapping, it is designed to meet the needs of cartographers and designers making generalized maps. Maximum flexibility is a goal.

Natural Earth is a public domain collection of map datasets available at 1:10 million (larger scale/more detailed), 1:50 million (medium scale/moderate detail), and 1:110 million (small scale/coarse detail) scales. It features tightly integrated vector and raster data for creating a variety of visually pleasing, well-crafted maps with cartography or GIS software. Natural Earth data is made possible by many volunteers and supported by the North American Cartographic Information Society (NACIS).

Convenience – Natural Earth solves a problem: finding suitable data for making small-scale maps. In a time when the web is awash in geospatial data, cartographers are forced to waste time sifting through confusing tangles of poorly attributed data to make clean, legible maps. Because your time is valuable, Natural Earth data comes ready to use.

Neatness Counts – The carefully generalized linework maintains consistent, recognizable geographic shapes at 1:10m, 1:50m, and 1:110m scales. Natural Earth was built from the ground up, so you will find that all data layers align precisely with one another. For example, where rivers and country borders are one and the same, the lines are coincident.

GIS Attributes – Natural Earth, however, is more than just a collection of pretty lines. The data attributes are equally important for mapmaking. Most data contain embedded feature names, which are ranked by relative importance. Other attributes facilitate faster map production, such as width attributes assigned to river segments for creating tapers.

Intelligent Data – The attributes assigned to Natural Earth vectors make for efficient mapmaking.
Most lines and areas contain embedded feature names, which are ranked by relative importance. Up to eight rankings per data theme allow easy custom map “mashups” that emphasize your subject while de-emphasizing reference features. Other attributes focus on map design. For example, width attributes assigned to rivers allow you to create tapered drainages, and assigning different colors to contiguous country polygons is another task made easier by the data attribution.

Other key features:
- Vector features include name attributes and bounding box extents. Know that the Rocky Mountains are larger than the Ozarks.
- Large polygons, such as bathymetric layers, are split for more efficient data handling.
- Projection-friendly vectors precisely match at 180 degrees longitude. Lines contain enough data points for smooth bending in conic projections, but not so many that computer processing speed suffers.
- Raster data includes grayscale shaded relief and cross-blended hypsometric tints derived from the latest NASA SRTM Plus elevation data and tailored to register with Natural Earth Vector.
- Optimized for use in web mapping applications, with built-in scale attributes to assist in showing features at different zoom levels.
These are the main layers that were used in the mapping and analysis for the Santa Monica Mountains Local Coastal Plan, which was adopted by the Board of Supervisors on August 26, 2014, and certified by the California Coastal Commission on October 10, 2014. Below are links to important documents and web mapping applications, as well as a link to the actual GIS data:
- Plan Website – Links to the actual plan, maps, and our online web mapping application, known as SMMLCP-NET.
- Online Web Mapping Application – The online application that shows all the layers associated with the plan; these are the same layers that are available for download below.
- GIS Layers – The GIS layers in the form of an ArcGIS Map Package (version 10.3) (LINK TO FOLLOW SOON). Layers in shapefile format are also included below.
Below is a list of the GIS Layers provided (shapefile format):
Recreation (Zipped - 5 MB - click here)
- Coastal Zone
- Campground
- Trails (2012 National Park Service)
- Backbone Trail
- Class III Bike Route – Existing
- Class III Bike Route – Proposed

Scenic Resources (Zipped - 3 MB - click here)
- Significant Ridgeline
- State-Designated Scenic Highway
- State-Designated Scenic Highway 200-foot buffer
- Scenic Route
- Scenic Route 200-foot buffer
- Scenic Element

Biological Resources (Zipped - 45 MB - click here)
- National Hydrography Dataset – Streams
- H2 Habitat (High Scrutiny)
- H1 Habitat
- H1 Habitat 100-foot buffer
- H1 Habitat Quiet Zone
- H2 Habitat
- H3 Habitat

Hazards (Zipped - 8 MB - click here)
- FEMA Flood Zone (100-year flood plain)
- Liquefaction Zone (Earthquake-Induced Liquefaction Potential)
- Landslide Area (Earthquake-Induced Landslide Potential)
- Fire Hazard and Responsibility Area

Zoning and Land Use (Zipped - 13 MB - click here)
- Malibu LCP – LUP (1986)
- Malibu LCP – Zoning (1986)
- Land Use Policy
- Zoning

Other Layers (Zipped - 38 MB - click here)
- Coastal Commission Appeal Jurisdiction
- Community Names
- Santa Monica Mountains (SMM) Coastal Zone Boundary
- Pepperdine University Long Range Development Plan (LRDP)
- Rural Village
Contact the L.A. County Dept. of Regional Planning's GIS Section by email if you have questions.
Open Government Licence - Canada 2.0: https://open.canada.ca/en/open-government-licence-canada
License information was derived automatically
Have you ever wanted to create your own maps, or integrate and visualize spatial datasets to examine changes in trends between locations and over time? Follow along with these training tutorials on QGIS, an open-source geographic information system (GIS), and learn key concepts, procedures, and skills for performing common GIS tasks, such as creating maps and joining, overlaying, and visualizing spatial datasets. These tutorials are geared toward new GIS users. We'll start with foundational concepts and build toward more advanced topics, demonstrating how, with a few relatively easy steps, you can get quite a lot out of GIS. You can then extend these skills to datasets of thematic relevance to you, addressing tasks faced in your day-to-day work.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Dataset for: Bedding scale correlation on Mars in western Arabia Terra
A.M. Annex et al.
Data Product Overview
This repository contains all source data for the publication. Below is a description of each general data product type, software that can load the data, and a list of the file names along with the short description of the data product.
HiRISE Digital Elevation Models (DEMs).
HiRISE DEMs produced using the Ames Stereo Pipeline are in GeoTIFF format, with file names ending in ‘*X_0_DEM-adj.tif’, where the “X” prefix denotes the spatial resolution of the data product in meters. GeoTIFF files can be read by free GIS software such as QGIS.
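As a minimal sketch of the naming convention above, the resolution can be recovered from a DEM file name with a small parser (the example product name below is hypothetical, not one of the repository's files):

```python
import re

def dem_resolution_m(filename: str) -> float:
    """Parse the spatial resolution in meters from a HiRISE DEM filename.

    Assumes the '*X_0_DEM-adj.tif' convention described above,
    e.g. a name ending in '1_0_DEM-adj.tif' -> 1.0 m.
    """
    m = re.search(r"(\d+)_(\d+)_DEM-adj\.tif$", filename)
    if m is None:
        raise ValueError(f"not a recognized DEM filename: {filename}")
    # Join the two digit groups into a decimal resolution value.
    return float(f"{m.group(1)}.{m.group(2)}")

# Hypothetical product name, for illustration only:
print(dem_resolution_m("ESP_012345_1_0_DEM-adj.tif"))  # 1.0
```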
HiRISE map-projected imagery (DRGs).
Map-projected HiRISE images produced using the Ames Stereo Pipeline are in GeoTIFF format, with file names ending in ‘*0_Y_DRG-cog.tif’, where the “Y” value denotes the spatial resolution of the data product in centimeters. GeoTIFF files can be read by free GIS software such as QGIS. The DRG files are formatted as cloud-optimized GeoTIFFs (COGs) for enhanced compression and ease of use.
3D Topography files (.ply).
Triangular mesh versions of the HiRISE/CTX topography data used for 3D figures, in “.ply” format. Meshes are greatly simplified geometrically from the source files. Topography files can be loaded in a variety of open-source tools such as ParaView and MeshLab. Textures can be applied using embedded texture coordinates.
3D Geological Model outputs (.vtk)
VTK-format 3D files of the model output over the spatial domain of each study site. VTK files can be loaded with the open-source ParaView software. The “block” files contain the model evaluation over a regular grid spanning the model extent. The “surfaces” files contain just the bedding surfaces as interpolated from the “block” files using the marching cubes algorithm.
Geological Model geologic maps (geologic_map.tif).
Geologic maps from geological models are standard geotiffs readable by conventional GIS software. The maximum value for each geologic map is the “no-data” value for the map. Geologic maps are calculated at a lower resolution than the topography data for storage efficiency.
Beds Geopackage File (.gpkg).
GeoPackage vector data file containing all mapped layers and associated metadata, including dip-corrected bed thickness as well as WKB-encoded 3D linestrings representing the sampled topography data to which the bedding orientations were fit. GeoPackage files can be read with GIS software such as QGIS and ArcGIS, as well as the OGR/GDAL suite. A full description of each column in the file is provided below.
| Column | Type | Description |
|---|---|---|
| uuid | String | unique identifier |
| stratum_order | Real | 0-indexed bed order |
| section | Real | section number |
| layer_id | Real | bed number/index |
| layer_id_bk | Real | unused backup bed number/index |
| source_raster | String | dem file path used |
| raster | String | dem file name |
| gsd | Real | ground sampling distance for dem |
| wkn | String | well known name for dem |
| rtype | String | raster type |
| minx | Real | minimum x position of trace in dem crs |
| miny | Real | minimum y position of trace in dem crs |
| maxx | Real | maximum x position of trace in dem crs |
| maxy | Real | maximum y position of trace in dem crs |
| method | String | internal interpolation method |
| sl | Real | slope in degrees |
| az | Real | azimuth in degrees |
| error | Real | maximum error ellipse angle |
| stdr | Real | standard deviation of the residuals |
| semr | Real | standard error of the residuals |
| X | Real | mean x position in CRS |
| Y | Real | mean y position in CRS |
| Z | Real | mean z position in CRS |
| b1 | Real | plane coefficient 1 |
| b2 | Real | plane coefficient 2 |
| b3 | Real | plane coefficient 3 |
| b1_se | Real | standard error plane coefficient 1 |
| b2_se | Real | standard error plane coefficient 2 |
| b3_se | Real | standard error plane coefficient 3 |
| b1_ci_low | Real | plane coefficient 1 95% confidence interval low |
| b1_ci_high | Real | plane coefficient 1 95% confidence interval high |
| b2_ci_low | Real | plane coefficient 2 95% confidence interval low |
| b2_ci_high | Real | plane coefficient 2 95% confidence interval high |
| b3_ci_low | Real | plane coefficient 3 95% confidence interval low |
| b3_ci_high | Real | plane coefficient 3 95% confidence interval high |
| pca_ev_1 | Real | pca explained variance ratio pc 1 |
| pca_ev_2 | Real | pca explained variance ratio pc 2 |
| pca_ev_3 | Real | pca explained variance ratio pc 3 |
| condition_number | Real | condition number for regression |
| n | Integer64 | number of data points used in regression |
| rls | Integer(Boolean) | unused flag |
| demeaned_regressions | Integer(Boolean) | centering indicator |
| meansl | Real | mean section slope |
| meanaz | Real | mean section azimuth |
| angular_error | Real | angular error for section |
| mB_1 | Real | mean plane coefficient 1 for section |
| mB_2 | Real | mean plane coefficient 2 for section |
| mB_3 | Real | mean plane coefficient 3 for section |
| R | Real | mean plane normal orientation vector magnitude |
| num_valid | Integer64 | number of valid planes in section |
| meanc | Real | mean stratigraphic position |
| medianc | Real | median stratigraphic position |
| stdc | Real | standard deviation of stratigraphic index |
| stec | Real | standard error of stratigraphic index |
| was_monotonic_increasing_layer_id | Integer(Boolean) | monotonic layer_id after projection to stratigraphic index |
| was_monotonic_increasing_meanc | Integer(Boolean) | monotonic meanc after projection to stratigraphic index |
| was_monotonic_increasing_z | Integer(Boolean) | monotonic z increasing after projection to stratigraphic index |
| meanc_l3sigma_std | Real | lower 3-sigma meanc standard deviation |
| meanc_u3sigma_std | Real | upper 3-sigma meanc standard deviation |
| meanc_l2sigma_sem | Real | lower 2-sigma meanc standard error |
| meanc_u2sigma_sem | Real | upper 2-sigma meanc standard error |
| thickness | Real | difference in meanc |
| thickness_fromz | Real | difference in Z value |
| dip_cor | Real | dip correction |
| dc_thick | Real | thickness after dip correction |
| dc_thick_fromz | Real | z thickness after dip correction |
| dc_thick_dev | Integer(Boolean) | dc_thick <= total mean dc_thick |
| dc_thick_fromz_dev | Integer(Boolean) | dc_thick_fromz <= total mean dc_thick_fromz |
| thickness_fromz_dev | Integer(Boolean) | thickness_fromz <= total mean thickness_fromz |
| dc_thick_dev_bg | Integer(Boolean) | dc_thick <= section mean dc_thick |
| dc_thick_fromz_dev_bg | Integer(Boolean) | dc_thick_fromz <= section mean dc_thick_fromz |
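For orientation, here is a sketch of how columns such as `sl` (slope), `az` (azimuth), and `dc_thick` could relate to the plane coefficients `b1`–`b3`. The plane model z = b1·x + b2·y + b3, the azimuth convention (degrees clockwise from north), and the cos(dip) thickness correction are assumptions chosen for illustration; the dataset's own columns are the authoritative values:

```python
import math

def slope_azimuth(b1: float, b2: float) -> tuple[float, float]:
    """Slope (degrees) and down-dip azimuth (degrees clockwise from
    north) for a fitted plane z = b1*x + b2*y + b3 (assumed model)."""
    slope = math.degrees(math.atan(math.hypot(b1, b2)))
    # The horizontal gradient (b1, b2) points uphill, so the
    # down-dip direction is its negation.
    azimuth = math.degrees(math.atan2(-b1, -b2)) % 360.0
    return slope, azimuth

def dip_corrected_thickness(apparent: float, dip_deg: float) -> float:
    """Reduce an apparent thickness to true bed thickness by cos(dip),
    one common convention for a dip correction like `dc_thick`."""
    return apparent * math.cos(math.radians(dip_deg))

sl, az = slope_azimuth(0.0, -0.1)
print(round(sl, 2), round(az, 1))  # 5.71 0.0 (gentle dip toward north)
print(dip_corrected_thickness(10.0, 60.0))  # 5.0
```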
Links to recordings of the Integrated Services Program and 9-1-1 & Geospatial Services Bureau webinar series, including NG9-1-1 GIS topics such as data preparation; data provisioning and maintenance; boundary best practices; and extract, transform, and load (ETL). Offerings include:

Topic: Virginia Next Generation 9-1-1 Dashboard and Resources Update
Description: Virginia recently updated the NG9-1-1 Dashboard with some new tabs and information sources and continues to develop new resources to assist with the GIS data work. This webinar provides an overview of changes, a demonstration of new functionality, and a guide to finding and using new resources that will benefit Virginia public safety and GIS personnel with roles in their NG9-1-1 projects. Wednesday, 16 June 2021. Recording available at: https://vimeo.com/566133775

Topic: Emergency Service Boundary GIS Data Layers and Functions in your NG9-1-1 PSAP
Description: Law, Fire, and Emergency Medical Service (EMS) Emergency Service Boundary (ESB) polygons are required elements of the NENA NG9-1-1 GIS data model stack that indicate which agency is responsible for primary response. While this requirement must be met in your Virginia NG9-1-1 deployment with AT&T and Intrado, there are quite a few ways you could choose to implement these polygons. PSAPs and their GIS support must work together to understand how this information will come into an NG9-1-1 i3 PSAP and how it will replace traditional ESN information in order to make good choices while implementing these layers. This webinar discusses the function of ESNs in your legacy 9-1-1 environment, the role of ESBs in NG9-1-1, and how ESB information appears in your NG9-1-1 PSAP. Wednesday, 22 July 2020. Recording available at: https://vimeo.com/441073056#t=360s

Topic: "The GIS Folks Handle That": What PSAP Professionals Need to Know about the GIS Project Phase of Next Generation 9-1-1 Deployment
Description: Next Generation 9-1-1 (NG9-1-1) brings together the worlds of emergency communication and spatial data and mapping. While it may be tempting for PSAPs to outsource cares and concerns about road centerlines and GIS data provisioning to 'the GIS folks', GIS staff are crucial to the future of emergency call routing and location validation. Data required by NG9-1-1 usually builds on data that GIS staff already know and use for other purposes, so the transition requires them to learn more about PSAP operations and uses of core data. The goal of this webinar is to help the PSAP and GIS worlds come together by explaining the role of the GIS Project in the Virginia NG9-1-1 Deployment Steps, exploring how GIS professionals view NG9-1-1 deployment as a project, and fostering a mutual understanding of how GIS will drive NG9-1-1. 29 January 2020. Recording available at: https://vimeo.com/showcase/9791882/video/761225474

Topic: Getting Your GIS Data from Here to There: Processes and Best Practices for Extract, Transform and Load (ETL)
Description: During the fall of 2019, VITA-ISP staff delivered workshops on "Tools and Techniques for Managing the Growing Role of GIS in Enterprise Software." This session presents, as a webinar, information from the workshops related to the process of extracting, transforming, and loading data (ETL), best practices for ETL, and methods for data schema comparison and field mapping. These techniques and skills assist GIS staff with their growing role in Next Generation 9-1-1 but also apply to many other projects involving the integration and maintenance of GIS data. 19 February 2020. Recording available at: https://vimeo.com/showcase/9791882/video/761225007

Topic: NG9-1-1 GIS Data Provisioning and Maintenance
Description: A webinar about the NG9-1-1 GIS Data Provisioning and Maintenance document provided by Judy Doldorf, GISP, of the Fairfax County Department of Information Technology and RAC member. This document was developed by members of the NG9-1-1 GIS workgroup within the VITA Regional Advisory Council (RAC) and is intended to provide guidance to local GIS and PSAP authorities on the GIS datasets and associated GIS-to-MSAG/ALI validation and synchronization required for NG9-1-1 services. The document also provides guidance on geospatial call routing readiness and the short- and long-term GIS data maintenance workflow procedures. In addition, the webinar discusses perspective and insight from the Fairfax County experience in GIS data preparation for the AT&T and West solution. 31 July 2019. Recording available at: https://vimeo.com/showcase/9791882/video/761224774

Topic: NG9-1-1 Deployment Dashboard
Description: An overview of the NG9-1-1 Deployment Dashboard and information about other online ISP resources. The ISP website has long been criticized for being difficult to use and to find information on. The addition of the Dashboard and other changes to the website are our attempt to address some of these concerns and provide an easier way to find information, especially as we undertake NG9-1-1 deployment. The Dashboard includes a status map of all Virginia PSAPs as it relates to the deployment of NG9-1-1, including the total amount of funding requested by the localities and awards approved by the 9-1-1 Services Board. During this webinar, Lyle Hornbaker, Regional Coordinator for Region 5, navigates through the dashboard and provides tips on how to more effectively utilize the ISP website. 12 June 2019. Recording not currently available; please see the Virginia Next Generation 9-1-1 Dashboard and Resources Update webinar recording from 16 June 2021.

Topic: PSAP Boundary Development Tools and Process Recommendation
Description: Presented by Geospatial Program Manager Matt Gerike and VGIN Coordinator Joe Sewash. With the release of the PSAP boundary development tools and PSAP boundary segment compilation guidelines on the VGIN Clearinghouse in March, this webinar demonstrates the development tools, explains the process model, and discusses methods, tools, and resources available as you work to complete PSAP boundary segments with your neighbors. 15 May 2019. Recording available at: https://www.youtube.com/watch?v=kI-1DkUQF9Q&feature=youtu.be

Topic: NG9-1-1 Data Preparation - Utilizing VITA's GIS Data Report Card Tool
Description: This webinar, presented by VGIN Coordinator Joe Sewash, Geospatial Program Manager Matt Gerike, and Geospatial Analyst Kenny Brevard, provides an overview of the first version of the tools released on March 25, 2019. These tools allow localities to validate their GIS data against the report card rules, the MSAG and ALI checks used in previous report cards, and the analysis listed in the NG9-1-1 migration proposal document. It also discusses the purpose of the tools, input requirements, initial configuration, how to run them, and how to make sense of your results. 10 April 2019. Recording available at: https://vimeo.com/showcase/9791882/video/761224495

Topic: NG9-1-1 PSAP Boundary Best Practice Webinar
Description: During November and December, VITA ISP staff hosted regional training sessions about best practices for PSAP boundaries as they relate to NG9-1-1. These sessions were well attended and very interactive, so this makeup session recaps the training for those who may have missed it. 30 January 2019. Recording not currently available; please see the PSAP Boundary Development Tools and Process Recommendation webinar recording from 15 May 2019.

Topic: NG9-1-1 GIS Overview for Contractors
Description: The Commonwealth of Virginia has started its migration to Next Generation 9-1-1 (NG9-1-1). This migration means that there will be a much greater reliance on geographic information systems (GIS) to locate and route 9-1-1 calls. VITA ISP has conducted an assessment of current local GIS data and provided each locality with a report. Some of the data from this report has also been included in each locality's migration proposal, which identifies what data issues need to be resolved before the locality can migrate to NG9-1-1. Several localities in Virginia utilize a contractor to maintain their GIS data. This webinar is intended for those contractors, to review the data in the report, what is included in the migration proposal, and how they may be called on to assist the localities they serve. It will still ultimately be up to each locality to determine whether to engage a contractor for assistance, but it is important for the contractor community to understand what is happening and to have an opportunity to ask questions about the intent and goals. 22 August 2018. Recording not currently available. Please contact us at NG911GIS@vdem.virginia.gov if you are interested in this content.
Geotweet Archive v2.0

The Harvard Center for Geographic Analysis (CGA) maintains the Geotweet Archive, a global record of tweets spanning time, geography, and language. The primary purpose of the Archive is to make a comprehensive collection of geolocated tweets available to the academic research community. The Archive extends from 2010 to the present and is updated daily. The collection totals approximately 10 billion tweets and is stored on Harvard University's High Performance Computing (HPC) cluster. The Harvard HPC supports many applications for working with big spatio-temporal datasets, including two geospatial tools recently deployed by the CGA: OmniSci Immerse and PostGIS.

The Geotweet Archive consists of tweets that carry two types of geospatial signature: 1) GPS-based longitude/latitude generated by the originating device; 2) place-name-centroid-based longitude/latitude from the bounding box provided by Twitter, based on the user-defined place designation (typically a town name). Any tweet that carries one or both of these signatures is included in the Archive. Approximately 1-2% of all tweets contain such geographic coordinates (this percentage needs verification and may vary over time).

The current version of the Archive is Version 2.0. The original Version 1.0 archive began in 2012 as part of a project with Ben Lewis of CGA and then-Harvard graduate student Todd Mostak to develop a GPU-powered spatial database called GEOPS. GEOPS formed the basis for the technology startup MapD Technologies, which is now OmniSci. OmniSci Immerse software now runs on Harvard's High Performance Computing (HPC) environment to support interactive exploration and analytics with the Geotweet Archive and other large datasets.
Version 2.0 of the archive represents the result of a merge between the CGA archive, an archive developed by the Department of Geoinformatics at the University of Salzburg in Austria, and several other archives. Clemens Havas and Bernd Resch at the University of Salzburg and Devika Kakkar of Harvard CGA collaborated to deploy Version 2.0.

Schema of Geotweet Archive v2.0

| Field | Type | Description |
|---|---|---|
| message_id | BIGINT | Tweet ID |
| tweet_date | TIMESTAMP | Date and time of tweet from Twitter (UTC) |
| tweet_text | TEXT ENCODING | Text content of tweet |
| tags | TEXT ENCODING DICT | Tweet hashtags |
| tweet_lang | TEXT ENCODING DICT | Language that the tweet is in |
| source | TEXT ENCODING DICT | Operating system or application type used to create the tweet |
| place* | TEXT ENCODING NONE | The geographic place as defined by the user, usually a town name. A bounding box is determined by Twitter based on this field, from which centroids (see longitude and latitude fields) and the spatialerror field are derived; used when not overridden by a GPS coordinate. See the Twitter tweet object for place. |
| retweets | SMALLINT | Number of retweets as of the last time it was checked |
| tweet_favorites | SMALLINT | Now known as 'likes' |
| photo_url | TEXT ENCODING DICT | URL of any image referenced |
| quoted_status_id | BIGINT | ID number for quote status |
| user_id | BIGINT | User ID number |
| user_name | TEXT ENCODING NONE | User name |
| user_location* | TEXT ENCODING NONE | User-defined location, usually a city or town. See the Twitter user object. |
| followers | SMALLINT | Followers as of the last time checked |
| friends | SMALLINT | Number of users followed by this user |
| user_favorites | INT | Number of topics the user is interested in |
| status | INT | Code for what the user is doing as of the last time it was checked |
| user_lang | TEXT ENCODING DICT | User-defined language |
| latitude | FLOAT | Latitude from GPS or bounding box based on the place field |
| longitude | FLOAT | Longitude from GPS or bounding box based on the place field |
| data_source* | TEXT ENCODING DICT | The source crawler or dataset for the tweet |
| gps | TEXT ENCODING DICT | Flag for whether lon/lat is from GPS or the town-name bounding box (SRID 4326). When both are present, the GPS coordinate takes priority. |
| spatialerror | FLOAT | Estimated horizontal error in meters for the lon/lat coordinate: 10 m for GPS coordinates; for bounding boxes, the radius of a circle with the area of the bounding box. |

*data_source codes:

| Code | Source |
|---|---|
| 1 | U. Salzburg REST API crawler |
| 2 | Harvard CGA streaming crawler |
| 3 | U. Salzburg streaming API crawler |
| 4 | Ryan Qi Wang and Harvard Medical School datasets |
| 5 | U. Heidelberg dataset |
| 6 | Archive.org dataset |

Note: Before April of 2015, GPS coordinate capture was turned on by default for Twitter users. After this date, users have had to opt in to share their precise location; this is one reason for the large decrease in the volume of geotweets after this date. A number of automated...
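The spatialerror definition above for bounding boxes (the radius of a circle whose area equals the bounding box) can be sketched as a small calculation; the example box dimensions are hypothetical:

```python
import math

def bbox_spatial_error_m(width_m: float, height_m: float) -> float:
    """Radius (meters) of a circle whose area equals the place
    bounding box, matching the spatialerror definition above."""
    return math.sqrt(width_m * height_m / math.pi)

# A hypothetical ~5 km x 4 km town bounding box:
print(round(bbox_spatial_error_m(5000.0, 4000.0)))  # 2523
```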
https://geohub.cityoftacoma.org/pages/disclaimer
Tacoma 2012 - 4 inch Aerials for ArcGIS Online/Bing Maps/Google Maps, etc.

Contact Info: Name: GIS Team; Email: GISteam@cityoftacoma.org; Company: Pictometry International Corp.; Title: WA City of Tacoma 2012 Accuplus Project

Flight Height: 3,500 feet
Flight Dates: Between May 6, 2012 and May 7, 2012

Dataset is orthoimagery; logical consistency is not applicable. The 2012 City of Tacoma, WA orthogonal data associated with this metadata file completely covers the project-specified boundary.

GPS/INS processing: Data was post-processed using NGS CORS base station data.
Aerotriangulation: Imagery was aerotriangulated using Inpho's Match-AT software.
Rectification: Ortho-rectification was performed using Inpho's OrthoMaster software. A LiDAR-based DEM was used as the rectification surface.
Mosaicking: Mosaicking was performed using Inpho's OrthoVista software, and SeamEditor was employed for manual corrections.

Original ArcGIS coordinate system: Type: Projected; Geographic coordinate reference: GCS_North_American_1983_HARN; Projection: NAD_1983_HARN_StatePlane_Washington_South_FIPS_4602_Feet; Well-known identifier: 2927

Geographic extent - Bounding rectangle: West longitude: -122.570169; East longitude: -122.334799; North latitude: 47.323225; South latitude: 47.154908
Extent in the item's coordinate system: West: 1127000.000000; East: 1184000.000000; South: 671000.000000; North: 731000.000000
Where to Fish For Application

New to the state? New to fishing? Want to find a new place to fish? Have you always wanted to catch a Black Crappie but don't know where to go? Then check out the CT DEEP Fisheries Division's "Where to fish for…" application. This simple interactive map, co-developed by CT DEEP GIS and the Fisheries Division, is intended to augment our "Fish with CARE" learn-to-fish events. At the events, the most common question by far was, "Where can I fish near me?" This map provides the ideal solution for both novice and experienced anglers alike. The data powering the map was collected over the past decade through routine monitoring by the Fisheries Division of public lakes, ponds, and the CT River. This new map joins two previously developed interactive maps, one for rivers/streams (Interactive Trout Stocking Map) and one for shoreline access, charter boats, and tackle shops (Saltwater Resources map), that also help get people onto the fish. This video provides a quick tutorial on some of the key features.
A feature layer of Manitoba's provincial boundaries:
- Manitoba/Ontario boundary
- Manitoba/Saskatchewan boundary
- Manitoba/Nunavut boundary
- International boundary

Excluding the International Boundary, the graphical data was computed from original boundary survey measurements published in the respective official boundary commission reports, using the least-squares adjustment software "Manor". The adjustments were constrained to known NAD83 [nmip94 adj.] federal/provincial boundary marker positions. For the International Boundary, the graphics were created by converting the official published NAD27 marker positions for the boundary into NAD83 using the datum conversion software NTv2 and interconnecting the plotted marker positions with straight lines using CARIS map software.

The purpose is to provide end users with a digital map of Manitoba's boundaries. This data layer is suitable for most medium- and small-scale digital map applications as well as GIS georeferencing in general. This Manitoba provincial boundary was originally published on January 12, 2004. It was uploaded to Manitoba Maps as a feature layer on December 15, 2016.

Use Constraints: The Hudson Bay shoreline for this product was taken from 1:500,000-scale digital mapping and is intended for generalized small-scale mapping of this portion of the provincial boundary. Estimated accuracy is plus or minus 125 m.

Fields Included:
- FID: Sequential unique whole numbers that are automatically generated
- AREA: GIS area in square metres, calculated in the NAD83 Universal Transverse Mercator Zone 14 coordinate system
- PERIMETER: GIS perimeter in metres, calculated in the NAD83 Universal Transverse Mercator Zone 14 coordinate system
- NAME: province name
This nowCOAST time-enabled map service provides maps of NOAA/National Weather Service RIDGE2 mosaics of base reflectivity images across the continental United States (CONUS) as well as Puerto Rico, Hawaii, Guam, and Alaska, with a 2-kilometer (1.25-mile) horizontal resolution. The mosaics are compiled by combining regional base reflectivity radar data obtained from 158 Weather Surveillance Radar 1988 Doppler (WSR-88D), also known as NEXt-generation RADar (NEXRAD), sites across the country operated by the NWS and the Dept. of Defense, and also from Terminal Doppler Weather Radars (TDWR) at major airports. The colors on the map represent the strength of the energy reflected back toward the radar. The reflected intensities (echoes) are measured in dBZ (decibels of Z). The color scale is very similar to the one used by the NWS RIDGE2 map viewer. The radar data itself is updated by the NWS every 10 minutes during non-precipitation mode, but every 4-6 minutes during precipitation mode. To ensure nowCOAST is displaying the most recent data possible, the latest mosaics are downloaded every 5 minutes. For more detailed information about the update schedule, see: http://new.nowcoast.noaa.gov/help/#section=updateschedule
Background Information
Reflectivity is related to the power, or intensity, of the reflected radiation that is sensed by the radar antenna. Reflectivity is expressed on a logarithmic scale in units called dBZ. The "dB" in the dBZ scale is logarithmic and unitless; it is used only to express a ratio. The "Z" is the ratio of the density of water drops (drop diameters measured in millimeters, raised to the 6th power) in each cubic meter (mm^6/m^3). When Z is large (many drops in a cubic meter), the reflected power is large. A small Z means little returned energy. In fact, Z can be less than 1 mm^6/m^3, and since the scale is logarithmic, dBZ values then become negative, as is often the case when the radar is in clear-air mode, indicated by earth-tone colors. dBZ values are related to the intensity of rainfall: the higher the dBZ, the stronger the rain rate. A value of 20 dBZ is typically the point at which light rain begins. Values of 60 to 65 dBZ are about the level where 3/4-inch hail can occur; however, a value of 60 to 65 dBZ does not mean that severe weather is occurring at that location. The base reflectivity is the lowest (1/2-degree elevation angle) reflectivity scan from the radar. The source of the base reflectivity mosaics is the NWS Southern Region Radar Integrated Display with Geospatial Elements (RIDGE2).
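The dBZ scale described above is just 10 times the base-10 logarithm of Z; a minimal sketch of the conversion (the numeric values are illustrative, not from the service):

```python
import math

def dbz_from_z(z):
    """Convert linear reflectivity Z (mm^6/m^3) to dBZ."""
    return 10.0 * math.log10(z)

def z_from_dbz(dbz):
    """Convert reflectivity in dBZ back to linear Z (mm^6/m^3)."""
    return 10 ** (dbz / 10.0)

# A Z below 1 mm^6/m^3 yields a negative dBZ, as in clear-air mode.
print(dbz_from_z(0.5))   # negative
print(z_from_dbz(20.0))  # Z at roughly the light-rain threshold
```

Note that 20 dBZ corresponds to Z = 100 mm^6/m^3, while Z = 1 mm^6/m^3 is exactly 0 dBZ.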
Time Information
This map is time-enabled, meaning that each individual layer contains time-varying data and can be utilized by clients capable of making map requests that include a time component.
This particular service can be queried with or without the use of a time component. If the time parameter is specified in a request, the data or imagery most relevant to the provided time value, if any, will be returned. If the time parameter is not specified in a request, the latest data or imagery valid for the present system time will be returned to the client. If the time parameter is not specified and no data or imagery is available for the present time, no data will be returned.
In addition to ArcGIS Server REST access, time-enabled OGC WMS 1.3.0 access is also provided by this service.
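A sketch of how a client might add the optional time parameter to an ArcGIS Server REST export request, as described above. The service URL, bounding box, and epoch value below are placeholders, not taken from this document; ArcGIS REST services expect `time` in epoch milliseconds.

```python
from urllib.parse import urlencode

# Placeholder endpoint; substitute the actual nowCOAST map service export URL.
BASE = "https://example.com/arcgis/rest/services/radar/MapServer/export"

def export_url(time_ms=None):
    """Build an export request; omit time_ms to get the latest available imagery."""
    params = {"f": "image", "bbox": "-125,24,-66,50", "bboxSR": 4326}
    if time_ms is not None:
        params["time"] = time_ms  # epoch milliseconds
    return BASE + "?" + urlencode(params)

print(export_url())                # no time parameter: latest data
print(export_url(1700000000000))   # imagery most relevant to this valid time
```

The same idea applies to WMS 1.3.0 via its `TIME` request parameter.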
Due to software limitations, the time extent of the service and map layers displayed below does not provide the most up-to-date start and end times of available data. Instead, users have three options for determining the latest time information about the service:
This layer is deprecated as of April 3, 2023. Use this layer as a replacement: https://noaa.maps.arcgis.com/home/item.html?id=b0cdf263cea24544b0da2fc00fb2b259
This nowCOAST time-enabled map service provides maps of NOAA/National Weather Service RIDGE2 mosaics of base reflectivity images across the Continental United States (CONUS) as well as Puerto Rico, Hawaii, Guam, and Alaska, with a 2-kilometer (1.25-mile) horizontal resolution. The mosaics are compiled by combining regional base reflectivity radar data obtained from 158 Weather Surveillance Radar-1988 Doppler (WSR-88D) sites, also known as NEXt-generation RADar (NEXRAD) sites, operated across the country by the NWS and the Department of Defense, as well as from Terminal Doppler Weather Radars (TDWR) at major airports. The colors on the map represent the strength of the energy reflected back toward the radar. The reflected intensities (echoes) are measured in dBZ (decibels of Z). The color scale is very similar to the one used by the NWS RIDGE2 map viewer. The radar data itself is updated by the NWS every 10 minutes during non-precipitation mode, but every 4-6 minutes during precipitation mode. To ensure nowCOAST is displaying the most recent data possible, the latest mosaics are downloaded every 5 minutes. For more detailed information about the update schedule, see: https://new.nowcoast.noaa.gov/help/#section=updateschedule
Background Information
Reflectivity is related to the power, or intensity, of the reflected radiation that is sensed by the radar antenna. Reflectivity is expressed on a logarithmic scale in units called dBZ. The "dB" in the dBZ scale is logarithmic and unitless; it is used only to express a ratio. The "Z" is the ratio of the density of water drops (drop diameters measured in millimeters, raised to the 6th power) in each cubic meter (mm^6/m^3). When Z is large (many drops in a cubic meter), the reflected power is large. A small Z means little returned energy. In fact, Z can be less than 1 mm^6/m^3, and since the scale is logarithmic, dBZ values then become negative, as is often the case when the radar is in clear-air mode, indicated by earth-tone colors. dBZ values are related to the intensity of rainfall: the higher the dBZ, the stronger the rain rate. A value of 20 dBZ is typically the point at which light rain begins. Values of 60 to 65 dBZ are about the level where 3/4-inch hail can occur; however, a value of 60 to 65 dBZ does not mean that severe weather is occurring at that location. The base reflectivity is the lowest (1/2-degree elevation angle) reflectivity scan from the radar. The source of the base reflectivity mosaics is the NWS Southern Region Radar Integrated Display with Geospatial Elements (RIDGE2).
Time Information
This map is time-enabled, meaning that each individual layer contains time-varying data and can be utilized by clients capable of making map requests that include a time component.
This particular service can be queried with or without the use of a time component. If the time parameter is specified in a request, the data or imagery most relevant to the provided time value, if any, will be returned. If the time parameter is not specified in a request, the latest data or imagery valid for the present system time will be returned to the client. If the time parameter is not specified and no data or imagery is available for the present time, no data will be returned.
In addition to ArcGIS Server REST access, time-enabled OGC WMS 1.3.0 access is also provided by this service.
Due to software limitations, the time extent of the service and map layers displayed below does not provide the most up-to-date start and end times of available data. Instead, users have three options for determining the latest time information about the service:
1. Issue a returnUpdates=true request for an individual layer or for the service itself, which will return the current start and end times of available data, in epoch time format (milliseconds since 00:00 January 1, 1970). To see an example, click on the "Return Updates" link at the bottom of this page under "Supported Operations". Refer to the ArcGIS REST API Map Service documentation for more information.
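Epoch-millisecond timestamps such as those returned by returnUpdates=true can be converted to readable UTC times with the standard library; a quick sketch (the example value is illustrative):

```python
from datetime import datetime, timezone

def epoch_ms_to_utc(ms):
    """Convert epoch milliseconds (as returned by returnUpdates=true) to a UTC datetime."""
    return datetime.fromtimestamp(ms / 1000.0, tz=timezone.utc)

# Epoch zero is 00:00 January 1, 1970 UTC, as noted above.
print(epoch_ms_to_utc(0).isoformat())  # 1970-01-01T00:00:00+00:00
```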
2. Issue an Identify (ArcGIS REST) or GetFeatureInfo (WMS) request against the proper layer corresponding to the target dataset. For raster data, this would be the "Image Footprints with Time Attributes" layer in the same group as the target "Image" layer being displayed. For vector (point, line, or polygon) data, the target layer can be queried directly. In either case, the attributes returned for the matching raster(s) or vector feature(s) will include the following:
validtime: Valid timestamp.
starttime: Display start time.
endtime: Display end time.
reftime: Reference time (sometimes referred to as issuance time, cycle time, or initialization time).
projmins: Number of minutes from reference time to valid time.
desigreftime: Designated reference time; used as a common reference time for all items when individual reference times do not match.
desigprojmins: Number of minutes from designated reference time to valid time.
3. Query the nowCOAST LayerInfo web service, which has been created to provide additional information about each data layer in a service, including a list of all available "time stops" (i.e., "valid times"), individual timestamps, or the valid time of a layer's latest available data (i.e., "Product Time"). For more information about the LayerInfo web service, including examples of various types of requests, refer to the nowCOAST help documentation at: https://new.nowcoast.noaa.gov/help/#section=layerinfo
References
NWS, 2003: NWS Product Description Document for Radar Integrated Display with Geospatial Elements Version 2 - RIDGE2. NWS/SRH, Fort Worth, Texas. Available at https://products.weather.gov/PDD/RIDGE_II_PDD_ver2.pdf
NWS, 2013: Radar Images for GIS Software. https://www.srh.noaa.gov/jetstream/doppler/gis.htm
This resource is a metadata compilation for 20 map data layers, including vector and raster data, developed to assess site suitability for heat pump development, covering karst potential, known wells, contaminant sources, and hydrogeologic properties in the Dubuque County area. The data layers are hosted by the Iowa Geological and Water Survey. Data are stored in the Web Mercator projection (EPSG:102113) and are available as ESRI ArcGIS Server layers at http://programs.iowadnr.gov/arcgis/rest/services/Projects/Geothermal/MapServer. This dataset is delivered as an ESRI file geodatabase. The data were provided by the Iowa Geological and Water Survey under the AASG Geothermal Data project for distribution.
https://www.energy.ca.gov/conditions-of-use
Clean Transportation Program Data 2022. The Clean Transportation Program (also known as the Alternative and Renewable Fuel and Vehicle Technology Program) invests up to $100 million annually in a broad portfolio of transportation and fuel transportation projects throughout the state. The Energy Commission leverages public and private investments to support adoption of cleaner transportation powered by alternative and renewable fuels. The program plays an important role in achieving California's ambitious goals on climate change, petroleum reduction, and adoption of zero-emission vehicles, as well as efforts to reach air quality standards. The program also supports the state's sustainable, long-term economic development. Data within this application was last updated in August 2024. For more information on the Clean Transportation Program, visit: https://www.energy.ca.gov/programs-and-topics/programs/clean-transportation-program
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
These line shapefiles trace apparent topographic and air-photo lineaments in various counties in Colorado. They were made in order to identify possible fault and fracture systems that might be conduits for geothermal fluids, as part of a DOE reconnaissance geothermal exploration program. Geothermal fluids commonly utilize faults and fractures in competent rocks as conduits for fluid flow. Geothermal exploration involves finding areas of high near-surface temperature gradients, along with a suitable "plumbing system" that can provide the necessary permeability. Geothermal power plants can sometimes be built where temperature and flow rates are high. This line shapefile is an attempt to use desktop GIS to delineate possible fault and fracture orientations and locations in highly prospective areas prior to an initial site visit. Geochemical sampling and geologic mapping could then be centered around these possible faults and fractures. To do this, georeferenced topographic maps and aerial photographs were utilized in an existing GIS, using ESRI ArcMap 10.0 software. The USA_Topo_Maps and World_Imagery map layers were chosen from the GIS server at server.arcgisonline.com, using a UTM Zone 13 NAD27 projection. This line shapefile was then constructed over features that appeared to be through-going structural lineaments in both the aerial photographs and the topographic layers, taking care to avoid manmade features such as roads, fence lines, and utility rights-of-way. Still, it is unknown what actual features these lineaments, if they exist, represent. Although the shapefiles are arranged by county, not all areas within any county have been examined for lineaments. Work was focused on satellite thermal infrared anomalies, known hot springs or wells, or other evidence of geothermal systems. Finally, lineaments may be displaced somewhat from their actual location, due to such factors as shadow effects with low sun angles in the aerial photographs.
Credits: These lineament shapefiles were created by Geothermal Development Associates, as part of a geothermal geologic reconnaissance performed by Flint Geothermal, LLC, of Denver, Colorado. Use Limitation: These shapefiles were constructed as an aid to geothermal exploration in preparation for a site visit for field checking. We make no claims as to the existence of the lineaments, or their location, orientation, and/or nature.
For the full FGDC metadata record, please click here. These data have been created to represent General/Suggested Oil Spill Protective Booming Strategies designed to protect areas that are environmentally and economically sensitive to oil and hazardous material spills (Oil Spill Sensitive Areas). These data were originally created and assembled by the NOAA Scientific Support Coordinator for US Coast Guard District Seven circa 1992-1993, in cooperation with local Area Committees, in accordance with regulations set forth by the National Response Plan of the Oil Pollution Act of 1990. They were provided to FWC-FWRI (Florida Fish and Wildlife Conservation Commission - Fish and Wildlife Research Institute; at that time known as the Florida Marine Research Institute) in the fall of 2003 as paper maps and PDF maps for each of the US Coast Guard's Marine Safety Office (MSO) Areas of Responsibility (Captain of the Port Zones for Miami (at that time consisting of both what are now known as Sector Miami and Sector Key West), Tampa (now Sector Saint Petersburg), Jacksonville, Savannah, Charleston, and San Juan (Puerto Rico/US Virgin Islands)). In 1999-2000, FWC-FWRI began the process of digitizing the boom strategies depicted on these paper and PDF maps into arc (line) shapefiles, beginning with the maps from MSO Tampa, followed by MSO Miami, then MSO Jacksonville. In the winter and spring of 2003-2004, FWC-FWRI map-joined these data to expand and improve upon the database so it could be used as a core business data layer for the Marine Resources Geographic Information System (MRGIS) library. Using various spatial coding functions, such as "calculate length" and "build geometry", additional attribute information has been added to the spatial database to generate length in feet and meters for summary and reporting purposes. An example of where this can be useful: when performing a spatial selection, a summary of the total length of boom can be easily generated.
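The length-attribution step described above amounts to summing segment lengths along each boom polyline in a projected coordinate system; a pure-Python sketch under that assumption (the coordinates below are illustrative, and real data would use GIS tooling against State Plane or UTM coordinates):

```python
import math

def polyline_length_m(coords):
    """Planar length of a polyline given projected (x, y) coordinates in metres."""
    return sum(math.dist(a, b) for a, b in zip(coords, coords[1:]))

# Hypothetical boom strategy: two segments of 500 m and 600 m.
boom = [(0.0, 0.0), (300.0, 400.0), (300.0, 1000.0)]
length_m = polyline_length_m(boom)
length_ft = length_m / 0.3048  # international feet, for the feet attribute
print(round(length_m, 1), round(length_ft, 1))
```

Summing these per-feature lengths over a spatial selection gives the total boom length the text mentions.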
These data are maintained as a part of the MRGIS Library and used with automated map production software to create new printed Geographic Response Plan maps for spill contingency planning and response purposes. Through the years of 2008-2009, FWC-FWRI partnered with the US Coast Guard and Florida Department of Environmental Protection - Bureau of Emergency Response to conduct a series of workshops to review and update these detailed Geographic Response Plan (GRP) data and maps for revised Digital Area Contingency Plans. The GRP revision workshop attendees were from or determined by the specific Area Committee of each Sector. The process of data entry and maintenance is ongoing at FWRI as of July 2011. Data will be entered and undergo quality assurance/quality control processes before new maps are re-produced for distribution and inclusion into Digital Area Contingency Plans and other GIS and/or map products. A versioned geodatabase has been created in SQL/SDE to track changes and manage data entry as well as digital QA/QC processes, such as consistency checks. A map service has also been created that is available to all the public and stakeholder community to view the latest version of this geodata. The map service displays data directly from the Enterprise versioned database.
The spatial data is used to produce response maps and in a GIS (The Florida Marine Spill Analysis System and Digital Area Contingency Plans) to provide timely, accurate, and valuable information to oil spill responders. Maps are produced (as PDF) with the sensitive area sites and protective boom strategies depicted on them. The maps are then "hyperlinked" in PDF to the sensitive area detail data sheets that contain the attribute data for the site in a data report form. The report form contains information on key stakeholders for the area, wildlife resources to be protected, nearby staging areas, recommended protection strategies (a verbal description of the booming strategy depicted on the map), the latitude/longitude of the site, and other response related information needed by first responders. The Booming Strategies have been developed by professional oil spill responders who have participated in the Geographic Response Plan Revision Workshops described. Please see process steps for more information about the history of the GRP revision workshops. NOTE: Booming Strategies were not done at the Sector Mobile (USCG District 7) GRP Workshop. These have been compiled from approved booming strategies related to the Deepwater Horizon oil spill response and are NOT YET approved as "Official Area Contingency Plan" booming strategies.
https://spdx.org/licenses/CC0-1.0.html
The nine-banded Armadillo (Dasypus novemcinctus) is the only species of armadillo in the United States and alters ecosystems by excavating extensive burrows used by many other wildlife species. Relatively little is known about its habitat use or population densities, particularly in developed areas, which may be key to facilitating its range expansion. We evaluated Armadillo occupancy and density in relation to anthropogenic and landcover variables in the Ozark Mountains of Arkansas along an urban-to-rural gradient. Armadillo detection probability was best predicted by temperature (positively) and precipitation (negatively). Contrary to expectations, occupancy probability of Armadillos was best predicted by slope (negatively) and elevation (positively) rather than any landcover or anthropogenic variables. Armadillo density varied considerably between sites (ranging from a mean of 4.88 - 46.20 Armadillos per km2) but was not associated with any environmental or anthropogenic variables. Methods Site Selection Our study took place in Northwest Arkansas, USA, in the greater Fayetteville metropolitan area. We deployed trail cameras (Spypoint Force Dark cameras; Spypoint Inc., Victoriaville, Quebec, Canada; and Browning Strikeforce XD cameras; Browning, Morgan, Utah, USA) over the course of two winter seasons, December 2020-March 2021 and November 2021-March 2022. We sampled 10 study sites in year one and 12 study sites in year two. All study sites were located in the Ozark Mountains ecoregion in Northwest Arkansas. Sites were all oak-hickory-dominated hardwood forests at similar elevations (213.6-541 m). Devils Eyebrow and the Ozark Natural Science Center (ONSC) are public natural areas managed by the Arkansas Natural Heritage Commission (ANHC). Devil's Den and Hobbs are managed by the Arkansas state park system. Markham Woods (Markham), Ninestone Land Trust (Ninestone), and Forbes are all privately owned, though Markham has a publicly accessible trail system throughout the property. Lake Sequoyah, Mt. Sequoyah Woods, Kessler Mountain, Lake Fayetteville, and Millsaps Mountain are all city parks managed by the city of Fayetteville. Lastly, both Weddington and White Rock are natural areas within Ozark National Forest and managed by the U.S. Forest Service. We sampled 5 sites in both years of the study: Devils Eyebrow, Markham Hill, Sequoyah Woods, ONSC, and Kessler Mountain. We chose our study sites to represent a gradient of human development, based primarily on anthropogenic noise values (Buxton et al. 2017, Mennitt and Fristrup 2016). We chose open spaces that were large enough to accommodate camera-trap research while representing an array of anthropogenic noise values. Since anthropogenic noise is able to permeate into natural areas within the urban interface, introducing human disturbance that may not be detected by other layers such as impervious surface and housing unit density (Buxton et al. 2017), we used dB values for each site as an indicator of the level of urbanization. Camera Placement We sampled ten study sites in the first winter of the study. At each of the 10 study sites, we deployed between 5 and 15 cameras. Larger study areas received more cameras than smaller sites because all cameras were deployed a minimum of 150 m from one another. We avoided placing cameras on roads, trails, and water sources so as not to artificially bias wildlife detections, and we avoided placing cameras within 15 m of trails to avoid detecting humans. At each of the 12 study areas we surveyed in the second winter season, we deployed 12 to 30 cameras. At each study site, we used ArcGIS Pro (Esri Inc., Redlands, CA) to delineate the trail systems and then created a 150 m buffer on each side of the trail. We then created random points within these buffered areas to decide where to deploy cameras.
Each random point had to occur within the buffered areas and be a minimum of 150 m from the next nearest camera point; thus the number of cameras at each site varied with site size. We placed all cameras within 50 m of the random points to ensure that cameras were deployed on safe topography and with a clear field of view, though cameras were not set in locations that would have increased animal detections (game trails, water sources, burrows, etc.). Cameras were rotated between sites at 5- or 10-week intervals to allow us to maximize camera locations with the limited number of trail cameras available to us. Sites with more than 25 cameras were active for 5 consecutive weeks, while sites with fewer than 25 cameras were active for 10 consecutive weeks. We placed all cameras on trees or tripods 50 cm above ground and at least 15 m from trails and roads. We set cameras to take a burst of three photos when triggered. We used Timelapse 2.0 software (Greenberg et al. 2019) to extract metadata (date and time) associated with all animal detections. We manually identified all species occurring in photographs and counted the number of individuals present. Because density estimation requires the calculation of detection rates (number of Armadillo detections divided by the total sampling period), we wanted to reduce double counting of individuals. Therefore, we grouped photographs of Armadillos into "episodes" of 5 minutes in length to reduce double counting individuals that repeatedly triggered cameras (DeGregorio et al. 2021, Meek et al. 2014). A 5-minute threshold is relatively conservative, with evidence that even 1-minute episodes adequately reduce double counting (Meek et al. 2014). Landcover Covariates To evaluate occupancy and density of Armadillos based on environmental and anthropogenic variables, we used ArcGIS Pro to extract variables from 500 m buffers placed around each camera (Table 2).
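The 5-minute episode grouping described above can be sketched as follows; the grouping rule here (a new episode starts when a detection falls more than 5 minutes after the episode's first detection) is one plausible reading of the method, and the timestamps are illustrative:

```python
from datetime import datetime, timedelta

def group_into_episodes(timestamps, window_minutes=5):
    """Group detection timestamps into episodes: a new episode starts whenever
    a detection is more than window_minutes after the current episode's first
    detection. Returns a list of episodes (lists of timestamps)."""
    episodes = []
    window = timedelta(minutes=window_minutes)
    for t in sorted(timestamps):
        if episodes and t - episodes[-1][0] <= window:
            episodes[-1].append(t)
        else:
            episodes.append([t])
    return episodes

detections = [
    datetime(2021, 1, 5, 22, 0, 0),
    datetime(2021, 1, 5, 22, 2, 30),   # within 5 min: same episode
    datetime(2021, 1, 5, 22, 9, 0),    # beyond 5 min: new episode
]
print(len(group_into_episodes(detections)))  # 2
```

The episode count, rather than the raw photo count, then feeds the detection-rate calculation.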
This spatial scale has been shown to hold biological meaning for Armadillos and similarly sized species (DeGregorio et al. 2021, Fidino et al. 2016, Gallo et al. 2017, Magle et al. 2016). At each camera, we extracted elevation, slope, and aspect from the base ArcGIS Pro map. We extracted maximum housing unit density (HUD) using the SILVIS housing layer (Radeloff et al. 2018, Table 2). We extracted anthropogenic noise from the layer created by Mennitt and Fristrup (2016; Buxton et al. 2017, Table 2) and used the "L50" anthropogenic sound level estimate, which was calculated by taking the difference between predicted environmental noise and the calculated noise level. We therefore assume that higher levels of L50 sound correspond to higher human presence and activity (i.e., voices, vehicles, and other sources of anthropogenic noise; Mennitt and Fristrup 2016). We derived the area of developed open landcover, forest area, and distance to forest edge from the 2019 National Land Cover Database (NLCD; Dewitz 2021, Table 2). Developed open landcover refers to open spaces with less than 20% impervious surface, such as residential lawns, cemeteries, golf courses, and parks, and has been shown to be important for medium-sized mammals (Gallo et al. 2017, Poessel et al. 2012). Forest area was calculated by combining all forest types within the NLCD layer (deciduous forest, mixed forest, coniferous forest) and summarizing the total area (km2) within the 500 m buffer. Distance to forest edge was derived by creating a 30 m buffer on each side of all forest boundaries and calculating the distance from each camera to the nearest forest edge. We calculated distance to water by combining the waterbody and flowline features in the National Hydrography Dataset (U.S. Geological Survey) for the state of Arkansas to capture both permanent and ephemeral water sources that may be important to wildlife.
We measured the distance to water and distance to forest edge using the geoprocessing tool "near" in ArcGIS Pro, which calculates the Euclidean distance between a point and the nearest feature. We extracted Average Daily Traffic (ADT) from the Arkansas Department of Transportation database (Arkansas GIS Office). The maximum value for ADT was calculated using the Summarize Within tool in ArcGIS Pro. We tested for correlation between all covariates using a Spearman correlation matrix and removed any variable with correlation greater than 0.6. Pairwise comparisons between distance to roads and HUD, and between distance to forest edge and forest area, were both correlated above 0.6; therefore, we dropped distance to roads and distance to forest edge from analyses, as we predicted that HUD and forest area would have larger biological impacts on our focal species (Kretser et al. 2008). Occupancy Analysis To better understand habitat associations while accounting for imperfect detection of Armadillos, we used occupancy modeling (MacKenzie et al. 2002). We used a single-species, single-season occupancy model (MacKenzie et al. 2002) even though we had two years of survey data at 5 of the study sites. We chose this rather than a multi-season dynamic occupancy model because most sites were not sampled in both years of the study. Even for sites that were sampled in both years, cameras were not placed in the same locations each year. We therefore combined all sampling into one single-season model, created unique site-by-year combinations as our sampling locations, and used year as a covariate to explore changes in occupancy associated with the year of study. For each sampling location, we created a detection history with 7-day sampling periods, allowing presence/absence data to be recorded at each site for each week of the study.
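The covariate screening step described above (drop one variable of any pair with Spearman correlation above 0.6) can be sketched with pandas; the column names and values here are illustrative stand-ins for the study's covariates, not its actual data:

```python
import pandas as pd

def drop_correlated(df, threshold=0.6):
    """Keep columns greedily, dropping any column whose absolute Spearman
    correlation with an already-kept column exceeds the threshold."""
    rho = df.corr(method="spearman").abs()
    keep = []
    for col in df.columns:
        if all(rho.loc[col, k] <= threshold for k in keep):
            keep.append(col)
    return df[keep]

# Illustrative covariates: 'hud' and 'dist_road' are perfectly inversely related.
df = pd.DataFrame({
    "hud":       [1, 2, 3, 4, 5, 6],
    "dist_road": [6, 5, 4, 3, 2, 1],
    "slope":     [2, 9, 4, 1, 7, 3],
})
print(list(drop_correlated(df).columns))  # ['hud', 'slope']
```

In the study the choice of which variable to drop was biological (HUD and forest area retained), not positional as in this greedy sketch.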
This allowed for 16 survey periods between 01 December 2020, and 11 March 2021 and 22 survey periods between 01 November 2021 and 24 March 2022. We treated each camera as a unique survey site, resulting in a total of 352 sites. Because not all cameras were deployed at the same time and for the same length of time, we used a staggered entry approach. We used a multi-stage fitting approach in which we
Below is a quick rundown of the tools available in the web map! The first new thing you may notice is the ability to search from within the splash window that appears. This hopefully reduces the number of clicks people will need to get to their information. There's the same search bar in the upper left once you click out of the splash screen. The Query tool has existed in this form on the sub-maps, but now it is here with all the layers. I want to highlight "Search by Legal Description" as a nifty way to find parcels associated with a specific subdivision. I also want to highlight the "find tax parcels/addresses within specified distance" queries. Those let you select every tax parcel or address within a feature you draw (a point, line, or polygon). This is good for finding which properties within a distance need to be notified of something. The results can then be exported as an Excel-compatible table (CSV). This can also help you identify whether something falls within certain setbacks. The Basemaps tool is the same as it was before. I haven't gotten the Virginia Geographic Information Network imagery from 2017 and 2021 to successfully appear here, but you can find that in the map layers at the bottom. We have a lot of data layers! I currently have the default as every group expanded out, so you can scroll and see all the layers, but you can go through and click to collapse any groups you don't want expanded. Okay, the select tool is super cool, and lets you really dive into some fun GIS attribute querying! As an example, you can select all the FEMA Flood Zones that are AO, then select all the tax parcels that are affected by (intersect) those AO zones! These results can also be exported into an Excel-compatible table. A great deal of GIS analysis is possible just using Select by Attributes and Select by Location, so this tool really ramps up the power of the web map so it can do some of what the desktop GIS software can do! Continuing our tour of the tools, we come to the coordinates tool.
This one also existed already in the sub-maps, but is now with all the layers. Unfortunately, the tool is a little annoying, and won't retain my defaults. You have to click the little plus sign target thing, then you can click on the map to get the coordinates. The coordinate system defaults to WGS 1984 Web Mercator (the same thing Google Maps uses), but much of our data uses NAD 1983 State Plane Virginia South, so you can click the dropdown arrow to the right to select either one. Exciting news related to this: in 2026 they are releasing the new coordinate system on which they've been working! It should make the data in GIS more closely align with features in reality, but you will not need to change any of the ways you interact with the data. The next tool is the Elevation Profile tool. It's very nifty! You can draw a profile to see how the elevation changes, and as you move your cursor along the graph, it shows where along your transect you are! It helps explain some of the floodplain and sea level rise boundaries. You know the measure tool well, but this one retains the defaults in feet and acres, which is very exciting! No more having to change the units every time you want to measure (unless you want other than feet and acres). The draw tool is our penultimate stop on the tour! It is largely the same as what existed on the old public web map, so I shan't delve into it here. When you draw a feature now though, it appears in the layers tab (until you close the map), which can let you toggle the drawing on and off to work with what is beneath it. It can help as you plan in where you might want to put new constructions. The print tool is also largely the same, but I've been finding the tool in this new Experience Builder format is less buggy than the one in the retired Web App Builder that made the old Public Web Map.
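The Select by Attributes then Select by Location workflow from the tour (pick the AO flood zones, then find the parcels that intersect them) can be sketched in pure Python; the shapes here are simplified to axis-aligned rectangles and the zone/parcel records are made up for illustration, whereas the web map does this against real geometries with ArcGIS tooling:

```python
def intersects(a, b):
    """Axis-aligned bounding boxes as (xmin, ymin, xmax, ymax)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

# Illustrative data: FEMA zones with a designation, and tax parcels.
zones = [
    {"zone": "AO", "bbox": (0, 0, 10, 10)},
    {"zone": "AE", "bbox": (20, 20, 30, 30)},
]
parcels = [
    {"parcel_id": "P1", "bbox": (5, 5, 8, 8)},
    {"parcel_id": "P2", "bbox": (25, 25, 28, 28)},
]

# Select by attribute: AO zones only.
ao_zones = [z for z in zones if z["zone"] == "AO"]
# Select by location: parcels intersecting any selected zone.
affected = [p for p in parcels
            if any(intersects(p["bbox"], z["bbox"]) for z in ao_zones)]
print([p["parcel_id"] for p in affected])  # ['P1']
```

The list of affected parcels is what the web map lets you export to a CSV table.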
Last Rev. 01/24/08 - E. Foster, P.E. - FSU/BSRC

The Historic Shoreline Database on the Web contains many directories of related types of information about beach changes in Florida over the past 150 or so years. The historic shoreline map images (see the Drawings directory) show precision-digitized approximate mean high water (mhw) shorelines from the US government coastal topographic maps listed in the associated map bibliography files (see the Sourcebibs directory). These generally show data extending from the mid-to-late 1800s to the mid-to-late 1970s. The mhw positions have been extracted and tabulated (see the MWHfiles directory) relative to fixed reference "R" points along the beach, spaced approximately 1,000 feet (300 meters) apart. Reference points that do not correspond to actual "in the ground" survey markers are virtual "V" points. Mean high water positions have been, and continue to be, extracted from FDEP beach profile surveys from the 1970s through the present and added to the tables. The beach profile data files from which mhw data have been extracted can be found in the ProfileData directory and visually (for many areas) in the ClickOnProfiles directory. The beach profile files include elevation information along the entire length of the profiles. This profile data set has undergone up to fifteen additional quality-control checks to ensure accuracy, reliability, and consistency with the historic database coordinate and bearing set. Note that any data deeper than wading depth have not yet undergone any extra quality-control checks. Note also that there are *.cod text files of notes associated with the review of the profile data files.

The digital historic shoreline map image files are given in a DWG AutoCAD-based format, which should be usable in most versions as well as many GIS systems. The Florida State Plane 1927/79-adjusted and 1983/90 horizontal coordinate systems are used.
These are not metric systems, but with the proper software they can be converted to whatever systems you may need. Each map image DWG file contains many layers, documented in an ASCII layer list archived with the DWG file.

The database has been maintained and greatly expanded by E. Foster since approximately 1987 and by N. Nguyen since 1995. The initial map digitizing effort was done for FDEP at Florida State University, primarily by S. Demirpolat. Final processing and editing of the original map files to make them user-friendly was performed by N. Nguyen and E. Foster in 1995-97. Extensive quality-control and update work has been performed by E. Foster since 1987 and by N. Nguyen since 1995. Field profile surveys have been performed by the FDEP Coastal Data Acquisition section since the early 1970s, and by a number of commercial surveyors in recent years.

The formats of the mhw tables and profile files are explained in text files included in the respective directories.

Note that the digitized map image files were originally created in the UTM coordinate system on Intergraph equipment. The translation from UTM to the State Plane coordinate systems has resulted in some minor textual and other visual shifts in the northwest Florida area map image files.

The dates in the map legends in the map images are generally composite dates. It is necessary to use the mhw data tables and map bibliographies for accurate dates at any specific location. The date ranges in the data tables relate to specific information given in the map bibliography files.

Generally it may be assumed that the historic shorelines have been digitized as carefully as possible from the source maps. If a historic shoreline does not contain a systematic position error and is feasible in a physical sense, the accuracy of the mhw position is estimated at plus or minus 15 to 50 feet (5 to 15 m), depending on the source and scale. This is as a position in time, NOT as an average mhw position.
Data added from field surveys are estimated at plus or minus 10 feet (3 m) or better.

Note that from the 1920s onward, aerial photographs have usually been the basis of the US government's coastal topographic maps; prior to that, the method was plane-table surveying. Along higher-wave-energy coasts, especially the Florida east coast, if there was significant wave activity in the source photography, it is very possible that the mhw was mapped in a more landward location than was probably correct. Alternatively, the use of photography sets with excessive sun glare may have caused the mhw to be mapped in a more seaward location than was probably correct. These effects have been frequently observed in comparisons of close-in-time FDEP controlled aerial photography with FDEP profile surveys. The use of some photography sets containing high wave uprush or sun glare is probable within the historic data. For example, on the east coast the 1940s-series maps tend to show the mhw more seaward than expected, possibly due to sun glare, and the 1960s-series maps tend to show the mhw more landward than expected; in the latter case, the effect may be due to the 1960s being a decade of frequent storms. The analyst should be aware that some of these effects may exist in the historic data. A questionable historic shoreline is NOT necessarily one to be discarded, just one to be considered with allowance for its potential limitations.

Using this database, it can readily be observed that the historic trends in shoreline evolution are very consistent with behavior expected from the longshore transport equation, well known to coastal engineers. This is a non-linear equation, so shoreline change can be expected to be linear or constant only in certain situations. It is NOT recommended that any analyst arbitrarily assume constant or linear shoreline change rates over long periods of time; this is often done but is not supported by the evidence.
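The caution above against assuming a constant change rate can be illustrated numerically. The mhw-position history below for a single "R" reference point is entirely hypothetical (the real values live in the mhw tables), but it shows how an end-to-end average rate can hide reversals that the per-interval rates reveal.

```python
# Hypothetical mhw history at one reference point:
# (survey year, mhw position in feet seaward of the monument).
history = [(1870, 120.0), (1930, 180.0), (1975, 150.0), (2005, 210.0)]

def rate(p0, p1):
    """Average shoreline change rate (ft/yr) between two surveys."""
    (y0, d0), (y1, d1) = p0, p1
    return (d1 - d0) / (y1 - y0)

# The end-to-end rate looks like steady accretion...
overall = rate(history[0], history[-1])

# ...but the interval rates differ in sign and magnitude.
intervals = [rate(a, b) for a, b in zip(history, history[1:])]
print(overall)
print(intervals)
```

Extrapolating the single end-to-end rate forward would miss both the erosional episode and the recent faster accretion, which is exactly the pitfall the text warns about.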
The three primary factors controlling shoreline change are sand supply, wave climate, and local geographic features. In some parts of Florida, major storms since 1995 have also become important factors.
This webmap features the USGS GAP application of the vegetation cartography design based on NVCS mapping being done at the Alliance level by the California Native Plant Society (CNPS), the California Dept of Fish and Game (CDFG), and the US National Park Service, combined with Ecological Systems-level mapping being done by USGS GAP, Landfire, and Natureserve. Although the latter use three different approaches to mapping, this project adopted a common cartography and a common master crossover so that they can be used interchangeably as complements to the detailed NVCS Alliance and Macrogroup mapping being done in California by CNPS and the California Dept of Fish & Wildlife (CDFW).

A primary goal of this project was to develop ecological layers to use as overlays on top of high-resolution imagery, in order to help interpret and better understand the natural landscape. You can see the source national GAP rasters by clicking on either of the "USGS GAP Landcover Source RASTER" layers at the bottom of the contents list.

Using polygons has several advantages. Polygons are how most conservation plans and land decisions/management are done, so polygon-based outputs are more directly usable in management and planning. Unlike rasters, polygons permit webmaps with clickable links that provide additional information about each ecological community. At the analysis level, polygons allow the vegetation/ecological systems depicted to be enriched with additional ecological attributes for each polygon from multiple overlay sources, be they raster or vector. In this map, the "Gap Mac base-mid scale" layers are enriched with links to USGS/USNVC macrogroup summary reports, and the "Gap Eco base scale" layers are enriched with links to the Natureserve Ecological Systems summary reports.

Comparison with finer-scale ground ecological mapping is provided by the "Ecol Overlay" layers of Alliance and Macrogroup mapping from CNPS/CDFW. The CNPS Vegetation
Program has worked for over 15 years to provide standards and tools for identifying and representing vegetation as an important feature of California's natural heritage and biodiversity. Many knowledgeable ecologists and botanists support the program as volunteers and paid staff. Through grants, contracts, and grass-roots efforts, CNPS collects field data and compiles the information into reports, manuals, and maps on California's vegetation, ecology, and rare plants in order to better protect and manage them. We provide these services to governmental, non-governmental, and other organizations, and we collaborate on vegetation resource assessment projects around the state. CNPS is also the publisher of the authoritative Manual of California Vegetation, which is available for purchase. To support the work of CNPS, please consider becoming a member!

The CDFG Vegetation
Classification and Mapping Program develops and maintains California's expression of the National Vegetation Classification System. We implement its use through assessment and mapping projects in high-priority conservation and management areas, through training programs, and through continuous work on best management practices for field assessment, classification of vegetation data, and fine-scale vegetation mapping.

HOW THE OVERLAY LAYERS WERE CREATED:

Nserve and GapLC Sources:
Early shortcomings in the NVC standard led to Natureserve's development of a mid-scale, mapping-friendly "Ecological Systems" standard roughly corresponding to the "Group" level of the NVC, which facilitated NVC-based mapping of entire continents. Current scientific work is leading to the incorporation of Ecological Systems into the NVC as group and macrogroup concepts are revised. The Natureserve and Gap Ecological Systems layers differ slightly even though both were created from 30 m Landsat data and both follow the NVC-related Ecological Systems classification curated by Natureserve.

In either case, the vector overlay was created by first enforcing a 0.3 ha minimum mapping unit, which required deleting any classes consisting of fewer than 4 contiguous Landsat cells, whether side-to-side or cornerwise. This avoided the statistical problem of numerous single-cell classes whose types seemed improbable given their matrix and which would have been inaccurate to use as an n=1 sample compared to the weak but usable n=4 sample. A primary goal in this elimination was to preserve riparian and road features that might be only one pixel wide, hence the use of cornerwise-contiguous groupings. Eliminated cell groups were absorbed into whichever neighboring class they shared the longest boundary with. The remaining raster groups were vectorized with light simplification to smooth out the stair-step patterns of raster data and, hopefully, improve the fidelity of the boundaries with the landscape. The resultant vectors show a range of fidelity with the landscape; where there is less apparent fidelity, remember that ecosystems are normally classified by a mixture of visible and non-visible characteristics, including soil, elevation, and slope, and that boundaries can be assigned based on the difference between 10% and 20% shrub cover.

Often large landscape areas would create "godzilla" polygons of more than 50,000 vertices, which can affect performance. These were eliminated using SIMPLIFY POLYGONS to increase vertex spacing from 30 m to 50-60 m where possible. Where not possible, DICE was used, which bisects all large polygons with arbitrary internal divisions until no polygon has more than 50,000 vertices. To create the midscale layers, ecological systems were dissolved into the macrogroups they belong to and resymbolized on macrogroup. This was another frequent source of godzillas as larger landscape units were delineated, so SIMPLIFY and DICE were run again. Where the base ecological system tiles could only be served up by individual partition tile, macrogroups typically exhibited a 10:1 or 20:1 reduction in feature count, allowing them to be assembled into single integrated map services by region, e.g. NW, SW.

CNPS / CDFW / National Park Service Sources: (see also the base service definition page)

Unlike the Landsat-based raster
modelling of the Natureserve and Gap national ecological systems, the CNPS/CDFW/NPS data date back to the origin of the National Vegetation Classification effort to map the US national parks in the mid-1990s. These mapping efforts are a hybrid of photo-interpretation, satellite imagery, and corollary data used to create draft ecological land units, which are then sampled by field crews and traditional vegetation plot surveys to quantify and analyze vegetation composition and distribution into the final vector boundaries of the formal NVC classes identified and classified. As such, these are much more accurate maps, but the tradeoff is that they are done one field project area at a time, so there is not yet national or even statewide coverage of these detailed maps. However, with almost two-thirds of California already mapped, that time is approaching.

The challenge in creating standard map layers for this wide diversity of projects over the two decades since NVC began is the extensive evolution of the NVC standard itself, as well as evolution in field techniques and tools. To create a consistent set of map layers, a master crosswalk table was built using every classification known at the time each map was created, then crosswalking each, as best as could be done, into a master list of the currently accepted classifications. This field is called "NVC_NAME" in each of these layers, and it contains a mixture of scientific and common names at many levels of the classification, from association to division: whatever the ecologists were able to determine at the time. For further precision, this field is split out into scientific-name equivalents and common-name equivalents.

MAP LAYER NAMING:

The data sublayers in this webmap are all based on the
US National Vegetation Classification, a partnership of the USGS GAP program, US Forest Service, Ecological Society of America, and Natureserve, with adoption and support from many federal and state agencies and nonprofit conservation groups. The USNVC grew out of the US National Park Service Vegetation Mapping Program, a mid-1990s effort led by The Nature Conservancy, Esri, and the University of California. The classification standard is now an international standard, with associated ecological mapping occurring around the world. The NVC is a hierarchical taxonomy of 8 levels, from the top down: Class, Subclass, Formation, Division, Macrogroup, Group, Alliance, Association.

The layers in this webmap represent 4 distinct programs: 1. The California Native Plant Society/Calif Dept of Fish & Wildlife Vegetation Classification and Mapping Program (full description at the CNPS MS10 Service Registration Page and CNPS MS10B Service Registration Page). 2. USGS Gap Protected Areas Database (full description at the PADUS registration page). 3. USGS Gap Landcover (full description below). 4. Natureserve Ecological Systems (full description below).

LAYER NAMING: All layer names follow this pattern: Source - Program - Level - Scale - Region

Source - Program = who created the data: Nserve = Natureserve; GapLC = USGS Gap Program Landcover Data; PADUS = USGS Gap Protected Areas of the USA program; Cnps/Cdfw = California Native Plant Society/Calif Dept of Fish & Wildlife, often followed by a project name such as SFhill = Sierra Foothills, Marin Open Space, or MMWD = Marin Municipal Water District. National parks are included and may be named by their standard 4-letter code, e.g. YOSE = Yosemite, PORE = Point Reyes.

Level = the level in the NVC hierarchy on which the layer is based: Base = Alliances and Associations; Mac = Macrogroups.
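As a footnote to the overlay-creation steps described above, the minimum-mapping-unit rule (delete 8-connected groups of fewer than four cells, counting side-to-side or cornerwise contiguity, and absorb them into the neighboring class sharing the longest boundary) can be sketched on a toy grid. The grid and class codes below are hypothetical.

```python
# Sketch of the 0.3 ha minimum-mapping-unit cleanup on a class-code grid.

def enforce_mmu(grid, min_cells=4):
    """Reassign 8-connected same-class groups smaller than min_cells."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    seen = set()
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen:
                continue
            # Flood-fill the 8-connected (side or corner) component at (r, c).
            cls = out[r][c]
            comp, stack = [], [(r, c)]
            seen.add((r, c))
            while stack:
                cr, cc = stack.pop()
                comp.append((cr, cc))
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and (nr, nc) not in seen
                                and out[nr][nc] == cls):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
            if len(comp) >= min_cells:
                continue
            # Count edge-adjacent (shared-boundary) cells by neighboring class.
            counts = {}
            for cr, cc in comp:
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = cr + dr, cc + dc
                    if 0 <= nr < rows and 0 <= nc < cols and out[nr][nc] != cls:
                        counts[out[nr][nc]] = counts.get(out[nr][nc], 0) + 1
            if counts:
                # Absorb into the class with the longest shared boundary.
                absorb = max(counts, key=counts.get)
                for cr, cc in comp:
                    out[cr][cc] = absorb
    return out

grid = [
    [1, 1, 1, 1],
    [1, 2, 1, 1],
    [1, 1, 3, 3],
    [1, 1, 3, 3],
]
print(enforce_mmu(grid))  # the lone class-2 cell is absorbed into class 1
```

The cornerwise (8-connected) grouping is what lets one-pixel-wide riparian or road features survive the cleanup, exactly as described in the overlay-creation notes.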
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically.

Polygon layer representing United States counties with name attributes.

About Natural Earth

Natural Earth is a convenient resource for creating custom maps. Unlike other map data intended for analysis or detailed government mapping, it is designed to meet the needs of cartographers and designers making generalized maps; maximum flexibility is a goal. Natural Earth is a public domain collection of map datasets available at 1:10 million (larger scale/more detail), 1:50 million (medium scale/moderate detail), and 1:110 million (small scale/coarse detail). It features tightly integrated vector and raster data for creating a variety of visually pleasing, well-crafted maps with cartography or GIS software. Natural Earth is made possible by many volunteers and supported by the North American Cartographic Information Society (NACIS).

Convenience – Natural Earth solves a problem: finding suitable data for making small-scale maps. At a time when the web is awash in geospatial data, cartographers are forced to waste time sifting through confusing tangles of poorly attributed data to make clean, legible maps. Because your time is valuable, Natural Earth data comes ready to use.

Neatness Counts – The carefully generalized linework maintains consistent, recognizable geographic shapes at 1:10m, 1:50m, and 1:110m scales. Natural Earth was built from the ground up, so you will find that all data layers align precisely with one another. For example, where rivers and country borders are one and the same, the lines are coincident.

GIS Attributes – Natural Earth is more than just a collection of pretty lines; the data attributes are equally important for mapmaking. Most data contain embedded feature names, which are ranked by relative importance. Other attributes facilitate faster map production, such as width attributes assigned to river segments for creating tapers.

Intelligent Data – The attributes assigned to Natural Earth vectors make for efficient mapmaking.
Most lines and areas contain embedded feature names, which are ranked by relative importance. Up to eight rankings per data theme allow easy custom map "mashups" that emphasize your subject while de-emphasizing reference features. Other attributes focus on map design; for example, width attributes assigned to rivers allow you to create tapered drainages, and assigning different colors to contiguous country polygons is another task made easier by the data attribution.

Other key features:

Vector features include name attributes and bounding-box extents, so you can know that the Rocky Mountains are larger than the Ozarks.

Large polygons, such as bathymetric layers, are split for more efficient data handling.

Projection-friendly vectors precisely match at 180 degrees longitude, and lines contain enough data points for smooth bending in conic projections, but not so many that processing speed suffers.

Raster data includes grayscale shaded relief and cross-blended hypsometric tints derived from the latest NASA SRTM Plus elevation data and tailored to register with Natural Earth Vector.

The data are optimized for use in web mapping applications, with built-in scale attributes that help show features at the appropriate zoom levels.
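Those scale/importance rankings lend themselves to simple filtering. The sketch below uses a hypothetical feature list with a `scalerank`-style attribute (Natural Earth uses lower values for more important features); names and rank values are invented for illustration.

```python
# Filtering features by an importance ranking, as a web map might do when
# deciding what to draw at a given zoom level. Data here is hypothetical.
features = [
    {"name": "Mississippi", "scalerank": 1},
    {"name": "Ozark tributary", "scalerank": 7},
    {"name": "Minor creek", "scalerank": 9},
]

def visible_at(features, max_rank):
    """Features important enough to draw when ranks 0..max_rank are shown."""
    return [f["name"] for f in features if f["scalerank"] <= max_rank]

print(visible_at(features, 1))  # small-scale map: major features only
print(visible_at(features, 8))  # larger scale: more detail appears
```

The same one-comparison filter is how a ranked attribute lets a single dataset serve every zoom level of a web map.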