Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains data used to test the protocol for high-resolution mapping and monitoring of recreational impacts in protected natural areas (PNAs) using unmanned aerial vehicle (UAV) surveys, Structure-from-Motion (SfM) data processing and geographic information systems (GIS) analysis to derive spatially coherent information about trail conditions (Tomczyk et al., 2023). Dataset includes the following folders:
Cocora_raster_data (~3GB) and Vinicunca_raster_data (~32GB) - a very high-resolution (cm-scale) dataset derived from UAV-generated images. Data cover selected recreational trails in Colombia (Valle de Cocora) and Peru (Vinicunca). UAV-captured images were processed using the structure-from-motion approach in Agisoft Metashape software. Data are available as GeoTIFF files in the UTM projected coordinate system (UTM 18N for Colombia, UTM 19S for Peru). Individual files are named as follows: [location]_[year]_[product]_[raster cell size].tif, where:
[location] is the place of data collection (e.g., Cocora, Vinicunca)
[year] is the year of data collection (e.g., 2023)
[product] is the type of file: DEM = digital elevation model; ortho = orthomosaic; hs = hillshade
[raster cell size] is the dimension of an individual raster cell in mm (e.g., 15mm)
Cocora_vector_data and Vinicunca_vector_data – mapping of trail tread and conditions in a GIS environment (ArcGIS Pro). Data are available as shapefiles (.shp) in the UTM projected coordinate system (UTM 18N for Colombia, UTM 19S for Peru).
Structure-from-motion processing was performed in Agisoft Metashape (https://www.agisoft.com/, Agisoft, 2023). Mapping was performed in ArcGIS Pro (https://www.esri.com/en-us/arcgis/about-arcgis/overview, Esri, 2022). Data can be used in any GIS software, including commercial (e.g., ArcGIS) or open-source (e.g., QGIS) packages.
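For batch processing, the naming convention above can be parsed programmatically. A minimal Python sketch (the regex, function name, and returned field names are illustrative, not part of the dataset):

```python
import re

# Pattern for the convention [location]_[year]_[product]_[raster cell size].tif;
# the product codes (DEM, ortho, hs) come from the dataset description above.
PATTERN = re.compile(
    r"(?P<location>[A-Za-z]+)_(?P<year>\d{4})_(?P<product>DEM|ortho|hs)_(?P<cell_mm>\d+)mm\.tif"
)

def parse_name(filename):
    """Split a raster filename into its metadata fields."""
    m = PATTERN.fullmatch(filename)
    if m is None:
        raise ValueError(f"unexpected filename: {filename}")
    return {
        "location": m["location"],
        "year": int(m["year"]),
        "product": m["product"],
        "cell_size_mm": int(m["cell_mm"]),
    }

print(parse_name("Cocora_2023_DEM_15mm.tif"))
```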
Tomczyk, A. M., Ewertowski, M. W., Creany, N., Monz, C. A., & Ancin-Murguzur, F. J. (2023). The application of unmanned aerial vehicle (UAV) surveys and GIS to the analysis and monitoring of recreational trail conditions. International Journal of Applied Earth Observation and Geoinformation, 103474. https://doi.org/10.1016/j.jag.2023.103474
Human life is precious, and in the event of any unfortunate occurrence, the highest efforts are made to safeguard it. To provide timely aid or extract humans in distress, it is critical to locate them accurately. Drones are increasingly used to detect and track humans in such situations, capturing high-resolution images during search and rescue operations. It is possible to find survivors from a drone feed, but that requires manual analysis, which is time-consuming and prone to human error. This model detects humans in drone imagery and draws bounding boxes around their locations. It is trained on the IPSAR and SARD datasets, in which humans appear on macadam roads, in quarries, in low and high grass, in forest shade, and in Mediterranean and Sub-Mediterranean landscapes. Deep learning models are highly capable of learning complex semantics and can produce superior results. Use this deep learning model to automate the task of detection, significantly reducing the time and effort required.
Using the model: Follow the guide to use the model. Before using this model, ensure that the supported deep learning libraries are installed. For more details, check Deep Learning Libraries Installer for ArcGIS.
Fine-tuning the model: This model can be fine-tuned using the Train Deep Learning Model tool.
Follow the guide to fine-tune this model.
Input: High-resolution (1-5 cm) individual drone images or an orthomosaic.
Output: Feature class containing detected humans.
Applicable geographies: The model is expected to work well in Mediterranean and Sub-Mediterranean landscapes but can also be tried in other areas.
Model architecture: This model uses the Faster R-CNN architecture implemented in the ArcGIS API for Python.
Accuracy metrics: This model has an average precision score of 82.2 percent for the human class.
Training data: This model is trained on the search and rescue datasets provided by IPSAR and SARD.
Limitations: This model tends to maximize detection of humans and errs towards producing false positives in rocky areas.
Sample results: Here are a few results from the model.
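The average-precision figure above rests on the overlap test between predicted and ground-truth boxes: a detection counts as a true positive only when its intersection-over-union (IoU) with a ground-truth box exceeds a threshold. A minimal sketch of that test (box format and the 0.5 threshold are conventional choices, not taken from this model's documentation):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A detection is typically counted as a true positive when IoU >= 0.5.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```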
Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
This remote sensing image dataset includes orthomosaics and Spartina alterniflora spatial data, derived from drone-based RGB photos over the Zhangjiang Estuary, Fujian Province, from 2013 to 2022. The drone photos were collected via automatic flight planning, mainly during daytime low-tide periods. Based on the structure-from-motion three-dimensional reconstruction technique, the drone photos from each campaign can be mosaicked into a digital orthophoto map, which is then used for extracting the spatial distribution of Spartina alterniflora. The dataset contains 2 folders including 10 orthomosaics and 10 Spartina alterniflora datasets, respectively. The file sizes are 1.7GB and 10.7MB for the orthomosaics and Spartina alterniflora data, respectively. All the data are in TIF format, and you can use GIS or remote sensing software such as ArcGIS and ENVI to open them. The orthomosaics are named "date-DOM", or, for the datasets with network RTK positioning service, "date-DOM-RTK". For example, the orthomosaic from June 2022 is named "202206-DOM-RTK.tif". Spartina alterniflora datasets are named "date-classified". The resolution of all data is 20 cm and the coordinate system is WGS84/UTM zone 50N. Different drones were used across these flights, so there is a slight deviation in positioning accuracy.
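Because the file names lead with the acquisition date, the time series can be ordered and labelled directly from the names. A small Python sketch (the date layout YYYYMM is inferred from the "202206-DOM-RTK.tif" example; the function name is illustrative):

```python
import re
from datetime import datetime

# Orthomosaic names follow "date-DOM.tif" or "date-DOM-RTK.tif",
# where the date appears to be YYYYMM (e.g. "202206-DOM-RTK.tif").
NAME = re.compile(r"(?P<yyyymm>\d{6})-DOM(?P<rtk>-RTK)?\.tif")

def describe(filename):
    """Return a human-readable month label and whether RTK positioning was used."""
    m = NAME.fullmatch(filename)
    if m is None:
        raise ValueError(f"unexpected name: {filename}")
    date = datetime.strptime(m["yyyymm"], "%Y%m")
    return date.strftime("%B %Y"), m["rtk"] is not None

files = ["202206-DOM-RTK.tif", "201305-DOM.tif"]
for name in sorted(files):  # zero-padded YYYYMM prefixes sort chronologically as strings
    print(name, describe(name))
```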
As we soar into the world of drones, we extend a warm welcome from our team in beautiful Northeastern Pennsylvania. Join us as we uncover the Northeastern Pennsylvania Alliance's innovative use of drone technology. We'll explore the 'why' and 'how' behind their UAS Program, revealing its transformative potential and outlining pathways for you to take part in this rapidly evolving field that is shaping the future. So buckle up and get ready for takeoff!
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Two detailed geomorphological maps (1:2000) depicting landscape changes as a result of a glacial lake outburst flood were produced for the 2.1-km-long section of the Zackenberg river, NE Greenland. The maps document the riverscape before the flood (5 August 2017) and immediately after the flood (8 August 2017), illustrating changes to the riverbanks and morphology of the channel. A series of additional maps (1:800) represent case studies of different types of riverbank responses, emphasising the importance of lateral thermo-erosion and bank collapsing as significant immediate effects of the flood. The average channel width increased from 40.75 m pre-flood to 44.59 m post-flood, whereas the length of active riverbanks decreased from 1729 to 1657 m. The new deposits related to the 2017 flood covered 93,702 m2. The developed maps demonstrated the applicability of small Unmanned Aerial Vehicles (UAVs) for investigating the direct effects of floods, even in the harsh Arctic environment.
Drone products captured by CDFW staff for Hope Valley Wildlife Area.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
We collected aerial imagery to ground-truth and place our physical characteristics survey data in a wider, reach-scale context. Aerial imagery was collected for each sample reach using a commercially available unmanned aerial vehicle (UAV). The Utah State University Ecogeomorphology and Topographic Analysis Laboratory (USU ETAL) used the UAV to photograph three hectares of river at each of the five study reaches for a total of fifteen hectares. The USU ETAL lab photographed river reaches in early August 2020 to capture conditions at approximately peak macrophyte coverage. The USU ETAL used the program DroneDeploy to process raw images into orthomosaics and digital elevation models (DEMs).
Please see the metadata file (*.txt, *.xml, or *.aux.xml) for more specific information.
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
Flight paths of drone surveys used to capture imagery and video for the June 5, 2023, Peerless, SK downburst. The ground survey was conducted June 6 and 7, 2023. A DJI Mavic 2 Pro performed 3 flights. Please note that drones are also used to scout the initial area of interest using a live view on the controller, meaning that some flight paths may not be associated with any imagery.
NPM Bangladesh has produced a number of tools based on its regular data collection activities and drone flights. The package of June 2018 is based on NPM Baseline Assessment 11 (as of 14 June) and NPM's most up-to-date drone imagery (as of 21 June).
Below is the complete package by camp:
The full image and shapefiles are available at this link.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains a time series of very high-resolution data derived from UAV-generated images. The data cover the distal part of the Zackenberg River, northeast Greenland. Images were captured using an unmanned aerial vehicle (UAV) immediately before (5 August 2017), during (6 August 2017), and after (8 August 2017) the glacier lake outburst flood.
UAV-generated images were processed using the structure-from-motion approach in Agisoft Metashape software. Resultant products include digital elevation models, orthomosaics, and hillshade models. The second group of files (GIS_data.zip) contains vector files resulting from geomorphological mapping.
Potential applications of the presented dataset include:
assessment and quantification of landscape changes as an immediate result of glacier lake outburst flood;
long-term monitoring of high-Arctic river valley development (in conjunction with other datasets);
establishing a baseline for quantification of geomorphological impacts of future glacier lake outburst floods;
assessment of geohazards related to bank erosion and debris flow development (hazards for research station infrastructure – station buildings and bridge);
monitoring of permafrost degradation; and
modelling flood impacts on river ecosystem, transport capacity, and channel stability.
This dataset contains photogrammetric products derived from the UAV-generated images (orthomosaics, digital elevation models, hillshade models) and GIS products. Unprocessed images, which were the basis for structure-from-motion processing, can be accessed at: https://doi.org/10.5281/zenodo.4495282 (Tomczyk and Ewertowski, 2021)
Tomczyk A.M., Ewertowski M.W., 2021: Before-, during-, and after-flood UAV-generated images of the distal part of Zackenberg river, northeast Greenland (August 2017), https://doi.org/10.5281/zenodo.4495282
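The hillshade models in this dataset are derived from the DEMs. As a rough illustration of how such a product is computed, here is a simplified pure-Python sketch of the standard Horn hillshade algorithm (the default sun azimuth 315° and altitude 45° are the usual GIS conventions; this is not the exact Metashape or GIS implementation used for the dataset):

```python
import math

def hillshade(dem, cellsize, azimuth_deg=315.0, altitude_deg=45.0):
    """Hillshade (0-255) for interior cells of a DEM given as a 2-D list of elevations."""
    az = math.radians(360.0 - azimuth_deg + 90.0)  # convert compass azimuth to math angle
    alt = math.radians(altitude_deg)
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            # Horn's method: slope from 3x3-neighbourhood finite differences
            dzdx = ((dem[r-1][c+1] + 2*dem[r][c+1] + dem[r+1][c+1])
                    - (dem[r-1][c-1] + 2*dem[r][c-1] + dem[r+1][c-1])) / (8 * cellsize)
            dzdy = ((dem[r+1][c-1] + 2*dem[r+1][c] + dem[r+1][c+1])
                    - (dem[r-1][c-1] + 2*dem[r-1][c] + dem[r-1][c+1])) / (8 * cellsize)
            slope = math.atan(math.hypot(dzdx, dzdy))
            aspect = math.atan2(dzdy, -dzdx)
            shade = (math.sin(alt) * math.cos(slope)
                     + math.cos(alt) * math.sin(slope) * math.cos(az - aspect))
            out[r][c] = max(0.0, shade) * 255
    return out

flat = [[10.0] * 3 for _ in range(3)]
print(hillshade(flat, 1.0)[1][1])  # flat terrain: slope 0, shade = sin(45 deg) * 255, about 180.3
```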
Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0): https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
This dataset includes shapefiles interpreted from photomosaics over the Milos shallow-water hydrothermal system, and instrument locations, from fieldwork in July and September 2019. The shapefiles are organized by study area (Agia Kyria, Paleochori, and Spathi bays). Details are provided in the associated paper (Puzenat et al., 2021), in addition to the information in the associated readme file.
a) Shapefiles of seafloor textures interpreted from drone imagery.
b) Shapefiles of seafloor textures interpreted from AUV imagery.
c) Shapefiles of instruments deployed in the study area in September 2019.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The research leading to these results has received funding from the European Union's Horizon 2020 project INTERACT, under grant agreement No. 730938, project number: 119 [ArcticFan].
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Here are a few use cases for this project:
Power Line Maintenance: Utilities companies can use the "Image Tagging" model to identify various elements of power lines such as spacers, insulators, towers, and dampers to conduct routine inspections and monitor for maintenance.
Drone Inspection System: The model can be integrated into drone inspection systems for power transmission infrastructure. The drone can capture images and the model can tag the various components, making it easier to identify possible points of damage or wear.
GIS Mapping for Energy Sector: The model can be used to tag images in GIS (Geographic Information System) applications for mapping of the power transmission infrastructure. This would provide a geographically accurate visual representation of the components of the power system.
Training Machine Learning Models: Images tagged by the model can be used to train other machine learning models that could automate the detection and prediction of potential faults or failures in power transmission infrastructure.
Infrastructure Planning and Development: Engineers can use the tagged images to better understand the placement and density of power line components (like spacers, insulators, towers, and dampers), helping in the efficient planning and development of new infrastructure.
The Australian Antarctic Data Centre's Mawson Station GIS data were originally mapped from March 1996 aerial photography. Refer to the metadata record 'Mawson Station GIS Dataset'. Since then various features have been added to this data as structures have been removed, moved or established. Some of these features have been surveyed. These surveys have metadata records from which the report describing the survey can be downloaded. However, other features have been 'eyed in' as more accurate data were not available. The eyeing in has been done based on advice from Australian Antarctic Division staff and using as a guide sources such as an aerial photograph, an Engineering plan, a map or a sketch. GPS data or measurements using a measuring tape may also have been used.
The data are included in the data available for download from a Related URL below. The data conform to the SCAR Feature Catalogue which includes data quality information. See a Related URL below. Data described by this metadata record has Dataset_id = 119. Each feature has a Qinfo number which, when entered at the 'Search datasets and quality' tab, provides data quality information for the feature.
This is a GIS dataset of the vegetation of the Windmill Islands. Interpretation was done by Rod Seppelt (Australian Antarctic Division) based on his field work, Zeiss aerial photography flown in January 1994, and a paper: Melick, D.R., Hovenden, M.J., Seppelt, R.D. (1994). Phytogeography of bryophyte and lichen vegetation in the Windmill Islands, Wilkes Land, Continental Antarctica. Vegetatio 111: 71-87. The data have been formatted according to the SCAR Feature Catalogue (see link below).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Recent advances in unmanned aerial vehicle (UAV) remote sensing and image analysis provide large amounts of plant canopy data, but there is no method to integrate the large imagery datasets with the much smaller manually collected datasets. A simple geographic information system (GIS)-based analysis for a UAV-supported field study (GAUSS) analytical framework was developed to integrate these datasets. It has three steps: developing a model for predicting sample values from UAV imagery, field gridding and trait value prediction, and statistical testing of predicted values. A field cultivation experiment was conducted to examine the effectiveness of the GAUSS framework, using a soybean–wheat crop rotation as the model system. Fourteen soybean cultivars and subsequently a single wheat cultivar were grown in the same field. The crop rotation benefits of the soybeans for wheat yield were examined using GAUSS. Combining manually sampled data (n = 143) and pixel-based UAV imagery indices produced a large amount of high-spatial-resolution predicted wheat yields (n = 8,756). Significant differences were detected among soybean cultivars in their effects on wheat yield, and soybean plant traits were associated with the increases. This is the first reported study that links traits of legume plants with rotational benefits to the subsequent crop. Although some limitations and challenges remain, the GAUSS approach can be applied to many types of field-based plant experimentation, and has potential for extensive use in future studies.
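In outline, the three GAUSS steps can be sketched as follows. All numbers, the vegetation-index values, and the simple linear calibration model below are hypothetical placeholders; the study's actual models and statistical tests are described in the paper:

```python
import statistics

# Step 1: calibrate index -> yield on a few hand-sampled plots (hypothetical values)
sample_index = [0.30, 0.45, 0.55, 0.70]   # pixel-based vegetation index at sampled plots
sample_yield = [2.1, 3.0, 3.6, 4.5]       # measured wheat yield (t/ha) at the same plots
mx, my = statistics.mean(sample_index), statistics.mean(sample_yield)
slope = (sum((x - mx) * (y - my) for x, y in zip(sample_index, sample_yield))
         / sum((x - mx) ** 2 for x in sample_index))
intercept = my - slope * mx

# Step 2: predict yield for every grid cell of the field from imagery alone
grid_a = [0.62, 0.66, 0.64, 0.68]         # cells following hypothetical soybean cultivar A
grid_b = [0.48, 0.50, 0.46, 0.52]         # cells following hypothetical soybean cultivar B
pred_a = [intercept + slope * x for x in grid_a]
pred_b = [intercept + slope * x for x in grid_b]

# Step 3: statistically compare predicted yields between the preceding cultivars
diff = statistics.mean(pred_a) - statistics.mean(pred_b)
print(f"predicted yield advantage of cultivar A: {diff:.2f} t/ha")
```

A real application would replace the least-squares line with the study's fitted model and the mean comparison with a proper significance test over the thousands of grid-cell predictions.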
NPM Bangladesh has produced a number of tools based on its regular data collection activities and drone flights.
SW Map package: for mobile use, this enables users to visualize the site maps and boundaries on their own mobile. Together with the relevant files, users can also find a manual showing step by step how to copy files from their own computer to SW Map running on another portable device.
KMZ file: for desktop use, this enables users to visualize the site maps and boundaries on Google Earth. By adding or removing layers, it is possible to visualize each location assessed by NPM Baseline 10. These files are available on HDX.
Historical UAV imagery of Rohingya settlements in Cox's Bazar in GIS, KML (Google Earth), and MBTiles (SW Maps) formats. Updates of imagery will be added at the top of the list.
NPM has also produced individual packages by camps:
All majhee block shapefiles are also available at the following link:
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
The UAS Facility Maps are designed to identify permissible altitudes (above ground level) at which UAS, operating under the Small UAS Rule (14 CFR 107), can be authorized to fly within the surface areas of controlled airspace. These altitude parameters, provided by the respective air traffic control facilities, are criteria used to evaluate airspace authorization requests (14 CFR 107.41), submitted via FAA.GOV/UAS. Airspace authorization requests for altitudes in excess of the predetermined map parameters will require a lengthy coordination process. This dataset will be continually updated and expanded to include UAS Facility Maps for all controlled airspace by Fall 2017. This map is not updated in real time. Neither the map nor the information provided herein is guaranteed to be current or accurate. Reliance on this map constitutes neither FAA authorization to operate nor evidence of compliance with applicable aviation regulations in or during enforcement proceedings before the National Transportation Safety Board or any other forum. Disclaimer of Liability. The United States government will not be liable to you in respect of any claim, demand, or action—irrespective of the nature or cause of the claim, demand, or action—alleging any loss, injury, or damages, direct or indirect, that may result from the use or possession of any of the information in this draft map or any loss of profit, revenue, contracts, or savings or any other direct, indirect, incidental, special, or consequential damages arising out of any use of or reliance upon any of the information in this draft map, whether in an action in contract or tort or based on a warranty, even if the FAA has been advised of the possibility of such damages. The FAA’s total aggregate liability with respect to its obligations under this agreement or otherwise with respect to the use of this draft map or any information herein will not exceed $0. 
Some States, Territories, and Countries do not allow certain liability exclusions or damages limitations; to the extent of such disallowance and only to that extent, the paragraph above may not apply to you. In the event that you reside in a State, Territory, or Country that does not allow certain liability exclusions or damages limitations, you assume all risks attendant to the use of any of the information in this draft map in consideration for the provision of such information. Export Control. You agree not to export from anywhere any of the information in this draft map except in compliance with, and with all licenses and approvals required under, applicable export laws, rules, and regulations. Indemnity. You agree to indemnify, defend, and hold free and harmless the United States government from and against any liability, loss, injury (including injuries resulting in death), demand, action, cost, expense, or claim of any kind or character, including but not limited to attorney’s fees, arising out of or in connection with any use or possession by you of this draft map or the information herein. Governing Law. The above terms and conditions will be governed by the laws of each and every state within the United States, without giving effect to that state’s conflict-of-laws provisions. You agree to submit to the jurisdiction of the state or territory in which the relevant use of any of the information in this draft map occurred for any and all disputes, claims, and actions arising from or in connection with this draft map or the information herein.
The California Department of Fish and Wildlife (CDFW) Vegetation Classification and Mapping Program (VegCAMP) created a fine-scale vegetation map of Slinkard Valley and Little Antelope Valley Wildlife Areas in Mono County, California. The vegetation classification was derived from data collected in the field during the periods August 28-31, 2017, September 10-14, 2018, and November 5-9, 2018. Vegetation polygons were drawn using heads-up "manual" digitizing with the 2016 National Agricultural Imagery Program (NAIP) true color and color infrared (CIR) 1-meter resolution data as the base imagery. Supplemental imagery included NAIP true color and CIR 1-meter resolution data from 2009-2012, Bing imagery, and current and historical imagery from Google Earth. The minimum mapping unit (MMU) is 1 acre, with the exception of wetland and riparian types, which have an MMU of ½ acre. Mapping is to the National Vegetation Classification System (NVCS) hierarchy association, alliance, or group level based on the ability of the photointerpreters to distinguish types based on all imagery available and on the field data.
Field accuracy assessment surveys were collected by CDFW regional and VegCAMP staff in the fall of 2019. It was determined that the map had an overall accuracy of 89.3% before suggested adjustments were made to typing and line-work in response to the accuracy assessment. As part of the mapping process for this project we also implemented a drone component. The purpose was to test the use of drone photos to enhance and extend reconnaissance efforts for mapping, help with determining signatures on coarser imagery, use images taken above surveys as a check on cover estimates, and test whether drone imagery would allow for mapping herbaceous vegetation at a finer scale.
Citations:
Boul, R., Keeler-Wolf, T., J. Ratchford, T. Haynes, D. Hickson, J. Evens and R. Yacoub. Classification of the Vegetation of Modoc and Lassen Counties, California. California Department of Fish and Wildlife; 2/2021.
Vegetation Classification and Mapping Program, CA Dept. of Fish and Wildlife. Vegetation Map and Classification of Slinkard Valley and Little Antelope Valley Wildlife Areas, Mono County, California. California Department of Fish and Wildlife, Vegetation Classification and Mapping Program; 8/2021.
Attribution-NonCommercial-NoDerivs 4.0 (CC BY-NC-ND 4.0): https://creativecommons.org/licenses/by-nc-nd/4.0/
License information was derived automatically
This dataset contains a set of UAV orthophotos acquired from September 2020 to April 2023. For technical reasons the dataset was divided into 4 subsections, to be viewed together in GIS. The pictures were acquired to map soil marks. Every picture of the dataset contains its metadata. The photos were acquired with a Mavic Air 2 and processed with the software Agisoft Metashape to generate the orthophotos. The georeferencing was done in ArcGIS Pro using ground control points.