The National Flood Hazard Layer (NFHL) is a geospatial database that contains current effective flood hazard data. FEMA provides the flood hazard data to support the National Flood Insurance Program. You can use the information to better understand your level of flood risk and the type of flooding.

The NFHL is made from effective flood maps and Letters of Map Change (LOMC) delivered to communities. NFHL digital data covers over 90 percent of the U.S. population. New and revised data is added continuously. If you need information for areas not covered by the NFHL data, other FEMA products may provide coverage for those areas.

In the NFHL Viewer, you can use the address search or map navigation to locate an area of interest, and the NFHL Print Tool to download and print a full Flood Insurance Rate Map (FIRM) or FIRMette (a smaller, printable version of a FIRM) where modernized data exists. Technical GIS users can also utilize a series of dedicated GIS web services that allow the NFHL database to be incorporated into websites and GIS applications. For more information on available services, see the NFHL GIS Services User Guide.

You can also use the address search on the FEMA Flood Map Service Center (MSC) to view the NFHL data or download a FIRMette. Using "Search All Products" on the MSC, you can download the NFHL data for a county or state in a GIS file format. This data can be used in most GIS applications to perform spatial analyses and for integration into custom maps and reports. To do so, you will need GIS or mapping software that can read data in shapefile format.

FEMA also offers a download of a KMZ file (a zipped Keyhole Markup Language file), which overlays the data in Google Earth™. For more information on using the data in Google Earth™, please see Using the National Flood Hazard Layer Web Map Service (WMS) in Google Earth™.
The Global Flood Database contains maps of the extent and temporal distribution of 913 flood events occurring between 2000 and 2018. For more information, see the associated journal article. Flood events were collected from the Dartmouth Flood Observatory and used to collect MODIS imagery. The selected 913 events are those that were successfully mapped (passed quality control as having significant inundation beyond permanent water) using 12,719 scenes from the Terra and Aqua MODIS sensors. Each pixel was classified as water or non-water at 250-meter resolution during the full date range of each flood event, and subsequent data products were generated, including maximum flood extent (the "flooded" band) and the duration of inundation in days (the "duration" band). Water and non-water classifications during a flood event include permanent water (here resampling the 30-meter JRC Global Surface Water dataset representing permanent water to 250-meter resolution), which can be masked out to isolate flood water using the "jrc_perm_water" band. Extra data quality bands were added representing cloud conditions during the flood event (e.g., "clear_views", the number of clear days on which the flood was observed between its start and end dates, and "clear_perc", the percentage of clear-day observations over the total event duration in days). Each image in the ImageCollection represents the map of an individual flood. The collection can be filtered by date, country, or original Dartmouth Flood Observatory ID.
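The relationship between the per-day classifications and the derived "flooded" and "duration" bands can be illustrated with a toy numpy stack (the daily values below are made up, not real MODIS data):

```python
import numpy as np

# Hypothetical per-pixel daily water classifications for one flood event,
# shaped (days, rows, cols); 1 = water, 0 = non-water, as in the 250 m
# MODIS classification described above.
daily = np.array([
    [[0, 0], [1, 0]],   # day 1
    [[1, 0], [1, 0]],   # day 2
    [[1, 1], [0, 0]],   # day 3
], dtype=np.uint8)

# "flooded" band: maximum observed extent over the event (1 if ever water).
flooded = daily.max(axis=0)

# "duration" band: number of days each pixel was classified as water.
duration = daily.sum(axis=0)

# Isolate flood water by masking permanent water (cf. the jrc_perm_water band).
jrc_perm_water = np.array([[0, 0], [1, 0]], dtype=np.uint8)
flood_only = flooded & (1 - jrc_perm_water)

print(flooded)     # [[1 1] [1 0]]
print(duration)    # [[2 1] [2 0]]
print(flood_only)  # [[1 1] [0 0]]
```

The real bands are computed per event in Earth Engine; this sketch only shows the band semantics.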
The global river flood hazard maps are a gridded dataset representing inundation along the river network for seven flood return periods (from 1-in-10 years to 1-in-500 years). The input river flow data for the new maps are produced by means of the open-source hydrological model LISFLOOD, while inundation simulations are performed with the hydrodynamic model LISFLOOD-FP. The extent comprises the entire world with the exception of Greenland and Antarctica and small islands with river basins smaller than 500 km^2. Cell values indicate water depth (in meters). The maps can be used to assess the exposure of population and economic assets to river floods, and to perform flood risk assessments. The dataset is created as part of the Copernicus Emergency Management Service. Note: this dataset may have missing tiles. This collection will eventually be replaced by v2.1 once it is updated by the provider.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Fast flood extent monitoring with SAR change detection using Google Earth Engine

This dataset develops a tool for near real-time flood monitoring through a novel combination of multi-temporal and multi-source remote sensing data. We use a SAR change detection and thresholding method, and apply sensitivity analysis and threshold calibration, using SAR-based and optical-based indices in a format that is streamlined, reproducible, and geographically agile. We leverage the massive repository of satellite imagery and the planetary-scale geospatial analysis tools of GEE to devise a flood inundation extent model that is both scalable and replicable. The flood extents from the 2021 Hurricane Ida and the 2017 Hurricane Harvey were selected to test the approach. The methodology provides a fast, automatable, and geographically reliable tool for assisting decision-makers and emergency planners using near real-time multi-temporal satellite SAR data sets.

GEE code was developed by Ebrahim Hamidi and reviewed by Brad G. Peter; figures were created by Brad G. Peter. This tool accompanies a publication: E. Hamidi, B. G. Peter, D. F. Muñoz, H. Moftakhari and H. Moradkhani, "Fast Flood Extent Monitoring with SAR Change Detection Using Google Earth Engine," IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2023.3240097.

GEE code (multi-source and multi-temporal flood monitoring): https://code.earthengine.google.com/7f4942ab0c73503e88287ad7e9187150

The threshold sensitivity analysis is automated in the following GEE code: https://code.earthengine.google.com/a3fbfe338c69232a75cbcd0eb6bc0c8e

The scripts can be run independently. The threshold automation code identifies the optimal threshold values for use in the flood monitoring procedure.
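The SAR change index at the heart of this method, NDFI, is defined in the accompanying GEE script as (|mean_before| - |min_after|) / (|mean_before| + |min_after|) on Sentinel-1 VH backscatter. A minimal numpy sketch of that formula (the dB values below are illustrative, not real Sentinel-1 statistics):

```python
import numpy as np

# NDFI = (|mean_before| - |min_after|) / (|mean_before| + |min_after|),
# computed per pixel on backscatter in dB. Illustrative values only.
mean_before = np.array([[-15.0, -14.0],
                        [-16.0, -15.0]])   # pre-flood temporal mean
min_after = np.array([[-25.0, -14.0],
                      [-24.0, -15.5]])     # during-flood temporal minimum

ndfi = (np.abs(mean_before) - np.abs(min_after)) / \
       (np.abs(mean_before) + np.abs(min_after))
print(ndfi)  # strongly negative where backscatter dropped (open water)
```

The full workflow computes this on image collections inside Earth Engine, then filters, normalizes, and thresholds the result.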
GEE code for Hurricane Harvey, east of Houston (JavaScript):

// Study Area Boundaries
var bounds = /* color: #d63000 */ee.Geometry.Polygon(
    [[[-94.5214452285728, 30.165244882083663],
      [-94.5214452285728, 29.56024879238989],
      [-93.36650748443218, 29.56024879238989],
      [-93.36650748443218, 30.165244882083663]]], null, false);

// [before_start, before_end, after_start, after_end, k_ndfi, k_ri, k_diff, mndwi_threshold]
var params = ['2017-06-01','2017-06-15','2017-08-01','2017-09-10',1.0,0.25,0.8,0.4]

// SAR Input Data
var before_start = params[0]
var before_end = params[1]
var after_start = params[2]
var after_end = params[3]
var polarization = "VH"
var pass_direction = "ASCENDING"

// k Coefficient Values for NDFI, RI and DII SAR Indices (Flooded Pixel Thresholding; Equation 4)
var k_ndfi = params[4]
var k_ri = params[5]
var k_diff = params[6]

// MNDWI Flooded Pixel Threshold Criteria
var mndwi_threshold = params[7]

// Datasets -----------------------------------
var dem = ee.Image("USGS/3DEP/10m").select('elevation')
var slope = ee.Terrain.slope(dem)
var swater = ee.Image('JRC/GSW1_0/GlobalSurfaceWater').select('seasonality')
var collection = ee.ImageCollection('COPERNICUS/S1_GRD')
  .filter(ee.Filter.eq('instrumentMode', 'IW'))
  .filter(ee.Filter.listContains('transmitterReceiverPolarisation', polarization))
  .filter(ee.Filter.eq('orbitProperties_pass', pass_direction))
  .filter(ee.Filter.eq('resolution_meters', 10))
  .filterBounds(bounds)
  .select(polarization)

var before = collection.filterDate(before_start, before_end)
var after = collection.filterDate(after_start, after_end)
print("before", before)
print("after", after)

// Generating Reference and Flood Multi-temporal SAR Data ------------------------
// Mean Before and Min After ------------------------
var mean_before = before.mean().clip(bounds)
var min_after = after.min().clip(bounds)
var max_after = after.max().clip(bounds)
var mean_after = after.mean().clip(bounds)

Map.addLayer(mean_before, {min: -29.264204107025904, max: -8.938093778644141, palette: []}, "mean_before", 0)
Map.addLayer(min_after, {min: -29.29334290990966, max: -11.928313976797138, palette: []}, "min_after", 1)

// Flood Identification ------------------------
// NDFI ------------------------
var ndfi = mean_before.abs().subtract(min_after.abs())
  .divide(mean_before.abs().add(min_after.abs()))
var ndfi_filtered = ndfi.focal_mean({radius: 50, kernelType: 'circle', units: 'meters'})

// NDFI Normalization -----------------------
var ndfi_min = ndfi_filtered.reduceRegion({
  reducer: ee.Reducer.min(),
  geometry: bounds,
  scale: 10,
  maxPixels: 1e13
})
var ndfi_max = ndfi_filtered.reduceRegion({
  reducer: ee.Reducer.max(),
  geometry: bounds,
  scale: 10,
  maxPixels: 1e13
})
var ndfi_rang = ee.Number(ndfi_max.get('VH')).subtract(ee.Number(ndfi_min.get('VH')))
var ndfi_subtctMin = ndfi_filtered.subtract(ee.Number(ndfi_min.get('VH')))
var ndfi_norm = ndfi_subtctMin.divide(ndfi_rang)
Map.addLayer(ndfi_norm, {min: 0.3862747346632676, max: 0.7632898395906615}, "ndfi_norm", 0)

var histogram = ui.Chart.image.histogram({
  image: ndfi_norm,
  region: bounds,
  scale: 10,
  maxPixels: 1e13
})...
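The min-max normalization applied to the filtered NDFI image before thresholding can be mirrored in numpy to see what it does numerically (values and the 0.5 cutoff below are placeholders; the real script derives image-wide min/max via reduceRegion and calibrates its thresholds):

```python
import numpy as np

# Illustrative filtered-NDFI values for a tiny region.
ndfi_filtered = np.array([[-0.25, 0.0],
                          [-0.20, 0.10]])

# Min-max normalization to [0, 1], as in the GEE script.
ndfi_min = ndfi_filtered.min()
ndfi_max = ndfi_filtered.max()
ndfi_norm = (ndfi_filtered - ndfi_min) / (ndfi_max - ndfi_min)

# A calibrated threshold (cf. the k coefficients) then yields a binary
# flood mask; 0.5 here is an arbitrary placeholder value.
flood_mask = ndfi_norm > 0.5
print(ndfi_norm)
print(flood_mask)
```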
The Governor's Office of Information Technology (OIT) is managing the Colorado Google Flood Crisis Map. In partnership with the Department of Public Safety, OIT is overseeing the Statewide Digital Trunked Radio System (DTRS), which bridges state, county, local and tribal communications. Since the flooding emergency began, the DTRS system has processed more than 4.7 million radio calls and dispatched more than 150 mobile radio units to the Colorado National Guard and various search and rescue teams. Additionally, the DTRS team has deployed technicians to conduct repairs and damage assessments at the state's 200+ DTRS towers, some of which are located in the flood zones. OIT's Geographic Information Systems team is assisting in coordinating and aggregating data with the Federal Emergency Management Agency (FEMA) and other agencies. For more information, visit www.colorado.gov/oit.
This data release comprises the raster data files and code necessary to perform all analyses presented in the associated publication. The 16 TIF raster data files are classified surface water maps created using the Dynamic Surface Water Extent (DSWE) model implemented in Google Earth Engine using published technical documents. The 16 tiles cover the country of Cambodia, a flood-prone country in Southeast Asia lacking a comprehensive stream gauging network. Each file includes 372 bands. Bands represent surface water for each month from 1988 to 2018, and are stacked from oldest (Band 1 - January 1988) to newest (Band 372 - December 2018). DSWE classifies pixels unobscured by cloud, cloud shadow, or snow into five categories of ground surface inundation; in addition to not-water (class 0) and water (class 1), the DSWE algorithm distinguishes pixels that are less distinctly inundated (class 2: "moderate confidence"), comprise a mixture of vegetation and water (class 3: "potential wetland"), or are of marginal validity (class 4: "water or wetland - low confidence"). Class 9 is applied to classify clouds, shadows and hill shade. Two additional documents accompany the raster image files and XML metadata. The first provides a key representing the general location of each raster file. The second file includes all Google Earth Engine JavaScript code, which can be used online (https://code.earthengine.google.com/) to replicate the monthly DSWE map time series for Cambodia, or for any other location on Earth. The code block includes comments to explain how each step works. These data support the following publication: Soulard, C.E., Walker, J.J., and Petrakis, R.E., 2020, Implementation of a Surface Water Extent Model in Cambodia using Cloud-Based Remote Sensing: Remote Sensing, v. 12, no. 6, p. 984, https://doi.org/10.3390/rs12060984.
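The fixed band layout lends itself to simple index arithmetic. A small helper, assuming the stated ordering (Band 1 = January 1988 through Band 372 = December 2018):

```python
def dswe_band(year, month):
    """1-indexed band number for a month between Jan 1988 and Dec 2018,
    given the oldest-to-newest monthly stacking described above."""
    if not (1988 <= year <= 2018 and 1 <= month <= 12):
        raise ValueError("outside the 1988-2018 stack")
    return (year - 1988) * 12 + month

print(dswe_band(1988, 1))   # 1
print(dswe_band(2018, 12))  # 372
```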
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains three raster files with a spatial resolution of 10 m, derived using Google Earth Engine:
1) DynamicWorld_Floods_2015_2023.tif: Number of days flooded for the River Basin District of Thrace (Greece), from 2015 through 2023
2) Thessaly_2015_August2023.tiff: Number of days flooded for the River Basin District of Thessaly (Greece), from 2015 through August 2023
3) Thessaly_2015_now.tiff: Number of days flooded for the River Basin District of Thessaly (Greece), from 2015 through January 2024
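A "number of days flooded" raster like these can be derived by counting water classifications across per-date images. A numpy sketch, assuming Dynamic World 'label' classes where class 0 is water (the values below are made up; permanent water would be excluded separately):

```python
import numpy as np

# Hypothetical Dynamic World 'label' classes for three observation dates,
# shaped (dates, rows, cols); class 0 = water.
labels = np.array([
    [[0, 1], [5, 0]],   # date 1
    [[0, 1], [0, 0]],   # date 2
    [[2, 0], [0, 4]],   # date 3
])

# Count, per pixel, how many dates were classified as water.
days_flooded = (labels == 0).sum(axis=0)
print(days_flooded)  # [[2 1] [2 2]]
```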
U.S. Government Works: https://www.usa.gov/government-works
License information was derived automatically
As predictive accuracy of the climate response to greenhouse emissions improves, measurements of sea level rise are being coupled with modeling to better understand coastal vulnerability to flooding. Predictions of rising intensity of storm rainfall and larger tropical storms also imply increased inland flooding, and many studies conclude this is already occurring in some regions.
Most rivers experience some flooding each year: the seasonal discharge variation from low to high water can span 2-3 orders of magnitude. The mean annual flood is an important threshold: its level separates land flooded each year from land affected only by large floods. We lack adequate global geospatial information defining floodplains within the mean annual flood limit and the higher lands still subject to significant risk (e.g., with an exceedance probability greater than 3.3%: the 30-yr floodplain). This lack of knowledge concerning changing surface water affects many disciplines and remote sensing data sets, where, quite commonly, a static water 'mask' is employed to separate water from land. For example, inland bio-geochemical cycling of C and N is affected by flooding, but floodplain areas are not well constrained.
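The exceedance figure quoted above follows from standard return-period arithmetic: a T-year flood has an annual exceedance probability of 1/T, and the chance of at least one exceedance in n years is 1 - (1 - 1/T)^n. A quick check:

```python
def annual_exceedance(T):
    """Annual exceedance probability of a T-year flood."""
    return 1.0 / T

def prob_in_n_years(T, n):
    """Probability of at least one T-year exceedance in n years."""
    return 1.0 - (1.0 - 1.0 / T) ** n

print(round(annual_exceedance(30) * 100, 1))  # 3.3 (percent): the 30-yr floodplain
print(prob_in_n_years(100, 30))               # chance of a 100-yr flood within 30 years
```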
Measurements and computer models of flood inundation over large areas have been difficult to combine because of a scarcity of observations in compatible formats, and a lack of the detailed boundary conditions, in particular floodplain topography, required to run hydrodynamic models. However, the available data now allow such work, and the computational techniques needed to ingest such information are ready for development. Optical and SAR sensing are providing a near-global record of floodplain inundation, and passive microwave radiometry is producing a calibrated record of flood-associated discharge values, 1998-present. Also, global topographic data are of increasingly fine resolution, and techniques have been developed to facilitate their incorporation into modeling. Several of us have already demonstrated the new capability to accurately model and map floodplains on a continent scale using input discharges of various sizes and exceedance probabilities.
Work is needed to accomplish global-scale products, wherein results are extended to all continents, and downscaled to be locally accurate and useful. Floodplain mapping technologies and standards vary greatly among nations (many nations have neither): the planned effort will provide a global flood hazard infrastructure on which detailed local risk assessment can build. Our project brings together an experienced team of modeling, remote sensing, hydrology, and information technology scientists at JPL and the University of Colorado with the Google Earth Engine team to implement and disseminate a Global Floodplains and Flood Risk digital map product. This project addresses major priorities listed in the AIST program: with Google, we would identify, develop, and demonstrate advanced information system technologies that increase the accessibility and utility of NASA science data and enable new information products. The work will address the Core Topic 'Data-Centric Technologies', including 'Technologies that provide opportunities for more efficient interoperations with observations data systems, such as high end computing and modeling systems; and Capabilities that advance integrated Earth science missions by enabling discovery and access to Service Oriented Architecture'. It will also address the Special Subtopic 'Technology Enhancements for Applied Sciences Applications' in regard to natural disasters, and contribute to the GEOSS architecture for the use of remote sensing products in disaster management and risk assessment.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Data specification.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
This starter data kit collects extracts from global, open datasets relating to climate hazards and infrastructure systems.
These extracts are derived from global datasets which have been clipped to the national scale (or subnational, in cases where national boundaries have been split, generally to separate outlying islands or non-contiguous regions) using Natural Earth (2023) boundaries; the data are not meant to express an opinion about borders, territory or sovereignty.
Human-induced climate change is increasing the frequency and severity of climate and weather extremes. This is causing widespread, adverse impacts to societies, economies and infrastructures. Climate risk analysis is essential to inform policy decisions aimed at reducing risk. Yet, access to data is often a barrier, particularly in low and middle-income countries. Data are often scattered, hard to find, in formats that are difficult to use or requiring considerable technical expertise. Nevertheless, there are global, open datasets which provide some information about climate hazards, society, infrastructure and the economy. This "data starter kit" aims to kickstart the process and act as a starting point for further model development and scenario analysis.
Hazards:
Exposure:
Contextual information:
The spatial intersection of hazard and exposure datasets is a first step to analyse vulnerability and risk to infrastructure and people.
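As a toy illustration of that hazard-exposure intersection (hypothetical, co-registered rasters; real analyses would align projections and resolutions first):

```python
import numpy as np

# Overlay a flood-depth grid on a population grid and sum the population
# in cells where depth exceeds a chosen threshold.
depth = np.array([[0.0, 0.5],
                  [1.2, 0.0]])       # flood depth, metres
population = np.array([[100, 200],
                       [50, 400]])   # people per cell

exposed = population[depth > 0.1].sum()
print(exposed)  # 250
```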
To learn more about related concepts, there is a free short course available through the Open University on Infrastructure and Climate Resilience. This overview of the course has more details.
These Python libraries may be a useful place to start analysis of the data in the packages produced by this workflow:
snkit helps clean network data. nismod-snail is designed to help implement infrastructure exposure, damage and risk calculations. The open-gira repository contains a larger workflow for global-scale open-data infrastructure risk and resilience analysis.
For a more developed example, some of these datasets were key inputs to a regional climate risk assessment of current and future flooding risks to transport networks in East Africa, which has a related online visualisation tool at https://east-africa.infrastructureresilience.org/ and is described in detail in Hickford et al (2023).
The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.
[Summary provided by the USGS.]
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
A Google Earth Engine implementation of the Floodwater Depth Estimation Tool (FwDET)

This is a Google Earth Engine implementation of the Floodwater Depth Estimation Tool (FwDET), developed by the Surface Dynamics and Modeling Lab at the University of Alabama, that calculates flood depth using a flood extent layer and a digital elevation model. This research is made possible by the CyberSeed Program at the University of Alabama. Project name: WaterServ: A Cyberinfrastructure for Analysis, Visualization and Sharing of Hydrological Data. GitHub Repository (ArcMap and QGIS implementations): https://github.com/csdms-contrib/fwdet

Cohen, S., A. Raney, D. Munasinghe, J.D. Loftis, A. Molthan, J. Bell, L. Rogers, J. Galantowicz, G.R. Brakenridge, A.J. Kettner, Y. Huang, Y. Tsang (2019). The Floodwater Depth Estimation Tool (FwDET v2.0) for Improved Remote Sensing Analysis of Coastal Flooding. Natural Hazards and Earth System Sciences, 19, 2053–2065. https://doi.org/10.5194/nhess-19-2053-2019

Cohen, S., G.R. Brakenridge, A. Kettner, B. Bates, J. Nelson, R. McDonald, Y. Huang, D. Munasinghe, and J. Zhang (2018), Estimating Floodwater Depths from Flood Inundation Maps and Topography, Journal of the American Water Resources Association, 54 (4), 847–858. https://doi.org/10.1111/1752-1688.12609

Sample products and data availability:
https://sdml.ua.edu/models/fwdet/
https://sdml.ua.edu/michigan-flood-may-2020/
https://cartoscience.users.earthengine.app/view/fwdet-gee-mi
https://alabama.app.box.com/s/31p8pdh6ngwqnbcgzlhyk2gkbsd2elq0

GEE implementation output: fwdet_gee_brazos.tif
ArcMap implementation output (see Cohen et al. 2019): fwdet_v2_brazos.tif
iRIC validation layer (see Nelson et al. 2010): iric_brazos_hydraulic_model_validation.tif

Brazos River inundation polygon access in GEE:
var brazos = ee.FeatureCollection('users/cartoscience/FwDET-GEE-Public/Brazos_River_Inundation_2016')

Nelson, J.M., Shimizu, Y., Takebayashi, H. and McDonald, R.R., 2010.
The international river interface cooperative: public domain software for river modeling. In 2nd Joint Federal Interagency Conference, Las Vegas, June (Vol. 27).

Google Earth Engine Code:

/* ----------------------------------------------------------------------------
# FwDET-GEE calculates floodwater depth from a floodwater extent layer and a DEM
Authors: Brad G. Peter, Sagy Cohen, Ronan Lucey, Dinuke Munasinghe, Austin Raney
Emails: bpeter@ua.edu, sagy.cohen@ua.edu, ronan.m.lucey@nasa.gov, dsmunasinghe@crimson.ua.edu, aaraney@crimson.ua.edu
Organizations: BP, SC, DM, AR - University of Alabama; RL - University of Alabama in Huntsville
Last Modified: 10/08/2020

To cite this code use: Peter, Brad; Cohen, Sagy; Lucey, Ronan; Munasinghe, Dinuke; Raney, Austin, 2020, "A Google Earth Engine implementation of the Floodwater Depth Estimation Tool (FwDET-GEE)", https://doi.org/10.7910/DVN/JQ4BCN, Harvard Dataverse, V2
--------------------------------------------------------------------------------
This is a Google Earth Engine implementation of the Floodwater Depth Estimation Tool (FwDETv2.0) [1], developed by the Surface Dynamics and Modeling Lab at the University of Alabama, that calculates flood depth using a flood extent layer and a digital elevation model. This research is made possible by the CyberSeed Program at the University of Alabama. Project name: WaterServ: A Cyberinfrastructure for Analysis, Visualization and Sharing of Hydrological Data. GitHub Repository (ArcMap and QGIS implementations): https://github.com/csdms-contrib/fwdet
--------------------------------------------------------------------------------
How to run this code with your flood extent GEE asset: Users of this script will need to update the path to the flood extent (line 32 or 33) and select from the processing options. Available DEM options (1) are USGS/NED (U.S.) and USGS/SRTMGL1_003 (global). Other options include (2) running the elevation outlier filtering algorithm, (3) adding water body data to the inundation extent, (4) adding a water body data layer uploaded by the user rather than using the JRC global surface water data, (5) masking out regular water body data, (6) masking out 0 m depths, (7) choosing whether or not to export, (8) exporting additional data layers, and (9) setting an export file name. The simpleVis option (10) bypasses the time-consuming processes and is meant for visualization only; set this option to false to complete the entire process and enable exporting.
--------------------------------------------------------------------------------
USER OPTIONS

Load flood extent layer | The flood extent layer must first be uploaded to GEE as an asset. If the flood extent is a shapefile, upload it as a FeatureCollection; otherwise, if the flood extent layer is a raster, upload it as an Image. A raster layer may be required if the flood extent is a highly complex geometry.
-------------------------------------- */
var flood = ee.FeatureCollection('users/username/folder/flood_extent') // comment out this line if using an Image
// var flood = ee.Image('users/username/folder/flood_extent') // comment out this line if using a FeatureCollection
var waterExtent = ee.FeatureCollection('users/username/folder/water_extent') // OPTIONAL: comment out this line if using an Image
// var waterExtent = ee.Image('users/username/folder/water_extent') // OPTIONAL: comment out this line if using a FeatureCollection

// Processing options - refer to the directions above
/*1*/ var demSource = 'USGS/NED' // 'USGS/NED' or 'USGS/SRTMGL1_003'
/*2*/ var outlierTest = 'TRUE' // 'TRUE' (default) or 'FALSE'
/*3*/ var addWater = 'TRUE' // 'TRUE' (default) or 'FALSE'
/*4*/ var userWater = 'FALSE' // 'TRUE' or 'FALSE' (default)
/*5*/ var maskWater = 'FALSE' // 'TRUE' or 'FALSE' (default)
/*6*/ var maskZero = 'FALSE' // 'TRUE' or 'FALSE' (default)
/*7*/ var exportLayer = 'TRUE' // 'TRUE' (default) or 'FALSE'
/*8*/ var exportAll = 'FALSE' // 'TRUE' or 'FALSE' (default)
/*9*/ var outputName = 'FwDET_GEE' // text string for naming export file
/*10*/ var simpleVis = 'FALSE' // 'TRUE' or 'FALSE' (default)

// ••••••••••••••••• NO USER INPUT BEYOND THIS POINT •••••••••••••••••

// Create buffer around flood area to use for clipping other layers
var area = flood.geometry().bounds().buffer(1000).bounds()

// Load DEM and grab projection info
var dem = ee.Image(demSource).select('elevation').clip(area) // [2,3]
var projection = dem.projection()
var resolution = projection.nominalScale().getInfo()

// Load global surface water layer
var jrc = ee.Image('JRC/GSW1_1/GlobalSurfaceWater').select('occurrence').clip(area) // [4]
var water_image = jrc

// User uploaded flood extent layer
// Identify if a raster or vector layer is being used and proceed with appropriate process
if ( flood.name() == 'FeatureCollection' ) {
  var addProperty = function(feature) { return feature.set('val', 0); };
  var flood_image = flood.map(addProperty).reduceToImage(['val'], ee.Reducer.first()).rename('flood')
} else {
  var flood_image = flood.multiply(0)
}

// Optional user uploaded water extent layer
if ( userWater == 'TRUE' ) {
  // Identify if a raster or vector layer is being used and proceed with appropriate process
  if ( waterExtent.name() == 'FeatureCollection' ) {
    var addProperty = function(feature) { return feature.set('val', 0); };
    var water_image = waterExtent.map(addProperty).reduceToImage(['val'], ee.Reducer.first()).rename('flood')
  } else {
    var water_image = waterExtent.multiply(0)
  }
}

// Add water bodies to flood extent if 'TRUE' is selected
if ( addWater == 'TRUE' ) {
  var w = water_image.reproject(projection)
  var waterFill = flood_image.mask().where(w.gt(0), 1)
  flood_image = waterFill.updateMask(waterFill.eq(1)).multiply(0)
}

// Change processing options if 'TRUE' is selected
if ( simpleVis == 'FALSE' ) {
  flood_image = flood_image.reproject(projection)
} else {
  outlierTest = 'FALSE'
  exportLayer = 'FALSE'
}

// Run the outlier filtering process if 'TRUE' is selected
if ( outlierTest == 'TRUE' ) {
  // Outlier detection and filling on complete DEM using the modified z-score and a median filter [5]
  var kernel = ee.Kernel.fixed(3, 3, [[1,1,1],[1,1,1],[1,1,1]])
  var kernel_weighted = ee.Kernel.fixed(3, 3, [[1,1,1],[1,0,1],[1,1,1]])
  var median = dem.focal_median({kernel: kernel}).reproject(projection)
  var median_weighted = dem.focal_median({kernel: kernel_weighted}).reproject(projection)
  var diff = dem.subtract(median)
  var mzscore = diff.multiply(0.6745).divide(diff.abs().focal_median({kernel: kernel}).reproject(projection))
  var fillDEM = dem.where(mzscore.gt(3.5), median_weighted)

  // Outlier detection and filling on the flood extent border pixels
  var expand = flood_image.focal_max({kernel: ee.Kernel.square({radius: projection.nominalScale(), units: 'meters'})}).reproject(projection)
  var demMask = fillDEM.updateMask(flood_image.mask().eq(0))
  var boundary = demMask.add(expand)
  var medianBoundary = boundary.focal_median({kernel: kernel}).reproject(projection)
  var medianWeightedBoundary = boundary.focal_median({kernel: kernel_weighted}).reproject(projection)
  var diffBoundary = boundary.subtract(medianBoundary)
  var mzscoreBoundary = diffBoundary.multiply(0.6745).divide(diffBoundary.abs().focal_median({kernel: kernel}).reproject(projection))
  var fill =
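The essential FwDET calculation can be sketched outside GEE in a few lines of numpy: each flooded cell takes the elevation of its nearest flood-boundary cell, and depth is that boundary elevation minus the cell's own DEM elevation, clipped at zero. This is a brute-force illustrative version under those assumptions, not the published implementation (which adds outlier filtering, water-body handling, and reprojection):

```python
import numpy as np

# Tiny DEM (metres) and a flood extent covering the whole grid.
dem = np.array([[5.0, 4.0, 3.0],
                [5.0, 3.5, 2.0],
                [5.0, 4.0, 3.0]])
flood = np.ones((3, 3), dtype=bool)

# Boundary cells: flooded cells with at least one non-flooded 4-neighbour
# (cells beyond the grid edge count as non-flooded via the padding).
padded = np.pad(flood, 1, constant_values=False)
nbr_dry = ~(padded[:-2, 1:-1] & padded[2:, 1:-1] &
            padded[1:-1, :-2] & padded[1:-1, 2:])
boundary = flood & nbr_dry

# Depth = nearest boundary cell's elevation minus own elevation, clipped at 0.
by, bx = np.nonzero(boundary)
depth = np.zeros_like(dem)
for y, x in zip(*np.nonzero(flood)):
    d2 = (by - y) ** 2 + (bx - x) ** 2      # squared distances to boundary cells
    nearest = d2.argmin()
    depth[y, x] = max(dem[by[nearest], bx[nearest]] - dem[y, x], 0.0)
print(depth)  # only the interior cell (1,1) gets a positive depth here
```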
The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve understanding of large-scale crises, such as natural disasters. The cartographic products produced by CMT include flood inundation maps, maps of damaged or destroyed structures, forest fire maps, population density estimates, etc. CMT is designed to rapidly process large-scale data using Google Earth Engine and other geospatial data systems.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Water-inundated area observed on different dates.
The Digital Bedrock Geologic-GIS Map of Allegheny Portage Railroad National Historic Site, Johnstown Flood National Memorial and Vicinity, Pennsylvania is composed of GIS data layers and GIS tables, and is available in the following GRI-supported GIS data formats: 1.) a 10.1 file geodatabase (alpo_jofl_bedrock_geology.gdb), 2.) an Open Geospatial Consortium (OGC) geopackage, and 3.) a 2.2 KMZ/KML file for use in Google Earth; note, however, that this format version of the map is limited in the data layers presented and in access to GRI ancillary table information. The file geodatabase format is supported with 1.) an ArcGIS Pro map (.mapx) file (alpo_jofl_bedrock_geology.mapx) and individual Pro layer (.lyrx) files (for each GIS data layer), as well as with 2.) a 10.1 ArcMap (.mxd) map document (alpo_jofl_bedrock_geology.mxd) and individual 10.1 layer (.lyr) files (for each GIS data layer). The OGC geopackage is supported with a QGIS project (.qgz) file. Upon request, the GIS data is also available in ESRI 10.1 shapefile format. Contact Stephanie O'Meara (see contact information below) to acquire the GIS data in these formats. In addition to the GIS data and supporting GIS files, three additional files comprise a GRI digital geologic-GIS dataset or map: 1.) this file (alpo_jofl_geology_gis_readme.pdf), 2.) the GRI ancillary map information document (.pdf) file (alpo_jofl_geology.pdf), which contains geologic unit descriptions as well as other ancillary map information and graphics from the source map(s) used by the GRI in the production of the GRI digital geologic-GIS data for the park, and 3.) a user-friendly FAQ PDF version of the metadata (alpo_jofl_bedrock_geology_metadata_faq.pdf). Please read the alpo_jofl_geology_gis_readme.pdf for information pertaining to the proper extraction of the GIS data and other map files. Google Earth software is available for free at: http://www.google.com/earth/index.html. 
QGIS software is available for free at: https://www.qgis.org/en/site/. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). For a complete listing of GRI products visit the GRI publications webpage: https://www.nps.gov/subjects/geology/geologic-resources-inventory-products.htm. For more information about the Geologic Resources Inventory Program visit the GRI webpage: https://www.nps.gov/subjects/geology/gri.htm. At the bottom of that webpage is a "Contact Us" link if you need additional information. You may also directly contact the program coordinator, Jason Kenworthy (jason_kenworthy@nps.gov). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: Pennsylvania Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (alpo_jofl_bedrock_geology_metadata.txt or alpo_jofl_bedrock_geology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of the digital data, 1:62,500, and United States National Map Accuracy Standards, features are within (horizontally) 31.75 meters or 104.17 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS, QGIS or other software used to display this dataset.
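The 31.75-meter / 104.17-foot figures follow from the United States National Map Accuracy Standards, which for map scales smaller than 1:20,000 place well-defined points within 1/50 inch at map scale. A quick check of the arithmetic (plain JavaScript; variable names are illustrative):

```javascript
// NMAS horizontal tolerance: 1/50 inch at map scale for scales < 1:20,000.
const scale = 62500;                    // source map scale denominator (1:62,500)
const toleranceInches = scale / 50;     // ground distance in inches: 1250
const toleranceMeters = toleranceInches * 0.0254; // 1 inch = 0.0254 m
const toleranceFeet = toleranceInches / 12;       // 12 inches per foot
console.log(toleranceMeters.toFixed(2)); // "31.75"
console.log(toleranceFeet.toFixed(2));   // "104.17"
```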
Link to the ScienceBase Item Summary page for the item described by this metadata record. Application Profile: Web Browser. Link Function: information
This resource is a repository of annual subsurface drainage ("tile drainage") maps for the Bois de Sioux Watershed (BdSW), Minnesota, and for the Red River of the North Basin (RRB), separately. The RRB maps cover a 101,500 km2 area in the United States that overlies portions of North Dakota, South Dakota, and Minnesota. Annual subsurface drainage maps are provided for four recent years: 2009, 2011, 2014, and 2017 (for 2017, maps that additionally include Sentinel-1 Synthetic Aperture Radar as an input are also provided). Please see Cho et al. (2019) in Water Resources Research (WRR) for full details.
Map Metadata (Proj=longlat +datum=WGS84)
Raster value key:
0 = NoData; masked for non-agricultural areas (e.g., urban, water, forest, or wetland) and high-gradient cultivated crop areas (slope > 2%), based on the USGS National Land Cover Dataset (NLCD) and the USGS National Elevation Dataset
1 = Undrained (UD)
2 = Subsurface Drained (SD)
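A small helper applying the raster value key above can make the decoding explicit (plain JavaScript sketch; the function and constant names are illustrative, not part of the dataset):

```javascript
// Decode pixel values of the drainage maps using the published value key.
const DRAINAGE_KEY = {
  0: 'NoData',                  // masked: non-agricultural or slope > 2%
  1: 'Undrained (UD)',
  2: 'Subsurface Drained (SD)'
};

function decodeDrainage(value) {
  const label = DRAINAGE_KEY[value];
  if (label === undefined) {
    throw new Error('Unexpected raster value: ' + value);
  }
  return label;
}

console.log(decodeDrainage(2)); // "Subsurface Drained (SD)"
```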
Preferred citation: Cho, E., Jacobs, J. M., Jia, X., & Kraatz, S. (2019). Identifying Subsurface Drainage using Satellite Big Data and Machine Learning via Google Earth Engine. Water Resources Research, 55. https://doi.org/10.1029/2019WR024892
Corresponding author: Eunsang Cho (ec1072@wildcats.unh.edu)
This web map provides an enriched geospatial platform for ascertaining the flood potential of a place of residence and of other land-use activities. Flood risk is classified into five magnitudes (very high, high, moderate, low, and very low). The buildings, roads, and rail tracks that are susceptible to flooding under each magnitude are also included in the web map. In addition, a historical flood inventory layer, containing information on previous flooding disasters that have occurred within the river basin, is included.
This web map is the result of extensive research using available data and both open-source and custom datasets. The collaborative study was conducted by Dr. Felix Ndidi Nkeki (GIS Unit, BEDC Electricity PLC, 5 Akpakpava Road, Benin City, Nigeria, and Department of Geography and Regional Planning, University of Benin, Nigeria), Dr. Ehiaguina Innocent Bello (National Space Research and Development Agency, Obasanjo Space Centre, FCT-Abuja, Nigeria), and Dr. Ishola Ganiy Agbaje (Centre for Space Science Technology Education, Obafemi Awolowo University, Ile-Ife, Nigeria). The study results are published in the International Journal of Disaster Risk Reduction; the methodology, datasets, and full results can be found in the paper.
The major sources of data are: the ALOS PALSAR DEM; soil data from the Harmonised World Soil Database of the Food and Agriculture Organisation of the United Nations (FAO); land-use and surface geologic datasets from CSSTE, OAU Campus, Ile-Ife, Nigeria, and the Ibadan Urban Flood Management Project (IUFMP), Oyo State, Nigeria; transport network data extracted from OpenStreetMap; building footprint data mined from Google Open Buildings; and rainfall grid data downloaded from the Centre for Hydrometeorology and Remote Sensing (CHRS).
https://data.gov.tw/license
The disaster emergency response team of the Water Resources Agency, Ministry of Economic Affairs, combines real-time data such as rainfall, water levels, and reservoir information with long-term disaster response experience and computer technology to provide reservoir alerts for the public and relevant units. This helps the public understand the risk of home flooding, prepare early, and reduce the occurrence of disasters. This dataset is linked to a list of Keyhole Markup Language (KML) files. KML is a markup language based on the XML (eXtensible Markup Language) syntax standard, developed and maintained by Keyhole, a subsidiary of Google, to express geographic annotations. Documents written in KML are KML files; they use the XML file format and are used in Google Earth related software (Google Earth, Google Maps, Google Maps for mobile, and so on) to display geographic data (including points, lines, polygons, polyhedra, and models). Many GIS-related systems now also use this format for the exchange of geographic data, and the fields and codes of this dataset are encoded in UTF-8.
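As a concrete illustration of the KML format described above, a minimal document with a single point placemark can be assembled as follows (plain JavaScript sketch; the placemark name and coordinates are made up for illustration, not taken from the dataset):

```javascript
// Build a minimal KML document containing one point placemark.
// KML is XML; coordinates are "longitude,latitude" in WGS84.
function pointPlacemarkKml(name, lon, lat) {
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<kml xmlns="http://www.opengis.net/kml/2.2">',
    '  <Placemark>',
    '    <name>' + name + '</name>',
    '    <Point><coordinates>' + lon + ',' + lat + '</coordinates></Point>',
    '  </Placemark>',
    '</kml>'
  ].join('\n');
}

// Hypothetical reservoir alert point near Taipei.
console.log(pointPlacemarkKml('Example alert', 121.5, 25.05));
```

Saving this string with a .kml extension (or zipping it as .kmz) yields a file that Google Earth and most GIS software can open directly.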