This bucket contains multiple datasets (as Quilt packages) created by the Center for Geospatial Sciences (CGS) at the University of California-Riverside. The data in this bucket contains the following:
https://www.marketreportanalytics.com/privacy-policy
The global spatiotemporal big data platform market is experiencing robust growth, projected to reach $23.83 billion in 2025 and to maintain a Compound Annual Growth Rate (CAGR) of 9.2% from 2025 to 2033. This expansion is driven by several key factors. The increasing volume and velocity of geospatial data generated from IoT devices, satellite imagery, and sensor networks are creating significant demand for platforms capable of efficiently storing, processing, and analyzing this data. Furthermore, advancements in cloud computing technologies and the development of sophisticated analytical tools are enabling organizations across various sectors (including transportation, urban planning, environmental monitoring, and defense) to leverage spatiotemporal data for improved decision-making and operational efficiency.
The market's competitive landscape is characterized by a mix of established technology giants like Microsoft and AWS, alongside specialized providers like Piesat Information Technology and Geovis Technology. Competition is likely to intensify as smaller companies innovate and seek to establish market share. While data privacy concerns and the complexity of integrating diverse data sources could present challenges, the overall growth trajectory remains positive, fueled by the continued proliferation of data and the rising need for real-time insights.
The market's segmentation, while not explicitly provided, can be reasonably inferred: we can expect segmentation based on deployment model (cloud, on-premises, hybrid), industry vertical (government, energy, transportation, etc.), and platform type (open-source vs. proprietary). The forecast period of 2025-2033 suggests a long-term outlook of sustained growth, with potential for accelerated expansion in regions with rapidly developing digital infrastructure and strong government support for data-driven initiatives.
Regional market share is expected to be influenced by factors such as technological adoption rates, regulatory frameworks, and the concentration of key players within specific geographic areas. The presence of numerous Chinese companies in the provided list suggests a significant footprint in the Asia-Pacific region, which could become a major market driver.
This geospatial dataset delivers high-accuracy GPS event streams from millions of connected devices across Asia, enabling advanced mobility, mapping, and location intelligence applications. Sourced from tier-1 app developers and trusted suppliers, it provides granular insights for commercial, government, and research use.
Each record includes:
- Latitude & longitude coordinates
- Event timestamp (epoch & date)
- Mobile Advertising ID (IDFA/GAID)
- Horizontal accuracy (~85% fill rate)
- Country code (ISO3)
- Optional metadata: IP address, carrier, device model
Access & Delivery:
- API with polygon queries (up to 10,000 tiles)
- Formats: JSON, CSV, Parquet
- Delivery via API, AWS S3, or Google Cloud Storage
- Hourly or daily refresh options
- Historical backfill from September 2024
- Credit-based pricing for scalability
Compliance: Fully compliant with GDPR and CCPA, with clear opt-in/opt-out mechanisms and transparent privacy policies.
Use Cases:
- Advanced mapping and GIS solutions
- Urban mobility and infrastructure planning
- Commercial site selection and market expansion
- Geofencing and targeted advertising
- Disaster response planning and risk assessment
- Transportation and logistics optimization
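As a sketch of what one of these records might look like in practice (the key names and value formats below are assumptions for illustration, not the vendor's actual schema), the following parses the epoch timestamp into a date and sanity-checks the coordinates and ISO3 country code:

```python
from datetime import datetime, timezone

# Hypothetical record mirroring the fields listed above; the exact
# key names and formats are assumptions, not the vendor's schema.
record = {
    "latitude": 13.7563,
    "longitude": 100.5018,
    "timestamp_epoch": 1726000000,  # seconds since the Unix epoch
    "maid": "38400000-8cf0-11bd-b23e-10b96e40000d",  # IDFA/GAID
    "horizontal_accuracy_m": 12.5,  # present in ~85% of records per the spec
    "country_iso3": "THA",
}

# Derive the human-readable date from the epoch timestamp (UTC).
event_time = datetime.fromtimestamp(record["timestamp_epoch"], tz=timezone.utc)
event_date = event_time.strftime("%Y-%m-%d")

# Basic sanity checks on coordinates and country code.
assert -90 <= record["latitude"] <= 90
assert -180 <= record["longitude"] <= 180
assert len(record["country_iso3"]) == 3
```

The same checks extend naturally to batch validation when loading the CSV or Parquet deliveries.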
Rural Alaskan communities in which 55% or less of homes are served by a piped, septic & well, or covered haul system, identified by the Alaska Water and Sewer Challenge project of the Alaska DEC Village Safe Water program.
https://www.datainsightsmarket.com/privacy-policy
The Earth Observation Satellites Ground Stations market is experiencing robust growth, driven by increasing demand for high-resolution satellite imagery and data across various sectors. The market, estimated at $5 billion in 2025, is projected to exhibit a Compound Annual Growth Rate (CAGR) of 15% from 2025 to 2033, reaching approximately $15 billion by 2033. This expansion is fueled by several key factors. Firstly, the proliferation of small satellites and constellations is generating a massive volume of data requiring efficient ground infrastructure for reception and processing. Secondly, advancements in sensor technology are leading to higher-resolution imagery with enhanced analytical capabilities, stimulating demand across diverse applications like precision agriculture, environmental monitoring, urban planning, and disaster management. Finally, the increasing adoption of cloud-based solutions for data storage, processing, and analytics is streamlining workflows and lowering barriers to entry for users. Major players like Amazon Web Services, Microsoft Azure, and specialized providers like K-Sat and Infostellar are actively shaping the market landscape through their innovative offerings and strategic partnerships.
However, several challenges remain. High infrastructure costs associated with setting up and maintaining ground stations, particularly those equipped to handle large volumes of data from advanced sensors, pose a significant hurdle for smaller players. Furthermore, regulatory complexities surrounding data ownership, access, and cross-border transfer can hinder market growth. Competition amongst established players and new entrants is also intensifying, driving the need for continuous innovation and cost optimization.
The market segmentation reveals a strong emphasis on both government and commercial applications, with significant regional variations reflecting the differing levels of technological adoption and investment in space infrastructure across the globe. The forecast period of 2025-2033 promises a dynamic market characterized by ongoing technological advancement, strategic collaborations, and fierce competition, ultimately benefitting end-users with access to increasingly sophisticated earth observation data and services.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
GEOGLOWS is the Group on Earth Observations' Global Water Sustainability Program. It coordinates efforts from public and private entities to make application-ready river data more accessible and sustainably available to underdeveloped regions. The GEOGLOWS Hydrological Model provides a retrospective simulation and a daily forecast of global river discharge at 7 million river sub-basins. The stream network is a hydrologically conditioned subset of the TDX-Hydro streams and basins data produced by the United States' National Geospatial-Intelligence Agency. The daily forecast provides 3-hourly average discharge in a 51-member ensemble with a 15-day lead time, derived from the ECMWF Integrated Forecast System (IFS). The retrospective simulation is derived from ERA5 climate reanalysis data and provides daily average streamflow beginning on 1 January 1940. New forecasts are uploaded daily and the retrospective simulation is updated weekly on Sundays to keep the lag time between 5 and 12 days.
The geoglows-v2 bucket contains: (1) model configuration files used to generate the simulations, (2) the GIS streams
datasets used by the model, (3) the GIS streams datasets optimized for visualizations used by Esri's Living Atlas
layer, (4) several supporting tables of metadata, including country names, river names, and hydrological properties used for modeling.
The geoglows-v2-forecasts bucket contains: (1) daily 15-day forecasts in zarr format optimized for time series queries of
all ensemble members in the prediction, (2) CSV formatted summary files optimized for producing time series animated
web maps for the entire global streams dataset.
The geoglows-v2-retrospective bucket contains: (1) the model retrospective outputs in (1a) zarr format optimized for
time series queries of up to a few hundred rivers on demand as well as (1b) in netCDF format best for bulk downloading
the dataset, (2) estimated return period flows for all 7 million rivers (2a) in zarr format optimized for
reading subsets of the dataset as well as (2b) in netCDF format best for bulk downloading. (3) The initialization files
produced at the end of each incremental simulation useful for restarting the model from a specific date.
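To illustrate the forecast structure described above (51 ensemble members at 3-hourly steps over a 15-day lead time, i.e. 120 steps), the sketch below builds a synthetic stand-in for one river's forecast and computes common ensemble summaries; real data would be read from the zarr store in the geoglows-v2-forecasts bucket rather than generated:

```python
import numpy as np

# Synthetic stand-in for one river's forecast: 51 ensemble members x
# 120 three-hourly steps (15 days). Values are invented; real data
# would come from the geoglows-v2-forecasts zarr store.
rng = np.random.default_rng(42)
members, steps = 51, 15 * 24 // 3
discharge = rng.gamma(shape=2.0, scale=50.0, size=(members, steps))  # m^3/s

# Common ensemble summaries: median and an 80% spread band per step.
median = np.median(discharge, axis=0)
p10, p90 = np.percentile(discharge, [10, 90], axis=0)

# Collapse 3-hourly steps to daily means (8 steps per day, 15 days).
daily_median = median.reshape(15, 8).mean(axis=1)
```

The same shape logic applies when subsetting the real zarr dataset to a single river before summarising.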
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset compares five cities' FIXED-line broadband internet speeds: - Melbourne, AU - Bangkok, TH - Shanghai, CN - Los Angeles, US - Alice Springs, AU
ERRATA: Data is for Q3 2020, but some files were incorrectly labelled as 06-20 or June 20. They should all read Sept 20, or 09-20, since the data is Q3 2020 rather than Q2. Amended in v7.
*Lines of data for each geojson file; a line equates to a 600m x 600m location, including total tests, devices used, and average upload and download speed:
- MEL: 16,181 locations/lines => 0.85M speedtests (16.7 tests per 100 people)
- SHG: 31,745 lines => 0.65M speedtests (2.5/100pp)
- BKK: 29,296 lines => 1.5M speedtests (14.3/100pp)
- LAX: 15,899 lines => 1.3M speedtests (10.4/100pp)
- ALC: 76 lines => 500 speedtests (2/100pp)
Geojsons of these 2° by 2° extracts for MEL, BKK, SHG now added; LAX added v6; Alice Springs added v15.
This dataset unpacks, geospatially, data summaries provided in Speedtest Global Index (linked below). See Jupyter Notebook (*.ipynb) to interrogate geo data. See link to install Jupyter.
** To Do: Will add Google Map versions so everyone can see without installing Jupyter.
- Link to Google Map (BKK) added below. Key: Green > 100Mbps (Superfast); Black > 500Mbps (Ultrafast). CSV provided. Code in Speedtestv1.1.ipynb Jupyter Notebook.
- Community (Whirlpool) surprised [Link: https://whrl.pl/RgAPTl] that Melb has 20% at or above 100Mbps. Suggest plotting Top 20% on map for community. Google Map link now added (and tweet).
** Python (GeoPandas bounding-box extracts; .cx takes [xmin:xmax, ymin:ymax], i.e. longitude range then latitude range)
melb = au_tiles.cx[144:146, -39:-37]   # Lon/Lat extract
shg = tiles.cx[120:122, 30:32]         # Lon/Lat extract
bkk = tiles.cx[100:102, 13:15]         # Lon/Lat extract
lax = tiles.cx[-120:-118, 33:35]       # Lon/Lat extract (bounds reordered to min:max)
alc = tiles.cx[132:134, -24:-22]       # Lon/Lat extract (bounds reordered to min:max)
Histograms (v9) and data visualisations (v3, 5, 9, 11) are provided. Data sourced from: this is an extract of Speedtest Open Data available at Amazon AWS (link below - opendata.aws).
** VERSIONS
v24. Added tweet and Google Map of Top 20% (over 100Mbps locations) in Mel Q3 22. Added v1.5 MEL-Superfast notebook and CSV of results (now on Google Map; link below).
v23. Added graph of 2022 broadband distribution, comparing 2020 to 2022. Updated v1.4 Jupyter notebook.
v22. Added import ipynb; workflow-import-4cities.
v21. Added Q3 2022 data; five cities inc ALC. Geojson files. (2020: 4.3M tests; 2022: 2.9M tests.)
v20. Speedtest - Five Cities inc ALC.
v19. Added ALC2.ipynb.
v18. Added ALC line graph.
v17. Added ipynb for ALC. Added ALC to title.
v16. Loaded Alice Springs data Q2 21 - csv. Added Google Map link of ALC.
v15. Loaded Melb Q1 2021 data - csv.
v14. Added Melb Q1 2021 data - geojson.
v13. Added Twitter link to pics.
v12. Added Line-Compare pic (fastest 1000 locations) inc Jupyter (nbn-intl-v1.2.ipynb).
v11. Added Line-Compare pic, plotting four cities on a graph.
v10. Added four histograms in one pic.
v9. Added histogram for four cities. Added NBN-Intl.v1.1.ipynb (Jupyter Notebook).
v8. Renamed LAX file to Q3, rather than 03.
v7. Amended file names of BKK files to correctly label as Q3, not Q2 or 06.
v6. Added LAX file.
v5. Added screenshot of BKK Google Map.
v4. Added BKK Google Map (link below) and BKK csv mapping files.
v3. Replaced MEL map with big-key version; previous key was very tiny in top right corner.
v2. Uploaded MEL, SHG, BKK data and Jupyter Notebook.
v1. Metadata record.
** LICENCE
The AWS data licence on Speedtest data is "CC BY-NC-SA 4.0", so use of this data must be:
- non-commercial (NC)
- share-alike (SA) (reuse must add the same licence)
This restricts the standard CC-BY Figshare licence.
** Other uses of Speedtest Open Data; - see link at Speedtest below.
The Global Forecast System (GFS) is a weather forecast model produced by the National Centers for Environmental Prediction (NCEP). Dozens of atmospheric and land-soil variables are available through this dataset, from temperatures, winds, and precipitation to soil moisture and atmospheric ozone concentration.
The GFS data files stored here can be immediately used for OAR/ARL's NOAA-EPA Atmosphere-Chemistry Coupler Cloud (NACC-Cloud) tool, and are in Network Common Data Form (netCDF), a very common format used across the scientific community. These particular GFS files contain a comprehensive set of global atmosphere/land variables at relatively high spatiotemporal resolution (approximately 13x13 km horizontal, 127 vertical levels, and hourly). They are not only necessary for the NACC-Cloud tool to adequately drive community air quality applications (e.g., U.S. EPA's Community Multiscale Air Quality model; https://www.epa.gov/cmaq), but can also be very useful for a myriad of other applications in the Earth system modeling communities (e.g., atmosphere, hydrosphere, pedosphere, etc.). While many other data file and record formats are available for Earth system and climate research (e.g., GRIB, HDF, GeoTIFF), the netCDF files here are advantageous to the larger community because of the comprehensive, high-spatiotemporal-resolution information they contain, and because they are more scalable, appendable, shareable, self-describing, and community-friendly (i.e., many tools are available to the community of users).
Out of the four operational GFS forecast cycles per day (at 00Z, 06Z, 12Z and 18Z), this particular netCDF dataset is updated daily (/inputs/yyyymmdd/) for the 12Z cycle and includes 24-hr output for both 2D (gfs.t12z.sfcf$0hh.nc) and 3D variables (gfs.t12z.atmf$0hh.nc).
Also available are netCDF-formatted Global Land Surface Datasets (GLSDs) developed by Hung et al. (2024). The GLSDs are based on numerous satellite products and have been gridded to match the GFS spatial resolution (~13x13 km). These GLSDs contain vegetation canopy data (e.g., land surface type, vegetation clumping index, leaf area index, vegetative canopy height, and green vegetation fraction) that are supplemental to and can be combined with the GFS meteorological netCDF data for various applications, including NOAA-ARL's canopy-app. The canopy data variables are climatological, based on satellite data from the year 2020, combined with GFS meteorology for the year 2022, and are created at a daily temporal resolution (/inputs/geo-files/gfs.canopy.t12z.2022mmdd.sfcf000.global.nc).
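Following the file-naming pattern quoted above, the 12Z cycle's hourly output files can be enumerated programmatically; note the three-digit zero-padding is an assumption inferred from the $0hh placeholder, and the date used is only an example:

```python
# Build the expected 2D (surface) and 3D (atmosphere) file names for one
# day's 12Z cycle, covering its 24-hr hourly output. Three-digit
# zero-padding is assumed from the $0hh placeholder in the pattern above.
date = "20240101"      # example /inputs/yyyymmdd/ prefix
hours = range(0, 25)   # forecast hours f000 through f024

sfc_files = [f"inputs/{date}/gfs.t12z.sfcf{h:03d}.nc" for h in hours]
atm_files = [f"inputs/{date}/gfs.t12z.atmf{h:03d}.nc" for h in hours]
```

Such a list is convenient for scripted bulk downloads or for opening the files as one multi-file dataset.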
https://www.promarketreports.com/privacy-policy
The global 3D mapping and modeling market is expected to grow significantly in the next few years as demand increases for detailed and accurate representations of physical environments in three-dimensional space. Valued at an estimated USD 38.62 billion in 2025, the market is expected to grow at a CAGR of 14.5% from 2025 to 2033, reaching approximately USD 90.26 billion by the end of 2033. The high growth rate reflects advances in technology, including high-resolution sensors and photogrammetry methods that make possible more realistic and immersive 3D models.
Key trends in the market are the adoption of virtual and augmented reality (VR/AR) applications, the integration of 3D mapping with smart city infrastructure, and increased utilization of 3D models in architecture, engineering, and construction. Another driver is the growing adoption of cloud-based 3D mapping and modeling solutions, which promise scalability, cost-effectiveness, and easy access to 3D data, appealing to businesses and organizations of all sizes.
Recent developments include: Jun 2023: Nomoko (Switzerland), a leading provider of real-world 3D data technology, announced that it has joined the Overture Maps Foundation, a non-profit organization committed to fostering collaboration and innovation in the geospatial domain. Nomoko will collaborate with Meta, Amazon Web Services (AWS), TomTom, and Microsoft to create interoperable, accessible 3D datasets, leveraging its real-world 3D modeling capabilities. May 2023: The Sanborn Map Company (Sanborn), an authority in 3D models, announced the development of a powerful new tool, the Digital Twin Base Map. This innovative technology sets a new standard for urban analysis, implementation of Digital Cities, navigation, and planning, with a fundamental transformation from a 2D map to a 3D environment.
The Digital Twin Base Map is a high-resolution 3D map providing unprecedented detail and accuracy. Feb 2023: Bluesky Geospatial launched MetroVista, a 3D aerial mapping program in the USA. The service employs a hybrid imaging-Lidar airborne sensor to capture highly detailed 3D data, including 360-degree views of buildings and street-level features, in urban areas to create digital twins, visualizations, and simulations. Feb 2023: Esri, a leading global provider of geographic information system (GIS), location intelligence, and mapping solutions, released its new ArcGIS Reality software to capture the world in 3D. ArcGIS Reality enables site-, city-, and country-wide 3D mapping for digital twins. These 3D models and high-resolution maps allow organizations to analyze and interact with a digital world, accurately showing their locations and situations. Jan 2023: Strava, a subscription-based fitness platform, announced the acquisition of FATMAP, a 3D mapping platform that applies 3D visualization specifically for people who enjoy mountain sports such as skiing and hiking, to integrate into its app. The acquisition adds FATMAP's mountain-focused maps to Strava's platform, combining with the data already within Strava's products, including city and suburban areas for runners and other fitness enthusiasts. Jan 2022: GeoScience Limited (the UK) announced receiving funding from Deep Digital Cornwall (DDC) to develop a new digital heat flow map. The DDC project has received grant funding from the European Regional Development Fund. This study aims to model the heat flow in the region's shallower geothermal resources to promote their utilization in low-carbon heating.
GeoScience Ltd wants to create a more robust 3D model of the Cornwall subsurface temperature through additional boreholes and more sophisticated modeling techniques. Aug 2022: CGTrader worked with Hello100, an online retailer of dietary supplements, to create and explore the system's possibilities. The system is able to scale up the generation of more models, and it has enhanced and improved Hello100's appearance on Amazon Marketplace. Key drivers for this market are: The demand for 3D maps and models is growing rapidly across various industries, including architecture, engineering, and construction (AEC), manufacturing, transportation, and healthcare. Advances in hardware, software, and data acquisition techniques are making it possible to create more accurate, detailed, and realistic 3D maps and models. Digital twins, which are virtual representations of real-world assets or systems, are driving the demand for 3D mapping and modeling technologies for the creation of accurate and up-to-date digital representations.
Potential restraints include: The acquisition and processing of 3D data can be expensive, especially for large-scale projects. There is a lack of standardization in the 3D mapping and modeling industry, which can make it difficult to share and exchange data between different software and systems. There is also a shortage of skilled professionals able to create and use 3D maps and models effectively. Notable trends are: 3D mapping and modeling technologies are becoming essential for a wide range of applications, including urban planning, architecture, construction, environmental management, and gaming. Advancements in hardware, software, and data acquisition techniques are enabling the creation of more accurate, detailed, and realistic 3D maps and models. Digital twins, which are virtual representations of real-world assets or systems, are driving the demand for 3D mapping and modeling technologies for the creation of accurate and up-to-date digital representations.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
ArcticDEM - 2m GSD Digital Elevation Models (DEMs) and mosaics from 2007 to the present. The ArcticDEM project seeks to fill the need for high-resolution time-series elevation data in the Arctic. The time-dependent nature of the strip DEM files allows users to perform change detection analysis and to compare observations of topography data acquired in different seasons or years. The mosaic DEM tiles are assembled from multiple strip DEMs with the intention of providing a more consistent and comprehensive product over large areas. ArcticDEM data is constructed from in-track and cross-track high-resolution (~0.5 meter) imagery acquired by the Maxar constellation of optical imaging satellites.
Elevation datasets in New Jersey have been collected over several years as several discrete projects. Each project covers a geographic area, which is a subsection of the entire state, and has differing specifications based on the available technology at the time and the project budget. The geographic extent of one project may overlap that of a neighboring project. Each of the 18 projects contains deliverable products such as LAS (lidar point cloud) files, unclassified/classified, tiled to cover the project area; relevant metadata records or documents, most adhering to the Federal Geographic Data Committee's (FGDC) Content Standard for Digital Geospatial Metadata (CSDGM); a tiling index feature class or shapefile; a flight lines feature class or shapefile; a Digital Elevation Model in image format or Esri grid format; and other derivative data products such as a contour lines feature class or shapefile.
https://data.linz.govt.nz/license/attribution-4-0-international/
This dataset provides a seamless cloud-free 10m resolution satellite imagery layer of the New Zealand mainland and offshore islands.
The imagery was captured by the European Space Agency Sentinel-2 satellites between September 2022 and April 2023.
Data comprises:
• 450 ortho-rectified RGB GeoTIFF images in NZTM projection, tiled into the LINZ Standard 1:50000 tile layout
• Satellite sensors: ESA Sentinel-2A and Sentinel-2B
• Acquisition dates: September 2022 - April 2023
• Spectral resolution: R, G, B
• Spatial resolution: 10 meters
• Radiometric resolution: 8-bits (downsampled from 12-bits)
This is a visual product only. The data has been downsampled from 12-bits to 8-bits, and the original values of the images have been modified for visualisation purposes.
Also available on: • Basemaps • NZ Imagery - Registry of Open Data on AWS
NASA's goal in Earth science is to observe, understand, and model the Earth system to discover how it is changing, to better predict change, and to understand the consequences for life on Earth. The Applied Sciences Program, within the Earth Science Division of the NASA Science Mission Directorate, serves individuals and organizations around the globe by expanding and accelerating societal and economic benefits derived from Earth science, information, and technology research and development.
The Prediction Of Worldwide Energy Resources (POWER) Project, funded through the Applied Sciences Program at NASA Langley Research Center, gathers NASA Earth observation data and parameters related to the fields of surface solar irradiance and meteorology to serve the public in several free, easy-to-access and easy-to-use methods. POWER helps communities become resilient amid observed climate variability by improving data accessibility, aiding research in energy development, building energy efficiency, and supporting agriculture projects.
The POWER project provides over 380 satellite-derived meteorology and solar energy Analysis Ready Data (ARD) parameters at four temporal levels: hourly, daily, monthly, and climatology. The POWER data archive provides data at the native resolution of the source products. The data is updated nightly to maintain near-real-time availability (2-3 days for meteorological parameters and 5-7 days for solar). The POWER services catalog consists of a series of RESTful Application Programming Interfaces, geospatially enabled image services, and a web mapping Data Access Viewer. These three service offerings support data discovery, access, and distribution to the project's user base as ARD and as direct application inputs to decision support tools.
The latest data version update includes hourly-based source ARD, in addition to enhanced daily, monthly, annual, and climatology data. The daily time series for meteorology is available from 1981, while solar-based parameters start in 1984. The hourly source data are from the Clouds and the Earth's Radiant Energy System (CERES) and the Global Modeling and Assimilation Office (GMAO), spanning from 1984 for meteorology and from 2001 for solar-based parameters. The hourly data equip users with the ARD needed to model building system energy performance, providing information directly amenable to decision support tools via the industry-standard EnergyPlus Weather file format.
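As a sketch of calling one of the RESTful services mentioned above, the snippet below builds a daily point query URL; the endpoint path and parameter names follow the public POWER API as commonly documented, but treat them as assumptions and verify against the service documentation before use:

```python
from urllib.parse import urlencode

# Hypothetical daily point query: 2-m air temperature (T2M) and all-sky
# surface shortwave irradiance for one location and date range. Parameter
# names are assumptions based on the public POWER API docs.
base = "https://power.larc.nasa.gov/api/temporal/daily/point"
params = {
    "parameters": "T2M,ALLSKY_SFC_SW_DWN",
    "community": "RE",        # renewable-energy community defaults
    "latitude": -33.87,
    "longitude": 151.21,
    "start": "20240101",
    "end": "20240131",
    "format": "JSON",
}
url = f"{base}?{urlencode(params)}"
# The URL can then be fetched with any HTTP client, e.g. urllib.request.
```

Separating URL construction from the actual request keeps the query easy to test and to reuse across parameters and locations.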
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This is an Australian extract of Speedtest Open data available at Amazon WS (link below - opendata.aws).
The AWS data licence is "CC BY-NC-SA 4.0", so use of this data must be:
- non-commercial (NC)
- share-alike (SA) (reuse must add the same licence)
This restricts the standard CC-BY Figshare licence.
A world speedtest open data file was downloaded (>400MB, 7M lines of data). An extract of Australia's locations (lat, long) revealed 88,000 lines of data (attached as csv). A Jupyter notebook of the extract process is attached. See Binder version at Github - https://github.com/areff2000/speedtestAU.
+> Install: 173 packages | Downgrade: 1 package | Total download: 432MB. Build container time: approx - load time 25 secs.
=> Error: Times out - BUT UNABLE TO LOAD GLOBAL DATA FILE (6.6M lines).
=> Error: Overflows the 8GB RAM container provided with the global data file (3GB).
=> On local JupyterLab M2 MBP: loads in 6 mins.
Added Binder from ARDC service: https://binderhub.rc.nectar.org.au
Docs: https://ardc.edu.au/resource/fair-for-jupyter-notebooks-a-practical-guide/
A link to a Twitter thread of outputs is provided. A link to a Data tutorial is provided (GitHub), including a Jupyter Notebook to analyse World Speedtest data, selecting one US State.
Data shows (Q2 20):
- 3.1M speedtests | 762,000 devices
- 88,000 grid locations (600m x 600m), summarised as a point
- average speed 33.7Mbps (down), 12.4Mbps (up) | max speed 724Mbps
- data is for 600m x 600m grids, showing average speed up/down, number of tests, and number of users (IP). Added centroid, and now lat/long.
See tweet of image of centroids, also attached.
NB: Discrepancy Q2 21: Speedtest Global shows the June AU average speedtest at 80Mbps, whereas the Q2 mean here is 52Mbps (v17; Q1 45Mbps; v14). Dec 20 Speedtest Global has AU at 59Mbps. Could be a timing difference, or spatial anonymising masking/shaping the highest speeds; else the data is potentially inconsistent between the national average and the geospatial detail. Check in upcoming quarters.
Next steps: Histogram - compare Q2 20, Q1 21, Q1 22, per v1.4.ipynb.
Versions:
v41: Added AUS Q2 25 (96k lines; avg d/l 130.5 Mbps (median d/l 108.4 Mbps); u/l 22.45 Mbps). Imported using v2 Jupyter notebook (MBP 16GB). Mean tests: 17.2. Mean devices: 5.11. Download, extract and publish: 20 mins. Download avg is double Q4 22.
v40: Added AUS Q1 25 (93k lines; avg d/l 116.6 Mbps; u/l 21.35 Mbps). Imported using v2 Jupyter notebook (MBP 16GB). Mean tests: 16.9. Mean devices: 5.13. Download, extract and publish: 14 mins.
v39: Added AUS Q4 24 (95k lines; avg d/l 110.9 Mbps; u/l 21.02 Mbps). Imported using v2 Jupyter notebook (MBP 16GB). Mean tests: 17.2. Mean devices: 5.24. Download, extract and publish: 14 mins.
v38: Added AUS Q3 24 (92k lines; avg d/l 107.0 Mbps; u/l 20.79 Mbps). Imported using v2 Jupyter notebook (iMac 32GB). Mean tests: 17.7. Mean devices: 5.33. Added github speedtest-workflow-importv2vis.ipynb Jupyter; added datavis code to colour-code the national map (per Binder on Github; link below).
v37: Added AUS Q2 24 (91k lines; avg d/l 97.40 Mbps; u/l 19.88 Mbps). Imported using speedtest-workflow-importv2 Jupyter notebook. Mean tests: 18.1. Mean devices: 5.4.
v36: Loaded UK data, Q1 23, and compared to AUS and NZ Q1 23 data. Added compare image (au-nz-ukQ123.png), calc PlayNZUK.ipynb, data load import-UK.ipynb. UK data is a bit rough and ready as it uses a rectangle to mark out the UK, which includes some EIRE and FR; indicative only, and to be definitive it needs a geo-clean to exclude neighbouring countries.
v35: Loaded Melb geo-maps of speed quartiles (0-25, 25-50, 50-75, 75-100, 100-). Avg in 2020: 41Mbps. Avg in 2023: 86Mbps. MelbQ323.png, MelbQ320.png. Calc with Speedtest-incHist.ipynb code. Needed to install conda mapclassify. ax=melb.plot(column=...dict(bins[25,50,75,100]))
v34: Added AUS Q1 24 (93k lines; avg d/l 87.00 Mbps; u/l 18.86 Mbps). Imported using speedtest-workflow-importv2 Jupyter notebook. Mean tests: 18.3. Mean devices: 5.5.
v33: Added AUS Q4 23 (92k lines; avg d/l 82.62 Mbps). Imported using speedtest-workflow-importv2 Jupyter notebook. Mean tests: 18.0. Mean devices: 5.6. Added link to Github.
v32: Recalculated AU vs NZ upload performance; added image, using PlayNZ Jupyter. NZ has approx 40% of locations at or above 100Mbps. Aus
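The discrepancy noted above between the national average and the geospatial mean can arise purely from aggregation: each tile stores an average speed plus a test count, so a fair national figure weights each tile by its tests rather than averaging tile averages. A small illustration with invented tile values:

```python
# Each tile carries an average download speed (Mbps) and a test count,
# mirroring the 600m x 600m grid fields described above (values invented).
tiles = [
    {"avg_down_mbps": 25.0, "tests": 10},
    {"avg_down_mbps": 90.0, "tests": 200},
    {"avg_down_mbps": 40.0, "tests": 40},
]

# Unweighted mean of tile averages (a naive geospatial mean).
unweighted = sum(t["avg_down_mbps"] for t in tiles) / len(tiles)

# Test-weighted mean (closer to a per-test national average).
total_tests = sum(t["tests"] for t in tiles)
weighted = sum(t["avg_down_mbps"] * t["tests"] for t in tiles) / total_tests
```

When fast tiles attract disproportionately many tests, the weighted figure exceeds the tile mean, which is one plausible (hypothetical) contributor to the Speedtest Global vs tile-mean gap.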
One of the National Geospatial-Intelligence Agency’s (NGA) and the National Oceanic and Atmospheric Administration’s (NOAA) missions is to ensure the safety of navigation on the seas by
maintaining the most current information and the highest quality services for U.S. and global transport networks. To achieve this mission, we need accurate coastal bathymetry over diverse
environmental conditions. The SCuBA program focused on providing critical information to improve existing bathymetry resources and techniques with two specific objectives. The first objective
was to validate National Aeronautics and Space Administration’s (NASA) Ice, Cloud and land Elevation SATellite-2 (ICESat-2), an Earth observing, space-based light detection and ranging (LiDAR)
capability, as a useful tool for nearshore bathymetry in differing environmental conditions. Upon validating the ICESat-2 bathymetry retrievals relative to sea floor type, water clarity, and water surface dynamics, the second objective was to use ICESat-2 as a calibration tool to improve existing Satellite Derived Bathymetry (SDB) coastal bathymetry products with poor coastal depth information but superior spatial coverage. Current resources that monitor coastal bathymetry can have large vertical depth errors (up to 50 percent) in the nearshore region; however, derived results from ICESat-2 show promise for improving the accuracy of bathymetry information in the nearshore region.
Project Overview
One of NGA's and NOAA's primary missions is to provide safety of navigation information. However, coastal depth information is still lacking in some regions, specifically remote regions. In fact, it has been reported that 80 percent of the entire seafloor has not been mapped. Traditionally, airborne LiDARs and survey boats are used to map the seafloor, but in remote areas we have to rely on satellite capabilities, which currently lack the vertical accuracy desired to support safety of navigation in shallow water. In 2018, NASA launched a space-based LiDAR system called ICESat-2 that has global coverage and a polar orbit, originally designed to monitor ice elevation in polar regions. Remarkably, because it has a green laser beam, ICESat-2 also happens to collect bathymetry information. With algorithm development provided by the University of Texas (UT) at Austin, NGA Research and Development (R&D) leveraged the ICESat-2 platform to generate SCuBA, an automated depth retrieval algorithm for accurate, global, refraction-corrected underwater depths from 0 m to 30 m, detailed in Figure 1 of the documentation. The key benefit of this product is the vertical accuracy of the depth retrievals, which is ideal for a calibration tool. NGA and NOAA's National Geodetic Survey (NGS) partnered to make this product available to the public for all US territories.
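To give a first-order sense of the refraction correction mentioned above (this is an illustrative simplification, not the SCuBA algorithm, which treats geometry and sea state rigorously): for near-nadir lidar geometry, a depth computed assuming the in-air speed of light overstates the true depth by roughly the ratio of the refractive indices of water and air, so the correction scales the apparent depth down:

```python
# First-order, near-nadir refraction correction for lidar bathymetry:
# light travels slower in water, so an apparent depth computed with the
# in-air speed of light overstates true depth by about n_water / n_air.
# Illustrative simplification only; not the SCuBA algorithm.
N_AIR = 1.00029
N_WATER = 1.34116  # seawater at the green (532 nm) laser wavelength

def refraction_corrected_depth(apparent_depth_m: float) -> float:
    """Scale an apparent (uncorrected) depth to a corrected depth."""
    return apparent_depth_m * (N_AIR / N_WATER)
```

For example, a 20 m apparent depth corrects to roughly 14.9 m, which is why uncorrected retrievals carry depth-proportional errors.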
View the SCuBA Info Graphic
For all details on how SCuBA was developed, how to access the data, and how to use the data, please visit the DOCUMENTATION page.
The Harmonized Landsat Sentinel-2 (HLS) project provides consistent surface reflectance (SR) and top of atmosphere (TOA) brightness data from a virtual constellation of satellite sensors. The Operational Land Imager (OLI) is housed aboard the joint NASA/USGS Landsat 8 and Landsat 9 satellites, while the Multi-Spectral Instrument (MSI) is mounted aboard Europe's Copernicus Sentinel-2A, Sentinel-2B, and Sentinel-2C satellites. The combined measurement enables global observations of the land every 2–3 days at 30-meter (m) spatial resolution. The HLS project uses a set of algorithms to obtain seamless products from OLI and MSI that include atmospheric correction, cloud and cloud-shadow masking, spatial co-registration and common gridding, illumination and view angle normalization, and spectral bandpass adjustment.
The HLSL30 product provides 30-m Nadir Bidirectional Reflectance Distribution Function (BRDF)-Adjusted Reflectance (NBAR) and is derived from Landsat 8/9 OLI data products. The HLSS30 and HLSL30 products are gridded to the same resolution and Military Grid Reference System (MGRS) tiling system and thus are "stackable" for time series analysis.
The HLSL30 product is provided in Cloud Optimized GeoTIFF (COG) format, and each band is distributed as a separate file. There are 11 bands included in the HLSL30 product along with one quality assessment (QA) band and four angle bands. See the User Guide for a more detailed description of the individual bands provided in the HLSL30 product.
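Since each HLSL30 band ships as a separate COG, a typical post-read step is combining bands, for example NDVI from the red and NIR bands (B04 and B05 in the Landsat/OLI band scheme). The sketch below uses small synthetic reflectance arrays in place of actual COG reads, which would normally be done with a COG-aware reader such as rasterio or rioxarray:

```python
import numpy as np

# Synthetic surface-reflectance arrays standing in for the B04 (red) and
# B05 (NIR) files of one HLSL30 granule; values are invented, and real
# data would be read from the per-band COG files.
red = np.array([[0.08, 0.10], [0.12, 0.09]])
nir = np.array([[0.40, 0.35], [0.30, 0.45]])

# NDVI = (NIR - red) / (NIR + red), guarding against zero denominators.
denom = nir + red
ndvi = np.where(denom != 0, (nir - red) / denom, 0.0)
```

Because HLSS30 and HLSL30 share the MGRS grid, the same per-pixel arithmetic stacks cleanly across both sensors for time series work.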