Are you looking to identify B2B leads to promote your business, product, or service? Outscraper Google Maps Scraper might just be the tool you've been searching for. This powerful software enables you to extract business data directly from Google's extensive database, which spans millions of businesses across countless industries worldwide.
Outscraper Google Maps Scraper is a tool built with advanced technology that lets you scrape a myriad of valuable information about businesses from Google's database. This information includes, but is not limited to, business names, addresses, contact information, website URLs, reviews, ratings, and operational hours.
Whether you are a small business trying to make a mark or a large enterprise exploring new territories, the data obtained from the Outscraper Google Maps Scraper can be a treasure trove. This tool provides a cost-effective, efficient, and accurate method to generate leads and gather market insights.
By using Outscraper, you'll gain a significant competitive edge as it allows you to analyze your market and find potential B2B leads with precision. You can use this data to understand your competitors' landscape, discover new markets, or enhance your customer database. The tool offers the flexibility to extract data based on specific parameters like business category or geographic location, helping you to target the most relevant leads for your business.
In a world that's growing increasingly data-driven, utilizing a tool like Outscraper Google Maps Scraper could be instrumental to your business' success. If you're looking to get ahead in your market and find B2B leads in a more efficient and precise manner, Outscraper is worth considering. It streamlines the data collection process, allowing you to focus on what truly matters – using the data to grow your business.
https://outscraper.com/google-maps-scraper/
As a result of the Google Maps scraping, your data file will contain the following details:
Query, Name, Site, Type, Subtypes, Category, Phone, Full Address, Borough, Street, City, Postal Code, State, US State, Country, Country Code, Latitude, Longitude, Time Zone, Plus Code, Rating, Reviews, Reviews Link, Reviews Per Scores, Photos Count, Photo, Street View, Working Hours, Working Hours Old Format, Popular Times, Business Status, About, Range, Posts, Verified, Owner ID, Owner Title, Owner Link, Reservation Links, Booking Appointment Link, Menu Link, Order Links, Location Link, Place ID, Google ID, Reviews ID
If you want to enrich your datasets with social media accounts and many more details, you can combine Google Maps Scraper with Domain Contact Scraper.
Domain Contact Scraper can scrape these details:
Email, Facebook, GitHub, Instagram, LinkedIn, Phone, Twitter, YouTube
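If you prefer to pull these records programmatically rather than through the web interface, the short sketch below shows one possible approach. It assumes the outscraper Python package and an API key; the query string, result limit, and printed fields are illustrative only, not a prescribed workflow.

import os
from outscraper import ApiClient

# Minimal sketch: pulling Google Maps listings via the Outscraper API.
# Assumes the `outscraper` package is installed and an API key is available.
client = ApiClient(api_key=os.environ['OUTSCRAPER_API_KEY'])

# One search task; Outscraper returns a list of result sets, one per query.
results = client.google_maps_search(['coffee shops, Amsterdam, NL'], limit=20, language='en')

for place in results[0]:
    # Each record is a dict whose keys mirror the exported columns
    # (name, full_address, phone, site, rating, reviews, ...).
    print(place.get('name'), '|', place.get('phone'), '|', place.get('site'))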
Explore APISCRAPY, your AI-powered Google Map Data Scraper. Easily extract Business Location Data from Google Maps and other platforms. Seamlessly access and utilize publicly available map data for your business needs. Scrape All Publicly Available Data From Google Maps & Other Platforms.
Outscraper's Global Location Data service is an advanced solution for harnessing location-based data from Google Maps. Equipped with features such as worldwide coverage, precise filtering, and a plethora of data fields, Outscraper is your reliable source of fresh and accurate data.
Outscraper's Global Location Data Service leverages the extensive data accessible via Google Maps to deliver critical location data on a global scale. This service offers a robust solution for your global intelligence needs, utilizing cutting-edge technology to collect and analyze data from Google Maps and create accurate and relevant location datasets. The service is supported by a constant stream of reliable and current data, powered by Outscraper's advanced web scraping technology, guaranteeing that the data pulled from Google Maps is both fresh and accurate.
One of the key features of Outscraper's Global Location Data Service is its advanced filtering capabilities, allowing you to extract only the location data you need. This means you can specify particular categories, locations, and other criteria to obtain the most pertinent and valuable data for your business requirements, eliminating the need to sort through irrelevant records.
With Outscraper, you gain worldwide coverage for your location data needs. The service's advanced data scraping technology lets you collect data from any country and city without restrictions, making it an indispensable tool for businesses operating on a global scale or those looking to expand internationally. Outscraper provides a wealth of data, offering an unmatched number of fields to compile and enrich your location data. With over 40 data fields, you can generate comprehensive and detailed datasets that offer deep insights into your areas of interest.
The global reach of this service spans across Africa, Asia, and Europe, covering over 150 countries, including but not limited to Zimbabwe in Africa, Yemen in Asia, and Slovenia in Europe. This broad coverage ensures that no matter where your business operations or interests lie, you will have access to the location data you need.
Experience the Outscraper difference today and elevate your location data analysis to the next level.
https://brightdata.com/license
The Google Reviews dataset is perfect for obtaining comprehensive insights into businesses and their customer feedback globally. Easily filter by location, business type, or reviewer details to extract the precise data you need. The Google Reviews dataset includes key data points such as URL, place ID, place name, country, address, review ID, reviewer name, total reviews and photos by the reviewer, reviewer profile URL, and more. This dataset provides valuable information for sentiment analysis, business comparisons, and customer behavior studies.
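As a rough illustration of how such a reviews export can be sliced once downloaded, the sketch below filters a CSV with pandas. The file name and column names (country, place_name, reviewer_name, review_id) are assumptions modelled on the fields listed above, not the dataset's exact schema.

import pandas as pd

# Hypothetical file and column names based on the data points described above.
reviews = pd.read_csv('google_reviews_sample.csv')

# Keep reviews for one country and one business type.
subset = reviews[
    (reviews['country'] == 'Germany')
    & (reviews['place_name'].str.contains('Hotel', case=False, na=False))
]

# Most active reviewers in that slice.
top_reviewers = subset.groupby('reviewer_name')['review_id'].count().sort_values(ascending=False).head(10)
print(top_reviewers)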
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Building footprint and height data were obtained from the latest global 3D building database. The building footprint data originated from Microsoft and Google datasets. Building height information was estimated using an XGBoost machine learning regression model that integrates multi-source remote sensing features. The height estimation model was trained using datasets from ONEGEO Map, Microsoft, Baidu, and EMU Analytics, utilizing 2020 data for the final estimations. Validation of this database demonstrates that the height estimation models perform exceptionally well at a global scale across both the Northern and Southern Hemispheres. The estimated heights closely match reference height data, especially for landmark buildings, showcasing superior accuracy compared to other global height products. The 3D building data that support this dataset are available in Zenodo, DOI:10.5194/essd-16-5357-2024 (Che, Y., Li, X., Liu, X., Wang, Y., Liao, W., Zheng, X., Zhang, X., Xu, X., Shi, Q., Zhu, J., Yuan, H., and Dai, Y. 3D-GloBFP: the first global three-dimensional building footprint dataset. Earth System Science Data).

Based on the 3D building database, we verify the locations and boundaries of individual cultural heritage sites and their buffer zones using UNESCO's heritage map platform (https://whc.unesco.org/), and categorize heritage into three groups for data extraction:

Broad Scale Sites: For sites encompassing continuous building clusters or portions of cities (e.g., City of Bath), we extract buildings within the designated buffer zones provided by the UNESCO platform.

Single Building Sites: For individual monuments or structures (e.g., Tower of London), we precisely extract the building footprints based on their exact coordinates.

Multiple Dispersed Buildings: For sites consisting of multiple, non-contiguous structures (e.g., Wooden Churches of Southern Małopolska, Poland), we identify each location using the platform's data and verify them through Google Maps before extracting the relevant buildings.

A few linear heritage sites, such as extensive archaeological routes spanning over a thousand kilometers, are excluded due to the complexities associated with their vast spatial extent and the variability of climate conditions across different segments.

The effective data coverage varies across continents: Europe and North America have an effective rate of 82.5%, Asia and the Pacific 68.3%, Latin America and the Caribbean 75.7%, Arab States 76.5%, and Africa 49.2%. This variability reflects differences in data availability. In less developed regions, remote sensing data tends to overlook non-urban heritage sites, and soil and rock structures common in Africa and Southeast Asia are more difficult to detect using satellite remote sensing techniques, leading to lower effective data coverage in these regions.

This dataset accompanies the following published article: Chen, Zihua, Gao, Qian, Wu, Yajing, Li, Jiaxin, Li, Xiaowei, Li, Xiao, Wang, Zhenbo, & Cui, Haiyang (2025). World Cultural Heritage sites are under climate stress and no emissions mitigation pathways can uniformly protect them. Communications Earth & Environment, 6:628. https://doi.org/10.1038/s43247-025-02603-8
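The height-estimation step described above is an XGBoost regression over multi-source remote sensing features. Purely as a sketch of the general shape of such a model, the example below trains and evaluates an XGBoost regressor; the training file and feature names are placeholders, not the study's actual inputs.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error
from xgboost import XGBRegressor

# Placeholder training table: one row per building footprint with
# remote-sensing predictors and a reference height in metres (assumed columns).
df = pd.read_csv('building_height_training.csv')
features = ['footprint_area', 'sar_backscatter', 'ndbi', 'nighttime_lights', 'dsm_minus_dtm']

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df['reference_height_m'], test_size=0.2, random_state=42)

# Gradient-boosted regression trees, roughly in the spirit of the method described above.
model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=8, subsample=0.8)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print('MAE (m):', mean_absolute_error(y_test, pred))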
The Unpublished Digital Geologic-GIS Map of Moores Creek National Battlefield, North Carolina is composed of GIS data layers and GIS tables in a 10.1 file geodatabase (mocr_geology.gdb), a 10.1 ArcMap (.mxd) map document (mocr_geology.mxd), individual 10.1 layer (.lyr) files for each GIS data layer, an ancillary map information document (mocr_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.txt) and FAQ (.pdf) formats, and a GIS readme file (mocr_geology_gis_readme.pdf). Please read the mocr_geology_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.1 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (mocr_geology_metadata.txt or mocr_geology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:250,000 and United States National Map Accuracy Standards, features are within (horizontally) 127 meters or 416.7 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3 (available at: https://www.nps.gov/articles/gri-geodatabase-model.htm). The GIS data projection is NAD83, UTM Zone 17N; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Moores Creek National Battlefield.
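If you want to work with the geodatabase outside ArcGIS, a library such as geopandas (backed by GDAL's OpenFileGDB driver) can usually read its layers directly. The sketch below is a generic example, not part of the GRI workflow; the layer name is a placeholder, so consult the readme for the actual layer names, and the same pattern applies to the other GRI geodatabases listed on this page.

import fiona
import geopandas as gpd

# List the layers shipped in the file geodatabase (layer names vary per park unit).
layers = fiona.listlayers('mocr_geology.gdb')
print(layers)

# Read one layer (placeholder choice) and reproject from NAD83 UTM to WGS84,
# the coordinate system Google Earth expects.
geology = gpd.read_file('mocr_geology.gdb', layer=layers[0])
geology_wgs84 = geology.to_crs(epsg=4326)
geology_wgs84.to_file('mocr_geology_wgs84.geojson', driver='GeoJSON')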
The Unpublished Digital Geologic Map of Chickasaw National Recreation Area and Vicinity, Oklahoma is composed of GIS data layers and GIS tables in a 10.0 file geodatabase (chic_geology.gdb), a 10.0 ArcMap (.MXD) map document (chic_geology.mxd), and individual 10.0 layer (.LYR) files for each GIS data layer, an ancillary map information (.PDF) document (chic_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.TXT) and FAQ (.HTML) formats, and a GIS readme file (chic_gis_readme.pdf). Please read the chic_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.0 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (chic_metadata_faq.html; available at http://nrdata.nps.gov/geology/gri_data/gis/chic/chic_metadata_faq.html). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:24,000 and United States National Map Accuracy Standards, features are within (horizontally) 12.2 meters or 40 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.1 (available at: http://science.nature.nps.gov/im/inventory/geology/GeologyGISDataModel.cfm). The GIS data projection is NAD83, UTM Zone 14N; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Chickasaw National Recreation Area.
The Easiest Way to Collect Data from the Internet. Download anything you see on the internet into spreadsheets within a few clicks using our ready-made web crawlers, or with a few lines of code using our APIs.
We have made it as simple as possible to collect data from websites
Easy to Use Crawlers:
Amazon Product Details and Pricing Scraper: Get product information, pricing, FBA, best seller rank, and much more from Amazon.
Google Maps Search Results: Get details like place name, phone number, address, website, ratings, and open hours from Google Maps or Google Places search results.
Twitter Scraper: Get tweets, Twitter handle, content, number of replies, number of retweets, and more. All you need to provide is a URL to a profile, hashtag, or an advanced search URL from Twitter.
Amazon Product Reviews and Ratings: Get customer reviews for any product on Amazon and get details like product name, brand, reviews and ratings, and more from Amazon.
Google Reviews Scraper: Scrape Google reviews and get details like business or location name, address, review, ratings, and more for businesses and places.
Walmart Product Details & Pricing: Get the product name, pricing, number of ratings, reviews, product images, URL, and other product-related data from Walmart.
Amazon Search Results Scraper: Get product search rank, pricing, availability, best seller rank, and much more from Amazon.
Amazon Best Sellers: Get the bestseller rank, product name, pricing, number of ratings, rating, product images, and more from any Amazon Bestseller List.
Google Search Scraper: Scrape Google search results and get details like search rank, paid and organic results, knowledge graph, related search results, and more.
Walmart Product Reviews & Ratings: Get customer reviews for any product on Walmart.com and get details like product name, brand, reviews, and ratings.
Scrape Emails and Contact Details: Get emails, addresses, contact numbers, and social media links from any website.
Walmart Search Results Scraper: Get product details such as pricing, availability, reviews, ratings, and more from Walmart search results and categories.
Glassdoor Job Listings: Scrape job details such as job title, salary, job description, location, company name, number of reviews, and ratings from Glassdoor.
Indeed Job Listings: Scrape job details such as job title, salary, job description, location, company name, number of reviews, and ratings from Indeed.
LinkedIn Jobs Scraper (Premium): Scrape job listings on LinkedIn and extract job details such as job title, job description, location, company name, number of reviews, and more.
Redfin Scraper (Premium): Scrape real estate listings from Redfin. Extract property details such as address, price, mortgage, Redfin estimate, broker name, and more.
Yelp Business Details Scraper: Scrape business details from Yelp such as phone number, address, website, and more from Yelp search and business details pages.
Zillow Scraper (Premium): Scrape real estate listings from Zillow. Extract property details such as address, price, broker name, and more.
Amazon Product Offers and Third Party Sellers: Get product pricing, delivery details, FBA, seller details, and much more from the Amazon offer listing page.
Realtor Scraper (Premium): Scrape real estate listings from Realtor.com. Extract property details such as address, price, area, broker, and more.
Target Product Details & Pricing: Get product details from search results and category pages such as pricing, availability, rating, reviews, and 20+ data points from Target.
Trulia Scraper (Premium): Scrape real estate listings from Trulia. Extract property details such as address, price, area, mortgage, and more.
Amazon Customer FAQs: Get FAQs for any product on Amazon and get details like the question, answer, answering user name, and more.
Yellow Pages Scraper: Get details like business name, phone number, address, website, ratings, and more from Yellow Pages search results.
OpenStreetMap (openstreetmap.org) is a global collaborative mapping project, which offers maps and map data released under an open license, encouraging free re-use and re-distribution. The data is created by a large community of volunteers who use a variety of simple on-the-ground surveying techniques and wiki-style editing tools to collaborate as they create the maps, in a process which is open to everyone. The project originated in London, and an active community of mappers and developers is based here. Mapping work in London is ongoing (and you can help!) but the coverage is already good enough for many uses.
Browse the map of London on OpenStreetMap.org
The whole of England updated daily:
For more details of downloads available from OpenStreetMap, including downloading the whole planet, see 'planet.osm' on the wiki.
Download small areas of the map by bounding-box. For example this URL requests the data around Trafalgar Square:
http://api.openstreetmap.org/api/0.6/map?bbox=-0.13062,51.5065,-0.12557,51.50969
Data filtered by "tag". For example this URL returns all elements in London tagged shop=supermarket:
http://www.informationfreeway.org/api/0.6/*[shop=supermarket][bbox=-0.48,51.30,0.21,51.70]
The format of the data is a raw XML representation of all the elements making up the map. OpenStreetMap is composed of interconnected "nodes" and "ways" (and sometimes "relations"), each with a set of name=value pairs called "tags". These classify and describe properties of the elements, and ultimately influence how they get drawn on the map. To understand more about tags, and about different ways of working with this data format, refer to the following pages on the OpenStreetMap wiki.
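As a concrete illustration of consuming that XML, the short sketch below requests the Trafalgar Square bounding box shown above and lists the tagged nodes. It is a generic example only; for anything beyond small experiments, use planet extracts rather than repeated calls to the live API.

import requests
import xml.etree.ElementTree as ET

# Request raw map data for the Trafalgar Square bounding box (API 0.6).
url = 'http://api.openstreetmap.org/api/0.6/map'
resp = requests.get(url, params={'bbox': '-0.13062,51.5065,-0.12557,51.50969'})
resp.raise_for_status()

root = ET.fromstring(resp.content)

# Nodes carry their tags as child <tag k="..." v="..."/> elements.
for node in root.findall('node'):
    tags = {t.get('k'): t.get('v') for t in node.findall('tag')}
    if 'amenity' in tags:
        print(node.get('id'), tags.get('name', ''), tags['amenity'])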
Rather than working with raw map data, you may prefer to embed maps from OpenStreetMap on your website with a simple bit of JavaScript. You can also present overlays of other data, in a manner very similar to working with Google Maps. In fact, you can even use the Google Maps API to do this. See OSM on your own website for details and links to various JavaScript map libraries.
The OpenStreetMap project aims to attract large numbers of contributors who all chip in a little bit to help build the map. Although the map editing tools take a little while to learn, they are designed to be as simple as possible, so that everyone can get involved. This project offers an exciting means of allowing local London communities to take ownership of their part of the map.
Read about how to Get Involved and see the London page for details of OpenStreetMap community events.
https://dataverse.harvard.edu/api/datasets/:persistentId/versions/3.1/customlicense?persistentId=doi:10.7910/DVN/UFC6B5
Web-based GIS for spatiotemporal crop climate niche mapping
Interactive Google Earth Engine Application, Version 2, July 2020
https://cropniche.cartoscience.com
https://cartoscience.users.earthengine.app/view/crop-niche

Google Earth Engine Code:

/* ----------------------------------------------------------------------------
# CropSuit-GEE
Authors: Brad G. Peter (bpeter@ua.edu), Joseph P. Messina, and Zihan Lin
Organizations: BGP, JPM - University of Alabama; ZL - Michigan State University
Last Modified: 06/28/2020
To cite this code use: Peter, B. G.; Messina, J. P.; Lin, Z., 2019, "Web-based GIS for spatiotemporal crop climate niche mapping", https://doi.org/10.7910/DVN/UFC6B5, Harvard Dataverse, V1
-------------------------------------------------------------------------------
This is a Google Earth Engine crop climate suitability geocommunication and map export tool designed to support agronomic development and deployment of improved crop system technologies. This content is made possible by the support of the American People provided to the Feed the Future Innovation Lab for Sustainable Intensification through the United States Agency for International Development (USAID). The contents are the sole responsibility of the authors and do not necessarily reflect the views of USAID or the United States Government. Program activities are funded by USAID under Cooperative Agreement No. AID-OAA-L-14-00006.
-------------------------------------------------------------------------------
Summarization of input options: There are 14 user options available. The first is a country of interest selection using a 2-digit FIPS code (link available below). This selection is used to produce a rectangular bounding box for export; however, other geometries can be selected with minimal modification to the code. Options 2 and 3 specify the complete temporal range for aggregation (averaged across seasons; single seasons may also be selected). Options 4–7 specify the growing season for calculating total seasonal rainfall and average season temperatures and NDVI (NDVI is for export only and is not used in suitability determination). Options 8–11 specify the climate parameters for the crop of interest (rainfall and temperature max/min). Option 12 enables masking to agriculture, 13 enables exporting of all data layers, and 14 is a text string for naming export files.
-------------------------------------------------------------------------------
USER OPTIONS */

// CHIRPS data availability: https://developers.google.com/earth-engine/datasets/catalog/UCSB-CHG_CHIRPS_PENTAD
// MOD11A2 data availability: https://developers.google.com/earth-engine/datasets/catalog/MODIS_006_MOD11A2

var country = 'MI' // [1] https://en.wikipedia.org/wiki/List_of_FIPS_country_codes
var startRange = 2001 // [2]
var endRange = 2017 // [3]
var startSeasonMonth = 11 // [4]
var startSeasonDay = 1 // [5]
var endSeasonMonth = 4 // [6]
var endSeasonDay = 30 // [7]
var precipMin = 750 // [8]
var precipMax = 1200 // [9]
var tempMin = 22 // [10]
var tempMax = 32 // [11]
var maskToAg = 'TRUE' // [12] 'TRUE' (default) or 'FALSE'
var exportLayers = 'TRUE' // [13] 'TRUE' (default) or 'FALSE'
var exportNameHeader = 'crop_suit_maize' // [14] text string for naming export file

// NO USER INPUT BEYOND THIS POINT

// Access precipitation and temperature ImageCollections and a global countries FeatureCollection
var region = ee.FeatureCollection('USDOS/LSIB_SIMPLE/2017')
  .filterMetadata('country_co','equals',country)
var precip = ee.ImageCollection('UCSB-CHG/CHIRPS/PENTAD').select('precipitation')
var temp = ee.ImageCollection('MODIS/006/MOD11A2').select(['LST_Day_1km','LST_Night_1km'])
var ndvi = ee.ImageCollection('MODIS/006/MOD13Q1').select(['NDVI'])

// Create layers for masking to agriculture and masking out water bodies
var waterMask = ee.Image('UMD/hansen/global_forest_change_2015').select('datamask').eq(1)
var agModis = ee.ImageCollection('MODIS/006/MCD12Q1').select('LC_Type1').mode()
  .remap([1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17],
         [0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0])
var agGC = ee.Image('ESA/GLOBCOVER_L4_200901_200912_V2_3').select('landcover')
  .remap([11,14,20,30,40,50,60,70,90,100,110,120,130,140,150,160,170,180,190,200,210,220,230],
         [1,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0])
var cropland = ee.Image('USGS/GFSAD1000_V1').neq(0)
var agMask = agModis.add(agGC).add(cropland).gt(0).eq(1)

// Modify user input options for processing with raw data
var years = ee.List.sequence(startRange,endRange)
var bounds = region.geometry().bounds()
var tMinMod = (tempMin+273.15)/0.02
var tMaxMod = (tempMax+273.15)/0.02
//...
The Digital Geomorphic-GIS Map of Gulf Islands National Seashore (5-meter accuracy and 1-foot resolution 2006-2007 mapping), Mississippi and Florida is composed of GIS data layers and GIS tables, and is available in the following GRI-supported GIS data formats: 1.) a 10.1 file geodatabase (guis_geomorphology.gdb), 2.) an Open Geospatial Consortium (OGC) geopackage, and 3.) a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. The file geodatabase format is supported with a 1.) ArcGIS Pro map file (.mapx) file (guis_geomorphology.mapx) and individual Pro layer (.lyrx) files (for each GIS data layer), as well as with a 2.) 10.1 ArcMap (.mxd) map document (guis_geomorphology.mxd) and individual 10.1 layer (.lyr) files (for each GIS data layer). The OGC geopackage is supported with a QGIS project (.qgz) file. Upon request, the GIS data is also available in ESRI 10.1 shapefile format. Contact Stephanie O'Meara (see contact information below) to acquire the GIS data in these GIS data formats. In addition to the GIS data and supporting GIS files, three additional files comprise a GRI digital geologic-GIS dataset or map: 1.) a GIS readme file (guis_geology_gis_readme.pdf), 2.) the GRI ancillary map information document (.pdf) file (guis_geomorphology.pdf) which contains geologic unit descriptions, as well as other ancillary map information and graphics from the source map(s) used by the GRI in the production of the GRI digital geologic-GIS data for the park, and 3.) a user-friendly FAQ PDF version of the metadata (guis_geomorphology_metadata_faq.pdf). Please read the guis_geology_gis_readme.pdf for information pertaining to the proper extraction of the GIS data and other map files. Google Earth software is available for free at: https://www.google.com/earth/versions/. QGIS software is available for free at: https://www.qgis.org/en/site/. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). For a complete listing of GRI products visit the GRI publications webpage: https://www.nps.gov/subjects/geology/geologic-resources-inventory-products.htm. For more information about the Geologic Resources Inventory Program visit the GRI webpage: https://www.nps.gov/subjects/geology/gri.htm. At the bottom of that webpage is a "Contact Us" link if you need additional information. You may also directly contact the program coordinator, Jason Kenworthy (jason_kenworthy@nps.gov). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (guis_geomorphology_metadata.txt or guis_geomorphology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:26,000 and United States National Map Accuracy Standards, features are within (horizontally) 13.2 meters or 43.3 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS, QGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3 (available at: https://www.nps.gov/articles/gri-geodatabase-model.htm).
The Unpublished Digital Bedrock Geologic Map of Saint-Gaudens National Historic Site and Vicinity, New Hampshire and Vermont is composed of GIS data layers and GIS tables in a 10.1 file geodatabase (saga_geology.gdb), a 10.1 ArcMap (.MXD) map document (saga_geology.mxd), individual 10.1 layer (.LYR) files for each GIS data layer, an ancillary map information (.PDF) document (saga_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.TXT) and FAQ (.HTML) formats, and a GIS readme file (saga_gis_readme.pdf). Please read the saga_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.1 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (saga_metadata_faq.html; available at http://nrdata.nps.gov/geology/gri_data/gis/saga/saga_metadata_faq.html). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:24,000 and United States National Map Accuracy Standards, features are within (horizontally) 12.2 meters or 40 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.2 (available at: http://science.nature.nps.gov/im/inventory/geology/GeologyGISDataModel.cfm). The GIS data projection is NAD83, UTM Zone 18N; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Saint-Gaudens National Historic Site.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This data set covers global extraction and production of coal and metal ores at the individual mine level. It covers 1,171 individual mines, reporting mine-level production for 80 different materials over the period 2000-2021. In addition, it includes data on mining coordinates, ownership, mineral reserves, mining waste, transportation of mining products, as well as mineral processing capacities (smelters and mineral refineries) and processing production. The data was gathered manually from more than 1,900 openly available sources, such as annual or sustainability reports of mining companies. All data points are linked to their respective sources. After manual screening and entry of the data, automatic cleaning, harmonization and data checking were conducted. Geoinformation was obtained either from coordinates available in company reports, or by retrieving the coordinates via the Google Maps API with subsequent manual checking. For mines where no coordinates could be found, other geospatial attributes such as province, region, district or municipality were recorded and linked to the GADM data set, available at www.gadm.org.
The data set consists of 12 tables. The table "facilities" contains descriptive and spatial information on mines and processing facilities, and is available as a GeoPackage (GPKG) file. All other tables are available in comma-separated values (CSV) format. A schematic depiction of the database is provided in PNG format in the file database_model.png.
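To give a feel for how the tables fit together, the sketch below loads the facilities layer with geopandas and joins one of the CSV tables onto it. The file names, the production table, and the key and value columns are placeholders; the actual names and schema are documented in the data set itself.

import geopandas as gpd
import pandas as pd

# Placeholder file names; the real names ship with the data set.
facilities = gpd.read_file('facilities.gpkg')   # spatial table of mines and processing facilities
production = pd.read_csv('production.csv')       # assumed layout: one row per facility, material and year

# Join yearly production onto the spatial layer via an assumed shared facility identifier.
joined = facilities.merge(production, on='facility_id', how='left')

# Example: total reported coal production per country in 2020 (column names assumed).
coal_2020 = joined[(joined['material'] == 'Coal') & (joined['year'] == 2020)]
print(coal_2020.groupby('country')['production_value'].sum().sort_values(ascending=False).head())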
LinkedIn Company Data for Company Analysis, Valuation & Portfolio Strategy
LinkedIn company data is one of the most powerful forms of alternative data for understanding company behavior, firmographics, business dynamics, and real-time hiring signals. Canaria's enriched LinkedIn company data provides detailed company profiles, including hiring activity, job postings, employee trends, headquarters and branch locations, and verified metadata from Google Maps. This LinkedIn corporate data is updated weekly and optimized for use in company analysis, startup scouting, private company valuation, and investment monitoring. It supports BI dashboards, risk models, CRM enrichment, and portfolio strategy.
Use Cases: What Problems This LinkedIn Data Solves
Our LinkedIn company insights transform opaque business landscapes into structured, analyzable data. Whether you're conducting M&A due diligence, tracking high-growth companies, or benchmarking performance, this dataset empowers fast, confident decisions.

Company Analysis
• Identify a company's size, industry classification, and headcount signals using LinkedIn firmographic data
• Analyze social presence through LinkedIn follower metrics and employee engagement
• Understand geographic expansion through branch locations and hiring distribution
• Benchmark companies using LinkedIn profile activity and job posting history
• Monitor business changes with real-time LinkedIn updates

Company Valuation & Financial Benchmarking
• Feed LinkedIn-based firmographics into comps and financial models
• Use hiring velocity from LinkedIn job data as a proxy for business growth
• Strengthen private market intelligence with verified non-financial signals
• Validate scale, structure, and presence via LinkedIn and Google Maps footprint

Company Risk Analysis
• Detect red flags using hiring freezes or drops in profile activity
• Spot market shifts through location downsizing or organizational changes
• Identify distressed companies with decreased LinkedIn job posting frequency
• Compare stated presence vs. active behavior to identify risk anomalies

Business Intelligence (BI) & Strategic Planning
• Segment companies by industry, headcount, growth behavior, and hiring activity
• Build BI dashboards integrating LinkedIn job trends and firmographic segmentation
• Identify geographic hiring hotspots using Maps and LinkedIn signal overlays
• Track job creation, title distribution, and skill demand in near real-time
• Export filtered LinkedIn corporate data into CRMs, analytics tools, and lead scoring systems

Portfolio Management & Investment Monitoring
• Enhance portfolio tracking with LinkedIn hiring data and firmographic enrichment
• Spot hiring surges, geographic expansions, or restructuring in real-time
• Correlate LinkedIn growth indicators with strategic outcomes
• Analyze competitors and targets using historical and real-time LinkedIn data
• Generate alerts for high-impact company changes in your portfolio universe
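One concrete way to turn the job-posting feed into a growth signal is a simple hiring-velocity calculation. The pandas sketch below assumes a postings export with job_id, company_name, and posted_date columns; these names are illustrative, not the dataset's exact fields.

import pandas as pd

# Assumed export: one row per job posting, with the posting date and the company.
postings = pd.read_csv('linkedin_job_postings.csv', parse_dates=['posted_date'])

# Weekly posting counts per company.
weekly = (postings
          .set_index('posted_date')
          .groupby('company_name')
          .resample('W')['job_id']
          .count()
          .rename('postings'))

# Hiring velocity: week-over-week change in posting volume. Sustained drops can
# flag hiring freezes; sustained rises can flag expansion.
velocity = weekly.groupby(level='company_name').pct_change()
print(velocity.groupby(level='company_name').tail(1).sort_values(ascending=False).head(10))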
What Makes This LinkedIn Company Data Unique
Includes Real-Time Hiring Signals • Gain visibility into which companies are hiring, at what scale, and for which roles using enriched LinkedIn job data
Verified Location Intelligence • Confirm branch and HQ locations with Google Maps coordinates and public company metadata
Weekly Updates • Stay ahead of the market with fresh, continuously updated LinkedIn company insights
Clean & Analysis-Ready Format • Structured, deduplicated, and taxonomy-mapped data that integrates with CRMs, BI platforms, and investment models
Who Benefits from LinkedIn Company Data
• Hedge funds, VCs, and PE firms analyzing startup and private company activity
• Portfolio managers and financial analysts tracking operational shifts
• Market research firms modeling sector momentum and firmographics
• Strategy teams calculating market size using LinkedIn company footprints
• BI and analytics teams building company-level dashboards
• Compliance and KYC teams enriching company identity records
• Corp dev teams scouting LinkedIn acquisition targets and expansion signals
Summary Canaria’s LinkedIn company data delivers high-frequency, high-quality insights into U.S. companies, combining job posting trends, location data, and firmographic intelligence. With real-time updates and structured delivery formats, this alternative dataset enables powerful workflows across company analysis, financial modeling, investment research, market segmentation, and business strategy.
About Canaria Inc. Canaria Inc. is a leader in alternative data, specializing in job market intelligence, LinkedIn company data, and Glassdoor salary analytics. We deliver clean, structured, and enriched datasets at scale using proprietary data scraping pipelines and advanced AI/LLM-based modeling, all backed by human validation. Our AI-powered pipeline is developed by a seasoned team of machine learning experts from Google, Meta, and Amazon, and by alumni of Stanford, Caltech, and Columbia ...
Open Government Licence - Canada 2.0 https://open.canada.ca/en/open-government-licence-canada
License information was derived automatically
“Automatically Extracted Buildings” is a raw digital product in vector format created by NRCan. It consists of a single topographical feature class that delineates polygonal building footprints automatically extracted from airborne Lidar data, high-resolution optical imagery or other sources.
The Unpublished Digital Geologic-GIS Map of Fort Laramie National Historic Site and Vicinity, Wyoming is composed of GIS data layers and GIS tables in a 10.1 file geodatabase (fola_geology.gdb), a 10.1 ArcMap (.mxd) map document (fola_geology.mxd), individual 10.1 layer (.lyr) files for each GIS data layer, an ancillary map information document (fola_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.txt) and FAQ (.pdf) formats, and a GIS readme file (fola_geology_gis_readme.pdf). Please read the fola_geology_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.1 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (fola_geology_metadata.txt or fola_geology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:31,680 and United States National Map Accuracy Standards, features are within (horizontally) 16.1 meters or 52.8 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3 (available at: https://www.nps.gov/articles/gri-geodatabase-model.htm). The GIS data projection is NAD83, UTM Zone 13N; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Fort Laramie National Historic Site.
The Unpublished Digital Geologic-GIS Map of Theodore Roosevelt National Park and Vicinity, North Dakota is composed of GIS data layers and GIS tables in a 10.1 file geodatabase (thro_geology.gdb), a 10.1 ArcMap (.mxd) map document (thro_geology.mxd), individual 10.1 layer (.lyr) files for each GIS data layer, an ancillary map information document (thro_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.txt) and FAQ (.pdf) formats, and a GIS readme file (thro_geology_gis_readme.pdf). Please read the thro_geology_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.1 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: North Dakota Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (thro_geology_metadata.txt or thro_geology_metadata_faq.pdf). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:24,000 and United States National Map Accuracy Standards, features are within (horizontally) 12.2 meters or 40 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.3 (available at: http://science.nature.nps.gov/im/inventory/geology/GeologyGISDataModel.cfm). The GIS data projection is NAD83, UTM Zone 13N; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Theodore Roosevelt National Park.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Sentinel2GlobalLULC is a deep learning-ready dataset of RGB images from the Sentinel-2 satellites designed for global land use and land cover (LULC) mapping. Sentinel2GlobalLULC v2.1 contains 194,877 images in GeoTiff and JPEG format corresponding to 29 broad LULC classes. Each image has 224 x 224 pixels at 10 m spatial resolution and was produced by assigning the 25th percentile of all available observations in the Sentinel-2 collection between June 2015 and October 2020 in order to remove atmospheric effects (i.e., clouds, aerosols, shadows, snow, etc.). A spatial purity value was assigned to each image based on the consensus across 15 different global LULC products available in Google Earth Engine (GEE).
Our dataset is structured into 3 main zip-compressed folders, an Excel file with a dictionary for class names and descriptive statistics per LULC class, and a python script to convert RGB GeoTiff images into JPEG format. The first folder called "Sentinel2LULC_GeoTiff.zip" contains 29 zip-compressed subfolders where each one corresponds to a specific LULC class with hundreds to thousands of GeoTiff Sentinel-2 RGB images. The second folder called "Sentinel2LULC_JPEG.zip" contains 29 zip-compressed subfolders with a JPEG formatted version of the same images provided in the first main folder. The third folder called "Sentinel2LULC_CSV.zip" includes 29 zip-compressed CSV files with as many rows as provided images and with 12 columns containing the following metadata (this same metadata is provided in the image filenames):
For seven LULC classes, we could not export from GEE all images that fulfilled a spatial purity of 100% since there were millions of them. In this case, we exported a stratified random sample of 14,000 images and provided an additional CSV file with the images actually contained in our dataset. That is, for these seven LULC classes, we provide these 2 CSV files:
To clearly state the geographical coverage of the images available in this dataset, we included in version v2.1 a compressed folder called "Geographic_Representativeness.zip". This zip-compressed folder contains a CSV file for each LULC class that provides the complete list of countries represented in that class. Each CSV file has two columns: the first gives the country code and the second gives the number of images provided for that country and LULC class. In addition to these 29 CSV files, we provide another CSV file that maps each ISO Alpha-2 country code to its original full country name.
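The dataset ships its own Python script for converting the RGB GeoTiffs to JPEG. Purely as an illustration of what that step involves, the sketch below performs a basic conversion with rasterio and Pillow; it makes the simplifying assumption that each image's three bands can be rescaled independently to 8-bit, which may differ from the official script's stretch.

import glob
import numpy as np
import rasterio
from PIL import Image

def geotiff_to_jpeg(tif_path, jpg_path):
    # Read the three RGB bands and move them to (height, width, band) order.
    with rasterio.open(tif_path) as src:
        rgb = src.read([1, 2, 3]).astype('float32')
    rgb = np.moveaxis(rgb, 0, -1)
    # Naive per-image rescaling to 0-255; the official script may use a fixed stretch.
    rgb = 255 * (rgb - rgb.min()) / max(rgb.max() - rgb.min(), 1e-6)
    Image.fromarray(rgb.astype('uint8')).save(jpg_path, quality=95)

# Walk the extracted GeoTiff folder (path assumed) and convert every image.
for tif in glob.glob('Sentinel2LULC_GeoTiff/**/*.tif', recursive=True):
    geotiff_to_jpeg(tif, tif.replace('.tif', '.jpg'))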
© Sentinel2GlobalLULC Dataset by Yassir Benhammou, Domingo Alcaraz-Segura, Emilio Guirado, Rohaifa Khaldi, Boujemâa Achchab, Francisco Herrera & Siham Tabik is marked with Attribution 4.0 International (CC-BY 4.0)
The Unpublished Digital Quaternary Geologic Map of Aniakchak National Monument and Preserve and Vicinity, Alaska is composed of GIS data layers and GIS tables in a 10.1 file geodatabase (asur_geology.gdb), a 10.1 ArcMap (.MXD) map document (asur_geology.mxd), individual 10.1 layer (.LYR) files for each GIS data layer, an ancillary map information (.PDF) document (ania_geology.pdf) which contains source map unit descriptions, as well as other source map text, figures and tables, metadata in FGDC text (.TXT) and FAQ (.HTML) formats, and a GIS readme file (asur_gis_readme.pdf). Please read the asur_gis_readme.pdf for information pertaining to the proper extraction of the file geodatabase and other map files. To request GIS data in ESRI 10.1 shapefile format contact Stephanie O'Meara (stephanie.omeara@colostate.edu; see contact information below). The data is also available as a 2.2 KMZ/KML file for use in Google Earth; however, this format version of the map is limited in data layers presented and in access to GRI ancillary table information. Google Earth software is available for free at: http://www.google.com/earth/index.html. Users are encouraged to only use the Google Earth data for basic visualization, and to use the GIS data for any type of data analysis or investigation. The data were completed as a component of the Geologic Resources Inventory (GRI) program, a National Park Service (NPS) Inventory and Monitoring (I&M) Division funded program that is administered by the NPS Geologic Resources Division (GRD). Source geologic maps and data used to complete this GRI digital dataset were provided by the following: U.S. Geological Survey. Detailed information concerning the sources used and their contribution to the GRI product are listed in the Source Citation section(s) of this metadata record (asur_metadata_faq.html; available at http://nrdata.nps.gov/geology/gri_data/gis/ania/asur_metadata_faq.html). Users of this data are cautioned about the locational accuracy of features within this dataset. Based on the source map scale of 1:250,000 and United States National Map Accuracy Standards, features are within (horizontally) 127 meters or 416.7 feet of their actual location as presented by this dataset. Users of this data should thus not assume the location of features is exactly where they are portrayed in Google Earth, ArcGIS or other software used to display this dataset. All GIS and ancillary tables were produced as per the NPS GRI Geology-GIS Geodatabase Data Model v. 2.2 (available at: http://science.nature.nps.gov/im/inventory/geology/GeologyGISDataModel.cfm). The GIS data projection is NAD83 Alaska Albers; however, for the KML/KMZ format the data is projected upon export to WGS84 Geographic, the native coordinate system used by Google Earth. The data is within the area of interest of Aniakchak National Monument and Preserve.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This tutorial will teach you how to take time-series data from many field sites and create a shareable online map, where clicking on a field location brings you to a page with interactive graph(s).
The tutorial can be completed with a sample dataset (provided via a Google Drive link within the document) or with your own time-series data from multiple field sites.
Part 1 covers how to make interactive graphs in Google Data Studio and Part 2 covers how to link data pages to an interactive map with ArcGIS Online. The tutorial will take 1-2 hours to complete.
An example interactive map and data portal can be found at: https://temple.maps.arcgis.com/apps/View/index.html?appid=a259e4ec88c94ddfbf3528dc8a5d77e8