Shoreline change analysis is an important environmental monitoring tool for evaluating coastal exposure to erosion hazards, particularly for vulnerable habitats such as coastal wetlands, where habitat loss is problematic worldwide. The increasing availability of high-resolution satellite imagery and emerging developments in analysis techniques support the incorporation of these data into coastal management, including shoreline monitoring and change analysis. Geospatial shoreline data were created from a semi-automated methodology using WorldView (WV) satellite data between 2013 and 2020. The data were compared to contemporaneous field-surveyed Real-time Kinematic (RTK) Global Positioning System (GPS) data collected by the Grand Bay National Estuarine Research Reserve (GBNERR) and to digitized shorelines from U.S. Department of Agriculture National Agriculture Imagery Program (NAIP) orthophotos. Field data for shoreline monitoring sites were also collected to aid interpretation of results. This data release contains digital vector shorelines, shoreline change calculations for all three remote sensing data sets, and field-surveyed data. The data will aid managers and decision-makers in the adoption of high-resolution satellite imagery into shoreline monitoring activities, which will increase the spatial scale of shoreline change monitoring, provide rapid response to evaluate impacts of coastal erosion, and reduce the cost of labor-intensive practices. For further information regarding data collection and/or processing methods, refer to the associated journal article (Smith and others, 2021).
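As a rough illustration of the kind of shoreline change calculation such a release can contain, the sketch below computes a simple end-point rate (net movement divided by elapsed time) along a single transect. The positions, dates, and sign convention are hypothetical and are not taken from the data release or the associated article.

```python
from datetime import date

def end_point_rate(pos_early_m, pos_late_m, date_early, date_late):
    """End-point rate in m/yr: net change in shoreline position along one
    transect divided by elapsed time. With positions measured seaward from a
    baseline, negative values indicate landward retreat (erosion)."""
    years = (date_late - date_early).days / 365.25
    return (pos_late_m - pos_early_m) / years

# Hypothetical transect values (not from the data release): shoreline position
# in metres from an arbitrary baseline at two survey dates.
rate = end_point_rate(52.4, 47.1, date(2013, 5, 1), date(2020, 9, 15))
print(f"End-point rate: {rate:.2f} m/yr")
```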
Winter climate change has the potential to have a large impact on coastal wetlands in the southeastern U.S. Warmer winter temperatures and reductions in the intensity of freeze events would likely lead to mangrove forest range expansion and salt marsh displacement in parts of the U.S. Gulf of Mexico and Atlantic coast. The objective of this research was to better understand some of the ecological implications of mangrove forest migration and salt marsh displacement. The potential ecological effects of mangrove migration are diverse, ranging from important biotic impacts (e.g., coastal fisheries, land bird migration, colonial nesting wading birds) to ecosystem stability (e.g., response to sea level rise and drought; habitat loss; coastal protection) to biogeochemical processes (e.g., carbon storage; water quality). In this research, our focus was on the impact of mangrove forest migration on coastal wetland soil processes and the consequent implications for coastal wetland responses to sea level rise, ecosystem resilience, and carbon storage. Our study specifically addressed the following questions: (1) How do ecological processes and ecosystem properties differ between salt marshes and mangrove forests; (2) As mangrove forests develop, how do their ecosystem properties change and how do these properties compare to salt marshes; (3) How do plant-soil interactions across mangrove forest structural gradients differ among three distinct locations that span the northern Gulf of Mexico; and (4) What are the implications of mangrove forest encroachment and development into salt marsh in terms of soil development, carbon and nitrogen storage, and soil strength? To address these questions, we utilized the salt marshes and natural mangrove forest structural gradients present at three distinct locations in the northern Gulf of Mexico: Cedar Key (Florida), Port Fourchon (Louisiana), and Port Aransas (Texas). Each of these locations represents a distinct combination of climate-driven abiotic conditions. We quantified relationships between plant community composition and structure, soil and porewater physicochemical properties, hydroperiod, and climatic conditions. The suite of measurements that we collected provides initial insights into how different geographic areas of an ecotone, with different environmental conditions, may be impacted by mangrove forest expansion and development, and how these changes may alter the supply of specific ecosystem goods and services. This file includes the site-level elevation data. This work was conducted via a collaborative effort between scientists at the U.S. Geological Survey National Wetlands Research Center and the Department of Biology of the University of Louisiana at Lafayette.
This dataset collection comprises several interconnected tables. Each table is a structured set of rows and columns, making it easy to locate information, and the tables are related to one another, together offering a comprehensive view of the subject matter. The data in this collection were sourced from the website of Lantmäteriet (The Land Survey) in Sweden, with each table providing a unique perspective on the overall topic.
We offer comprehensive data collection services that cater to a wide range of industries and applications. Whether you require image, audio, or text data, we have the expertise and resources to collect and deliver high-quality data that meets your specific requirements. Our data collection methods include manual collection, web scraping, and other automated techniques that ensure accuracy and completeness of data.
Our team of experienced data collectors and quality assurance professionals ensures that the data is collected and processed according to the highest standards of quality. We also take great care to ensure that the data we collect is relevant and applicable to your use case. This means that you can rely on us to provide you with clean and useful data that can be used to train machine learning models, improve business processes, or conduct research.
We are committed to delivering data in the format that you require. Whether you need raw data or a processed dataset, we can deliver the data in your preferred format, including CSV, JSON, or XML. We understand that every project is unique, and we work closely with our clients to ensure that we deliver the data that meets their specific needs. So if you need reliable data collection services for your next project, look no further than us.
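As a minimal sketch of what delivery in a preferred format can look like in practice, the snippet below writes the same hypothetical records to CSV, JSON, and XML using only the Python standard library; the field names are placeholders, not a description of any particular delivered dataset.

```python
import csv
import json
import xml.etree.ElementTree as ET

# Hypothetical records standing in for a delivered dataset; field names are placeholders.
records = [
    {"id": 1, "label": "sample-a", "value": 3.2},
    {"id": 2, "label": "sample-b", "value": 4.7},
]

# CSV delivery
with open("delivery.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)

# JSON delivery
with open("delivery.json", "w") as f:
    json.dump(records, f, indent=2)

# XML delivery
root = ET.Element("records")
for rec in records:
    item = ET.SubElement(root, "record")
    for key, value in rec.items():
        ET.SubElement(item, key).text = str(value)
ET.ElementTree(root).write("delivery.xml")
```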
According to a 2020 survey, internet users in the United Kingdom (UK) had the same level of distrust (74 percent) towards both social media companies and advertisers when it came to customer data collection. There was a slightly lower level of distrust when it came to search engines, at 71 percent.
This dataset contains information on the prices and fees charged by for-hire fishing operations in the Southeastern US.
DISCOVERAQ_Colorado_Ground_Platteville_Data contains data collected at the Platteville ground site during the Colorado (Denver) deployment of NASA's DISCOVER-AQ field study. This data product contains data for only the Colorado deployment and data collection is complete. Understanding the factors that contribute to near surface pollution is difficult using only satellite-based observations. The incorporation of surface-level measurements from aircraft and ground-based platforms provides the crucial information necessary to validate and expand upon the use of satellites in understanding near surface pollution. Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) was a four-year campaign conducted in collaboration between NASA Langley Research Center, NASA Goddard Space Flight Center, NASA Ames Research Center, and multiple universities to improve the use of satellites to monitor air quality for public health and environmental benefit. Through targeted airborne and ground-based observations, DISCOVER-AQ enabled more effective use of current and future satellites to diagnose ground level conditions influencing air quality. DISCOVER-AQ employed two NASA aircraft, the P-3B and King Air, with the P-3B completing in-situ spiral profiling of the atmosphere (aerosol properties, meteorological variables, and trace gas species). The King Air conducted both passive and active remote sensing of the atmospheric column extending below the aircraft to the surface. Data from an existing network of surface air quality monitors, AERONET sun photometers, Pandora UV/vis spectrometers and model simulations were also collected. Further, DISCOVER-AQ employed many surface monitoring sites, with measurements being made on the ground, in conjunction with the aircraft. The B200 and P-3B conducted flights in Baltimore-Washington, D.C. in 2011, Houston, TX in 2013, San Joaquin Valley, CA in 2013, and Denver, CO in 2014. These regions were targeted due to being in violation of the National Ambient Air Quality Standards (NAAQS). The first objective of DISCOVER-AQ was to determine and investigate correlations between surface measurements and satellite column observations for the trace gases ozone (O3), nitrogen dioxide (NO2), and formaldehyde (CH2O) to understand how satellite column observations can diagnose surface conditions. DISCOVER-AQ also had the objective of using surface-level measurements to understand how satellites measure diurnal variability and to understand what factors control diurnal variability. Lastly, DISCOVER-AQ aimed to explore horizontal scales of variability, such as regions with steep gradients and urban plumes.
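To make the campaign's first objective concrete, the sketch below computes a Pearson correlation between paired surface and column NO2 values. The numbers are synthetic placeholders, not DISCOVER-AQ measurements; a real analysis would pair coincident surface monitor readings and column retrievals from the archived data files.

```python
import numpy as np

# Synthetic placeholder values standing in for paired observations.
surface_no2_ppbv = np.array([8.1, 12.4, 15.0, 9.7, 20.3, 17.8])
column_no2_molec_cm2 = np.array([3.2e15, 5.1e15, 6.0e15, 3.9e15, 8.4e15, 7.0e15])

# Pearson correlation between surface concentrations and column amounts.
r = np.corrcoef(surface_no2_ppbv, column_no2_molec_cm2)[0, 1]
print(f"Pearson r between surface and column NO2: {r:.2f}")
```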
Altosight | AI Custom Web Scraping Data
✦ Altosight provides global web scraping data services with AI-powered technology that bypasses CAPTCHAs and blocking mechanisms and handles dynamic content.
We extract data from marketplaces like Amazon, aggregators, e-commerce, and real estate websites, ensuring comprehensive and accurate results.
✦ Our solution offers free unlimited data points across any project, with no additional setup costs.
We deliver data through flexible methods such as API, CSV, JSON, and FTP, all at no extra charge.
― Key Use Cases ―
➤ Price Monitoring & Repricing Solutions
🔹 Automatic repricing, AI-driven repricing, and custom repricing rules 🔹 Receive price suggestions via API or CSV to stay competitive 🔹 Track competitors in real-time or at scheduled intervals
➤ E-commerce Optimization
🔹 Extract product prices, reviews, ratings, images, and trends 🔹 Identify trending products and enhance your e-commerce strategy 🔹 Build dropshipping tools or marketplace optimization platforms with our data
➤ Product Assortment Analysis
🔹 Extract the entire product catalog from competitor websites 🔹 Analyze product assortment to refine your own offerings and identify gaps 🔹 Understand competitor strategies and optimize your product lineup
➤ Marketplaces & Aggregators
🔹 Crawl entire product categories and track best-sellers 🔹 Monitor position changes across categories 🔹 Identify which eRetailers sell specific brands and which SKUs for better market analysis
➤ Business Website Data
🔹 Extract detailed company profiles, including financial statements, key personnel, industry reports, and market trends, enabling in-depth competitor and market analysis
🔹 Collect customer reviews and ratings from business websites to analyze brand sentiment and product performance, helping businesses refine their strategies
➤ Domain Name Data
🔹 Access comprehensive data, including domain registration details, ownership information, expiration dates, and contact information. Ideal for market research, brand monitoring, lead generation, and cybersecurity efforts
➤ Real Estate Data
🔹 Access property listings, prices, and availability 🔹 Analyze trends and opportunities for investment or sales strategies
― Data Collection & Quality ―
► Publicly Sourced Data: Altosight collects web scraping data from publicly available websites, online platforms, and industry-specific aggregators
► AI-Powered Scraping: Our technology handles dynamic content, JavaScript-heavy sites, and pagination, ensuring complete data extraction
► High Data Quality: We clean and structure unstructured data, ensuring it is reliable, accurate, and delivered in formats such as API, CSV, JSON, and more
► Industry Coverage: We serve industries including e-commerce, real estate, travel, finance, and more. Our solution supports use cases like market research, competitive analysis, and business intelligence
► Bulk Data Extraction: We support large-scale data extraction from multiple websites, allowing you to gather millions of data points across industries in a single project
► Scalable Infrastructure: Our platform is built to scale with your needs, allowing seamless extraction for projects of any size, from small pilot projects to ongoing, large-scale data extraction
― Why Choose Altosight? ―
✔ Unlimited Data Points: Altosight offers unlimited free attributes, meaning you can extract as many data points from a page as you need without extra charges
✔ Proprietary Anti-Blocking Technology: Altosight utilizes proprietary techniques to bypass blocking mechanisms, including CAPTCHAs, Cloudflare, and other obstacles. This ensures uninterrupted access to data, no matter how complex the target websites are
✔ Flexible Across Industries: Our crawlers easily adapt across industries, including e-commerce, real estate, finance, and more. We offer customized data solutions tailored to specific needs
✔ GDPR & CCPA Compliance: Your data is handled securely and ethically, ensuring compliance with GDPR, CCPA and other regulations
✔ No Setup or Infrastructure Costs: Start scraping without worrying about additional costs. We provide a hassle-free experience with fast project deployment
✔ Free Data Delivery Methods: Receive your data via API, CSV, JSON, or FTP at no extra charge. We ensure seamless integration with your systems
✔ Fast Support: Our team is always available via phone and email, resolving over 90% of support tickets within the same day
― Custom Projects & Real-Time Data ―
✦ Tailored Solutions: Every business has unique needs, which is why Altosight offers custom data projects. Contact us for a feasibility analysis, and we’ll design a solution that fits your goals
✦ Real-Time Data: Whether you need real-time data delivery or scheduled updates, we provide the flexibility to receive data when you need it. Track price changes, monitor product trends, or gather...
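As an illustration of how a delivered price-monitoring feed (for example, the CSV delivery described in the listing above) might be consumed, the sketch below applies a simple undercut-the-cheapest-competitor rule with a price floor. The file name, column names, and repricing rule are hypothetical and are not part of Altosight's service.

```python
import csv

def suggest_price(current_price, competitor_prices, floor_price, undercut=0.01):
    """Suggest a price just below the cheapest competitor, never below the floor."""
    if not competitor_prices:
        return current_price
    return max(floor_price, min(competitor_prices) - undercut)

# Hypothetical delivered file with columns: sku, my_price, competitor_price, floor_price.
by_sku = {}
with open("price_feed.csv", newline="") as f:
    for row in csv.DictReader(f):
        entry = by_sku.setdefault(row["sku"], {
            "my_price": float(row["my_price"]),
            "floor": float(row["floor_price"]),
            "competitors": [],
        })
        entry["competitors"].append(float(row["competitor_price"]))

for sku, entry in by_sku.items():
    print(sku, suggest_price(entry["my_price"], entry["competitors"], entry["floor"]))
```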
DISCOVERAQ_Maryland_Ground_Edgewood_Data contains data collected at the Edgewood ground site during the Maryland (Baltimore-Washington) deployment of NASA's DISCOVER-AQ field study. This data product contains data for only the Maryland deployment and data collection is complete. Understanding the factors that contribute to near surface pollution is difficult using only satellite-based observations. The incorporation of surface-level measurements from aircraft and ground-based platforms provides the crucial information necessary to validate and expand upon the use of satellites in understanding near surface pollution. Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) was a four-year campaign conducted in collaboration between NASA Langley Research Center, NASA Goddard Space Flight Center, NASA Ames Research Center, and multiple universities to improve the use of satellites to monitor air quality for public health and environmental benefit. Through targeted airborne and ground-based observations, DISCOVER-AQ enabled more effective use of current and future satellites to diagnose ground level conditions influencing air quality. DISCOVER-AQ employed two NASA aircraft, the P-3B and King Air, with the P-3B completing in-situ spiral profiling of the atmosphere (aerosol properties, meteorological variables, and trace gas species). The King Air conducted both passive and active remote sensing of the atmospheric column extending below the aircraft to the surface. Data from an existing network of surface air quality monitors, AERONET sun photometers, Pandora UV/vis spectrometers and model simulations were also collected. Further, DISCOVER-AQ employed many surface monitoring sites, with measurements being made on the ground, in conjunction with the aircraft. The B200 and P-3B conducted flights in Baltimore-Washington, D.C. in 2011, Houston, TX in 2013, San Joaquin Valley, CA in 2013, and Denver, CO in 2014. These regions were targeted due to being in violation of the National Ambient Air Quality Standards (NAAQS). The first objective of DISCOVER-AQ was to determine and investigate correlations between surface measurements and satellite column observations for the trace gases ozone (O3), nitrogen dioxide (NO2), and formaldehyde (CH2O) to understand how satellite column observations can diagnose surface conditions. DISCOVER-AQ also had the objective of using surface-level measurements to understand how satellites measure diurnal variability and to understand what factors control diurnal variability. Lastly, DISCOVER-AQ aimed to explore horizontal scales of variability, such as regions with steep gradients and urban plumes.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Characteristics of data collection, abstraction, and management at audit sites A–G.
DISCOVERAQ_Maryland_Ground_Essex_Data contains data collected at the Essex ground site during the Maryland (Baltimore-Washington) deployment of NASA's DISCOVER-AQ field study. This data product contains data for only the Maryland deployment and data collection is complete. Understanding the factors that contribute to near surface pollution is difficult using only satellite-based observations. The incorporation of surface-level measurements from aircraft and ground-based platforms provides the crucial information necessary to validate and expand upon the use of satellites in understanding near surface pollution. Deriving Information on Surface conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) was a four-year campaign conducted in collaboration between NASA Langley Research Center, NASA Goddard Space Flight Center, NASA Ames Research Center, and multiple universities to improve the use of satellites to monitor air quality for public health and environmental benefit. Through targeted airborne and ground-based observations, DISCOVER-AQ enabled more effective use of current and future satellites to diagnose ground level conditions influencing air quality. DISCOVER-AQ employed two NASA aircraft, the P-3B and King Air, with the P-3B completing in-situ spiral profiling of the atmosphere (aerosol properties, meteorological variables, and trace gas species). The King Air conducted both passive and active remote sensing of the atmospheric column extending below the aircraft to the surface. Data from an existing network of surface air quality monitors, AERONET sun photometers, Pandora UV/vis spectrometers and model simulations were also collected. Further, DISCOVER-AQ employed many surface monitoring sites, with measurements being made on the ground, in conjunction with the aircraft. The B200 and P-3B conducted flights in Baltimore-Washington, D.C. in 2011, Houston, TX in 2013, San Joaquin Valley, CA in 2013, and Denver, CO in 2014. These regions were targeted due to being in violation of the National Ambient Air Quality Standards (NAAQS). The first objective of DISCOVER-AQ was to determine and investigate correlations between surface measurements and satellite column observations for the trace gases ozone (O3), nitrogen dioxide (NO2), and formaldehyde (CH2O) to understand how satellite column observations can diagnose surface conditions. DISCOVER-AQ also had the objective of using surface-level measurements to understand how satellites measure diurnal variability and to understand what factors control diurnal variability. Lastly, DISCOVER-AQ aimed to explore horizontal scales of variability, such as regions with steep gradients and urban plumes.
This dataset consists of Pennsylvania bridge over water locations that are being considered for flood data collection.
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
Atmospheric data collection stations layer created from water management information system (WMIS) sites data. This service is for the Open Data Download application for the Southwest Florida Water Management District.
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
The Antarctic Site Inventory Project has collected biological data and site-descriptive information in the Antarctic Peninsula region since 1994. This research effort has provided data on those sites which are visited by tourists on shipboard expeditions in the region. The aim is to obtain data on the population status of several key species of Antarctic seabirds, which might be affected by the cumulative impact resulting from visits to the sites. This project will continue the effort by focusing on two heavily-visited Antarctic Peninsula sites: Paulet Island, in the northwestern Weddell Sea, and Petermann Island, in the Lemaire Channel near Anvers Island. These sites were selected because both rank among the ten most visited sites in Antarctica each year in terms of numbers of visitors and zodiac landings; both are diverse in species composition, and both are sensitive to potential environmental disruptions from visitors. The data collected focus on two important biological parameters for penguins and blue-eyed shags: (1) breeding population size (number of occupied nests) and (2) breeding success (number of chicks per occupied nest). A long-term data program will be supported, with studies at the two sites over a five-year period. The main focus will be at Petermann Island, selected for intensive study due to its visitor status and location in the region near Palmer Station. This will allow for comparative data with the Palmer Long Term Ecological Research program. Demographic data will be collected in accordance with Standard Methods established by the Convention for the Conservation of Antarctic Marine Living Resources Ecosystem Monitoring Program and thus will be comparable with similar data sets being collected by other international Antarctic Treaty nation research programs. While separating human-induced change from change resulting from a combination of environmental factors will be difficult, this work will provide a first step to identify potential impacts. These long-term data sets will contribute to a better understanding of biological processes in the entire region and will contribute valuable information to be used by the Antarctic Treaty Parties as they address issues in environmental stewardship in Antarctica.
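As a small worked example of the second parameter, breeding success is simply the number of chicks counted divided by the number of occupied nests; the counts below are hypothetical, not inventory data.

```python
def breeding_success(chicks_counted, occupied_nests):
    """Breeding success as defined above: chicks per occupied nest."""
    return chicks_counted / occupied_nests

# Hypothetical colony census counts, not Antarctic Site Inventory data.
print(round(breeding_success(chicks_counted=1480, occupied_nests=1120), 2))  # 1.32
```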
This dataset contains air temperature, relative humidity, precipitation, solar radiation, wind speed, soil temperature, and soil moisture data from the Soil Climate Analysis Network (SCAN) site 2026, "Walnut Gulch #1," located in Cochise County, Arizona. The dataset links to a Natural Resources Conservation Service data request form, from which available data can be queried. The data collection site is at an elevation of 4500 feet; data has been continuously collected there since 1999-03-19. Resources in this dataset: Resource Title: GeoData catalog record. File Name: Web Page, url: https://geodata.nal.usda.gov/geonetwork/srv/eng/catalog.search#/metadata/WalnutGulch1_eaa_2015_February_23_023
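If the queried data were exported to a CSV file, a simple daily summary might look like the sketch below; the file name and column names are placeholders, since the actual export format comes from the NRCS request form.

```python
import pandas as pd

# Hypothetical export from the NRCS data request form; the file name and
# column names are placeholders and will differ in an actual download.
df = pd.read_csv("scan_2026_walnut_gulch.csv", parse_dates=["date"])

# Daily means of two of the measured variables.
daily = (
    df.set_index("date")[["air_temp_c", "soil_moisture_pct"]]
      .resample("D")
      .mean()
)
print(daily.head())
```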
The interactive map displays the water quality data collection sites in the Merrimack River Basin, Massachusetts.
Observer Program web page that lists the observer field manual and all current data collection forms that observers are required to take out to sea.
Unlock the Potential of Your Web Traffic with Advanced Data Resolution
In the digital age, understanding and leveraging web traffic data is crucial for businesses aiming to thrive online. Our pioneering solution transforms anonymous website visits into valuable B2B and B2C contact data, offering unprecedented insights into your digital audience. By integrating our unique tag into your website, you unlock the capability to convert 25-50% of your anonymous traffic into actionable contact rows, directly deposited into an S3 bucket for your convenience. This process, known as "Web Traffic Data Resolution," is at the forefront of digital marketing and sales strategies, providing a competitive edge in understanding and engaging with your online visitors.
Comprehensive Web Traffic Data Resolution
Our product stands out by offering a robust solution for "Web Traffic Data Resolution," a process that demystifies the identities behind your website traffic. By deploying a simple tag on your site, our technology goes to work, analyzing visitor behavior and leveraging proprietary data matching techniques to reveal the individuals and businesses behind the clicks. This innovative approach not only enhances your data collection but does so with respect for privacy and compliance standards, ensuring that your business gains insights ethically and responsibly.
Deep Dive into Web Traffic Data
At the core of our solution is the sophisticated analysis of "Web Traffic Data." Our system meticulously collects and processes every interaction on your site, from page views to time spent on each section. This data, once anonymous and perhaps seen as abstract numbers, is transformed into a detailed ledger of potential leads and customer insights. By understanding who visits your site, their interests, and their contact information, your business is equipped to tailor marketing efforts, personalize customer experiences, and streamline sales processes like never before.
Benefits of Our Web Traffic Data Resolution Service
Enhanced Lead Generation: By converting anonymous visitors into identifiable contact data, our service significantly expands your pool of potential leads. This direct enhancement of your lead generation efforts can dramatically increase conversion rates and ROI on marketing campaigns.
Targeted Marketing Campaigns: Armed with detailed B2B and B2C contact data, your marketing team can create highly targeted and personalized campaigns. This precision in marketing not only improves engagement rates but also ensures that your messaging resonates with the intended audience.
Improved Customer Insights: Gaining a deeper understanding of your web traffic enables your business to refine customer personas and tailor offerings to meet market demands. These insights are invaluable for product development, customer service improvement, and strategic planning.
Competitive Advantage: In a digital landscape where understanding your audience can make or break your business, our Web Traffic Data Resolution service provides a significant competitive edge. By accessing detailed contact data that others in your industry may overlook, you position your business as a leader in customer engagement and data-driven strategies.
Seamless Integration and Accessibility: Our solution is designed for ease of use, requiring only the placement of a tag on your website to start gathering data. The contact rows generated are easily accessible in an S3 bucket, ensuring that you can integrate this data with your existing CRM systems and marketing tools without hassle.
How It Works: A Closer Look at the Process
Our Web Traffic Data Resolution process is streamlined and user-friendly, designed to integrate seamlessly with your existing website infrastructure:
Tag Deployment: Implement our unique tag on your website with simple instructions. This tag is lightweight and does not impact your site's loading speed or user experience.
Data Collection and Analysis: As visitors navigate your site, our system collects web traffic data in real-time, analyzing behavior patterns, engagement metrics, and more.
Resolution and Transformation: Using advanced data matching algorithms, we resolve the collected web traffic data into identifiable B2B and B2C contact information.
Data Delivery: The resolved contact data is then securely transferred to an S3 bucket, where it is organized and ready for your access. This process occurs daily, ensuring you have the most up-to-date information at your fingertips.
Integration and Action: With the resolved data now in your possession, your business can take immediate action. From refining marketing strategies to enhancing customer experiences, the possibilities are endless.
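For the data-delivery step above, a minimal retrieval sketch using boto3 might look like the following; the bucket name, prefix, and column names are hypothetical, and the actual delivery location and schema would be provided with the service.

```python
import csv
import io

import boto3

# Hypothetical bucket, prefix, and column names; replace with the delivery
# details supplied by the vendor.
BUCKET = "example-webtraffic-delivery"
PREFIX = "resolved-contacts/2024-06-01/"

s3 = boto3.client("s3")
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)

for obj in listing.get("Contents", []):
    body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
    # Assume each delivered object is a CSV of resolved contact rows.
    for row in csv.DictReader(io.StringIO(body.decode("utf-8"))):
        print(row.get("company"), row.get("email"))
```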
Security and Privacy: Our Commitment
Understanding the sensitivity of web traffic data and contact information, our solution is built with security and privacy at its core. We adhere to strict data protection regulat...
https://www.icpsr.umich.edu/web/ICPSR/studies/4288/terms
This collection contains survey data collected at the end of October 2004 from the 49 state law enforcement agencies in the United States that had traffic patrol responsibility. Information was gathered about their policies for recording race and ethnicity data for persons in traffic stops, including the circumstances under which demographic data should be collected for traffic-related stops and whether such information should be stored in an electronically accessible format. The survey was not designed to obtain available agency databases containing traffic stop records.
POLARIS_Ground_Data is the ground site data collected during the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) campaign. Data from the Composition and Photo-Dissociative Flux Measurement (CPFM) are featured in this collection. Data collection for this product is complete. The POLARIS mission was a joint effort of NASA and NOAA that occurred in 1997 and was designed to expand on the photochemical and transport processes that cause the summer polar decreases in the stratospheric ozone. The POLARIS campaign had the overarching goal of better understanding the change of stratospheric ozone levels from very high concentrations in the spring to very low concentrations in the autumn. The NASA ER-2 high-altitude aircraft was the primary platform deployed along with balloons, satellites, and ground-sites. The POLARIS campaign was based in Fairbanks, Alaska with some flights being conducted from California and Hawaii. Flights were conducted between the summer solstice and fall equinox at mid- to high latitudes. The data collected included meteorological variables; long-lived tracers in reference to summertime transport questions; select species with reactive nitrogen (NOy), halogen (Cly), and hydrogen (HOx) reservoirs; and aerosols. More specifically, the ER-2 utilized various techniques/instruments including Laser Absorption, Gas Chromatography, Non-dispersive IR, UV Photometry, Catalysis, and IR Absorption. These techniques/instruments were used to collect data including N2O, CH4, CH3CCl3, CO2, O3, H2O, and NOy. Ground stations were responsible for collecting SO2 and O3, while balloons recorded pressure, temperature, wind speed, and wind directions. Satellites partnered with these platforms collected meteorological data and Lidar imagery. The observations were used to constrain stratospheric computer models to evaluate ozone changes due to chemistry and transport.