47 datasets found
  1. Websites using Track The Click

    • webtechsurvey.com
    csv
    Cite
    WebTechSurvey, Websites using Track The Click [Dataset]. https://webtechsurvey.com/technology/track-the-click
    Explore at:
    Available download formats: csv
    Dataset authored and provided by
    WebTechSurvey
    License

    https://webtechsurvey.com/terms

    Time period covered
    2025
    Area covered
    Global
    Description

    A complete list of live websites using the Track The Click technology, compiled through global website indexing conducted by WebTechSurvey.

  2. Leading websites which track users' information in Norway 2019, by cookies

    • statista.com
    Updated May 15, 2019
    Cite
    Statista (2019). Leading websites which track users' information in Norway 2019, by cookies [Dataset]. https://www.statista.com/statistics/1030325/leading-websites-which-track-users-information-in-norway-by-cookies/
    Explore at:
    Dataset updated
    May 15, 2019
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    2019
    Area covered
    Norway
    Description

    Reisenett.no led the 2019 ranking of websites tracking users' information in Norway by number of cookies, with 540 cookies, followed by expedia.no with 338 cookies.

  3. Web Scraping Data | Key Customers Domain Name Data | Scanning Logos found on Websites | 222M+ Records

    • datarade.ai
    .json
    Updated Jun 27, 2024
    Cite
    PredictLeads (2024). Web Scraping Data | Key Customers Domain Name Data | Scanning Logos found on Websites | 222M+ Records [Dataset]. https://datarade.ai/data-products/predictleads-web-scraping-data-domain-name-data-business-predictleads
    Explore at:
    Available download formats: .json
    Dataset updated
    Jun 27, 2024
    Dataset authored and provided by
    PredictLeads
    Area covered
    Curaçao, Oman, Benin, Nigeria, Svalbard and Jan Mayen, Malaysia, Colombia, Turkmenistan, Northern Mariana Islands, Burkina Faso
    Description

    PredictLeads Key Customers Data provides essential business intelligence by analyzing company relationships, uncovering vendor partnerships, client connections, and strategic affiliations through advanced web scraping and logo recognition. This dataset captures business interactions directly from company websites, offering valuable insights into market positioning, competitive landscapes, and growth opportunities.

    Use Cases:

    ✅ Account Profiling – Gain a 360-degree customer view by mapping company relationships and partnerships.
    ✅ Competitive Intelligence – Track vendor-client connections and business affiliations to identify key industry players.
    ✅ B2B Lead Targeting – Prioritize leads based on their business relationships, improving sales and marketing efficiency.
    ✅ CRM Data Enrichment – Enhance company records with detailed key customer data, ensuring data accuracy.
    ✅ Market Research – Identify emerging trends and industry networks to optimize strategic planning.

    Key API Attributes:

    • id (string, UUID) – Unique identifier for the company connection.
    • category (string) – Type of relationship (e.g., vendor, client, partner).
    • source_category (string) – Where the connection was detected (e.g., partner page, case study).
    • source_url (string, URL) – Website where the relationship was found.
    • individual_source_url (string, URL) – Specific page confirming the connection.
    • context (string) – Extracted description of the business relationship (e.g., "Company X - partners with Company Y to enhance payment processing").
    • first_seen_at (ISO 8601 date-time) – Date the connection was first detected.
    • last_seen_at (ISO 8601 date-time) – Most recent confirmation of the relationship.
    • company1 & company2 (objects) – Details of the two connected companies, including:
    • - domain (string) – Company website domain.
    • - company_name (string) – Official company name.
    • - ticker (string, nullable) – Stock ticker, if available.

    📌 PredictLeads Key Customers Data is an indispensable tool for B2B sales, marketing, and market intelligence teams, providing actionable relationship insights to drive targeted outreach, competitor tracking, and strategic decision-making.

    API Example: https://docs.predictleads.com/v3/guide/connections_dataset/data_model
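
    As an illustration only, the attribute list above maps naturally onto a couple of small Python dataclasses; the parser below is a sketch based on that list (field names come from the list, the helper and value handling are assumptions), not PredictLeads' own client code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Company:
    domain: str                   # company website domain
    company_name: str             # official company name
    ticker: Optional[str] = None  # stock ticker, if available


@dataclass
class Connection:
    id: str                 # UUID of the company connection
    category: str           # e.g. "vendor", "client", "partner"
    source_category: str    # e.g. "partner page", "case study"
    source_url: str
    individual_source_url: str
    context: str            # extracted description of the relationship
    first_seen_at: datetime
    last_seen_at: datetime
    company1: Company
    company2: Company


def _iso(s: str) -> datetime:
    # Tolerate a trailing "Z" that fromisoformat (pre-3.11) does not accept.
    return datetime.fromisoformat(s.replace("Z", "+00:00"))


def parse_connection(record: dict) -> Connection:
    """Map one JSON record shaped like the attribute list above onto the dataclasses."""
    return Connection(
        id=record["id"],
        category=record["category"],
        source_category=record["source_category"],
        source_url=record["source_url"],
        individual_source_url=record["individual_source_url"],
        context=record["context"],
        first_seen_at=_iso(record["first_seen_at"]),
        last_seen_at=_iso(record["last_seen_at"]),
        company1=Company(**record["company1"]),
        company2=Company(**record["company2"]),
    )
```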

  4. Top Visited Websites

    • kaggle.com
    Updated Nov 19, 2022
    Cite
    The Devastator (2022). Top Visited Websites [Dataset]. https://www.kaggle.com/datasets/thedevastator/the-top-websites-in-the-world/discussion
    Explore at:
    Available download formats: Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Nov 19, 2022
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    The Devastator
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    The Top Websites in the World

    How They Change Over Time

    About this dataset

    This dataset consists of the top 50 most visited websites in the world, as well as the category and principal country/territory for each site. The data provides insights into which sites are most popular globally, and what type of content is most popular in different parts of the world.

    How to use the dataset

    This dataset can be used to track the most popular websites in the world over time. It can also be used to compare website popularity between different countries and categories.

    Research Ideas

    • To track the most popular websites in the world over time
    • To see how website popularity changes by region
    • To find out which website categories are most popular

    Acknowledgements

    Dataset by Alexa Internet, Inc. (2019), released on Kaggle under the Open Data Commons Public Domain Dedication and License (ODC-PDDL)

    License

    License: CC0 1.0 Universal (CC0 1.0) - Public Domain Dedication No Copyright - You can copy, modify, distribute and perform the work, even for commercial purposes, all without asking permission. See Other Information.

    Columns

    File: df_1.csv

    • Site – The name of the website. (String)
    • Domain Name – The domain name of the website. (String)
    • Category – The category of the website. (String)
    • Principal country/territory – The principal country/territory where the website is based. (String)
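
    A minimal pandas sketch of the comparisons described above, assuming df_1.csv has been downloaded from the dataset page; the column names are those listed above.

```python
import pandas as pd

# Load the file documented above (downloaded from the dataset page).
df = pd.read_csv("df_1.csv")

# Which categories dominate the top-50 list?
print(df["Category"].value_counts())

# Where are the top sites based?
print(df["Principal country/territory"].value_counts())

# Category mix by principal country/territory.
print(pd.crosstab(df["Principal country/territory"], df["Category"]))
```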

  5. Altosight | AI Custom Web Scraping Data | 100% Global | Free Unlimited Data Points | Bypassing All CAPTCHAs & Blocking Mechanisms | GDPR Compliant

    • datarade.ai
    .json, .csv, .xls
    Updated Sep 7, 2024
    Cite
    Altosight (2024). Altosight | AI Custom Web Scraping Data | 100% Global | Free Unlimited Data Points | Bypassing All CAPTCHAs & Blocking Mechanisms | GDPR Compliant [Dataset]. https://datarade.ai/data-products/altosight-ai-custom-web-scraping-data-100-global-free-altosight
    Explore at:
    Available download formats: .json, .csv, .xls
    Dataset updated
    Sep 7, 2024
    Dataset authored and provided by
    Altosight
    Area covered
    Czech Republic, Svalbard and Jan Mayen, Guatemala, Tajikistan, Chile, Paraguay, Wallis and Futuna, Singapore, Côte d'Ivoire, Greenland
    Description

    Altosight | AI Custom Web Scraping Data

    ✦ Altosight provides global web scraping data services with AI-powered technology that bypasses CAPTCHAs, blocking mechanisms, and handles dynamic content.

    We extract data from marketplaces like Amazon, aggregators, e-commerce, and real estate websites, ensuring comprehensive and accurate results.

    ✦ Our solution offers free unlimited data points across any project, with no additional setup costs.

    We deliver data through flexible methods such as API, CSV, JSON, and FTP, all at no extra charge.

    ― Key Use Cases ―

    ➤ Price Monitoring & Repricing Solutions

    🔹 Automatic repricing, AI-driven repricing, and custom repricing rules
    🔹 Receive price suggestions via API or CSV to stay competitive
    🔹 Track competitors in real-time or at scheduled intervals

    ➤ E-commerce Optimization

    🔹 Extract product prices, reviews, ratings, images, and trends
    🔹 Identify trending products and enhance your e-commerce strategy
    🔹 Build dropshipping tools or marketplace optimization platforms with our data

    ➤ Product Assortment Analysis

    🔹 Extract the entire product catalog from competitor websites
    🔹 Analyze product assortment to refine your own offerings and identify gaps
    🔹 Understand competitor strategies and optimize your product lineup

    ➤ Marketplaces & Aggregators

    🔹 Crawl entire product categories and track best-sellers
    🔹 Monitor position changes across categories
    🔹 Identify which eRetailers sell specific brands and which SKUs for better market analysis

    ➤ Business Website Data

    🔹 Extract detailed company profiles, including financial statements, key personnel, industry reports, and market trends, enabling in-depth competitor and market analysis

    🔹 Collect customer reviews and ratings from business websites to analyze brand sentiment and product performance, helping businesses refine their strategies

    ➤ Domain Name Data

    🔹 Access comprehensive data, including domain registration details, ownership information, expiration dates, and contact information. Ideal for market research, brand monitoring, lead generation, and cybersecurity efforts

    ➤ Real Estate Data

    🔹 Access property listings, prices, and availability
    🔹 Analyze trends and opportunities for investment or sales strategies

    ― Data Collection & Quality ―

    ► Publicly Sourced Data: Altosight collects web scraping data from publicly available websites, online platforms, and industry-specific aggregators

    ► AI-Powered Scraping: Our technology handles dynamic content, JavaScript-heavy sites, and pagination, ensuring complete data extraction

    ► High Data Quality: We clean and structure unstructured data, ensuring it is reliable, accurate, and delivered in formats such as API, CSV, JSON, and more

    ► Industry Coverage: We serve industries including e-commerce, real estate, travel, finance, and more. Our solution supports use cases like market research, competitive analysis, and business intelligence

    ► Bulk Data Extraction: We support large-scale data extraction from multiple websites, allowing you to gather millions of data points across industries in a single project

    ► Scalable Infrastructure: Our platform is built to scale with your needs, allowing seamless extraction for projects of any size, from small pilot projects to ongoing, large-scale data extraction

    ― Why Choose Altosight? ―

    ✔ Unlimited Data Points: Altosight offers unlimited free attributes, meaning you can extract as many data points from a page as you need without extra charges

    ✔ Proprietary Anti-Blocking Technology: Altosight utilizes proprietary techniques to bypass blocking mechanisms, including CAPTCHAs, Cloudflare, and other obstacles. This ensures uninterrupted access to data, no matter how complex the target websites are

    ✔ Flexible Across Industries: Our crawlers easily adapt across industries, including e-commerce, real estate, finance, and more. We offer customized data solutions tailored to specific needs

    ✔ GDPR & CCPA Compliance: Your data is handled securely and ethically, ensuring compliance with GDPR, CCPA and other regulations

    ✔ No Setup or Infrastructure Costs: Start scraping without worrying about additional costs. We provide a hassle-free experience with fast project deployment

    ✔ Free Data Delivery Methods: Receive your data via API, CSV, JSON, or FTP at no extra charge. We ensure seamless integration with your systems

    ✔ Fast Support: Our team is always available via phone and email, resolving over 90% of support tickets within the same day

    ― Custom Projects & Real-Time Data ―

    ✦ Tailored Solutions: Every business has unique needs, which is why Altosight offers custom data projects. Contact us for a feasibility analysis, and we’ll design a solution that fits your goals

    ✦ Real-Time Data: Whether you need real-time data delivery or scheduled updates, we provide the flexibility to receive data when you need it. Track price changes, monitor product trends, or gather...

  6. Website Screenshot Software Market Report | Global Forecast From 2025 To 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). Website Screenshot Software Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-website-screenshot-software-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Website Screenshot Software Market Outlook



    The global website screenshot software market size was valued at USD 250 million in 2023 and is projected to reach USD 550 million by 2032, growing at a compound annual growth rate (CAGR) of 9.1% from 2024 to 2032. The primary growth driver for this market is the increasing need for detailed website analytics and competitive analysis, facilitated by the enhanced functionality that website screenshot software provides.
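
    The quoted growth rate can be sanity-checked from the figures above; the short sketch below treats USD 250 million (2023) to USD 550 million (2032) as a nine-year horizon, which is an assumption about how the report annualizes the rate.

```python
# Recompute the implied compound annual growth rate from the quoted figures.
start_value = 250          # USD million, 2023
end_value = 550            # USD million, 2032
years = 2032 - 2023        # nine-year horizon (assumption)

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ~9.2%, in line with the ~9.1% quoted above
```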



    One of the significant growth factors contributing to the expansion of the website screenshot software market is the booming e-commerce sector. As businesses increasingly move online, the demand for tools that can capture, analyze, and archive website content has surged. E-commerce companies, in particular, rely heavily on website screenshot software to track their competitors' pricing strategies, promotional activities, and website design changes. Moreover, the emphasis on digital marketing strategies necessitates frequent monitoring and analysis of various web pages, propelling the demand for such software.



    The rise in remote work is another critical factor driving the market growth. With teams working from various locations, the need for collaborative tools that facilitate real-time sharing of web content has become imperative. Website screenshot software allows team members to capture and share web pages seamlessly, aiding in better communication and faster decision-making. Such tools are particularly beneficial for web development and digital marketing teams, enabling them to provide visual feedback and suggestions efficiently.



    Technological advancements and the integration of advanced features like automated screenshot capture, scheduling, and cloud storage capabilities are also contributing to market growth. These advancements make it easier for users to capture, store, and manage large volumes of web content. Additionally, the increasing adoption of cloud-based solutions offers flexibility and scalability, further boosting the adoption of website screenshot software. The continuous innovation in software capabilities is expected to sustain market growth over the forecast period.



    In the realm of digital tools, Web Scraper Software plays a pivotal role in complementing website screenshot software. While screenshot software captures static images of web pages, web scraper software goes a step further by extracting data from websites for analysis. This capability is crucial for businesses that require detailed insights into competitor activities, market trends, and consumer behavior. By automating the data extraction process, web scraper software saves time and resources, allowing companies to focus on strategic decision-making. The synergy between website screenshot and web scraper software can significantly enhance a company's ability to conduct comprehensive web analytics and competitive analysis.



    Regionally, North America holds a significant share of the website screenshot software market, driven by the presence of major technology companies and a high adoption rate of advanced digital tools. However, Asia Pacific is projected to witness the highest growth rate during the forecast period, thanks to the rapid digital transformation in emerging economies, increasing internet penetration, and the burgeoning e-commerce sector. Europe is also a key market, with growing investments in digital marketing and web development driving the demand for website screenshot software.



    Deployment Type Analysis



    The website screenshot software market is segmented into cloud-based and on-premises deployment. Cloud-based deployment is expected to dominate the market owing to its benefits such as ease of access, scalability, and lower upfront costs. Cloud-based solutions allow users to access the software from anywhere, making it highly suitable for remote teams and enterprises with multiple locations. This flexibility is a significant advantage, especially in the current scenario where remote working has become the norm for many organizations. Furthermore, cloud-based deployment facilitates automatic updates and maintenance, reducing the burden on in-house IT teams.



    On-premises deployment, however, holds its significance in the market, particularly among large enterprises with stringent data security and compliance requirements. These organizations prefer to have complete control over their data and infrastructure, which is achievable through on-premises deployment.

  7. Use of online websites for running related activities in the U.S. 2017

    • statista.com
    Updated Dec 10, 2024
    Cite
    Statista (2024). Use of online websites for running related activities in the U.S. 2017 [Dataset]. https://www.statista.com/statistics/609341/running-related-activities-online-websites-used-for/
    Explore at:
    Dataset updated
    Dec 10, 2024
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Mar 2017 - Jun 2017
    Area covered
    United States
    Description

    The statistic shows the running-related activities for which online websites were commonly used by runners in the United States in 2017. According to the survey, 12 percent of respondents used online websites to track mileage.

  8. DOC Mountain Bike Track Routes

    • doc-deptconservation.opendata.arcgis.com
    Updated Jan 15, 2019
    + more versions
    Cite
    DOC_admin (2019). DOC Mountain Bike Track Routes [Dataset]. https://doc-deptconservation.opendata.arcgis.com/datasets/0fdd22944b1b42ec87f54c11790208f6
    Explore at:
    Dataset updated
    Jan 15, 2019
    Dataset authored and provided by
    DOC_admin
    License

    Attribution 4.0 (CC BY 4.0) – https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Description

    DOC's Mountain Bike Tracks dataset (tracks where mountain biking is permitted), with attribution that comes from, and hence reflects, the website content. Contains agglomerated tracks where mountain biking is a permitted activity, as identified by the web team. Refreshed weekly and reflects the content on the website.

    Description of the dataset fields is below. NB: not all fields exist in all datasets.

    • access – How you can access the site
    • activities – Things you can do near the site
    • assetId – DOC organisation identifier
    • bikingTime – Estimated time to cycle the track
    • bookable – Is it bookable?
    • campsiteCategory – Category of campsite
    • completionTime – Estimated time to complete the track
    • dateLoadedToGIS – Date the contents were written into the GIS database
    • difficulty – The difficulty categories of the track. Multiple values are possible, indicating varying difficulty along the length of the track. Please see the website for further details.
    • dogsAllowed – Are dogs allowed on site?
    • facilities – Facilities available on site
    • free – Is it free?
    • hasAlerts – Whether there are alerts to do with the site or track
    • hutCategory – Category of hut
    • introduction – Description of site
    • introductionThumbnail – Link to thumbnail picture
    • landscape – Associated landscape of site
    • locationString – User-friendly description of place
    • mountainBikingTrackWebPage – Link to page on website
    • name – Name of site
    • numberOfBunks – Number of bunks in hut
    • numberOfPoweredSites – Number of powered sites on campsite
    • numberOfUnpoweredSites – Number of unpowered sites on campsite
    • place – Location
    • proximityToRoadEnd – Proximity to road end. Populated where it aids accessibility
    • region – Region of New Zealand
    • staticLink – Link to page on website
    • status – Indication of whether the site is Open or Closed. Best to refer to the website for associated alerts.
    • walkingAndTrampingWebPage – Link to page on website
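
    A minimal geopandas sketch of browsing these fields, assuming the layer has first been exported from the open-data portal to a local GeoJSON file; the filename is hypothetical, and the column names are taken from the field list above.

```python
import geopandas as gpd

# Hypothetical local export of the DOC Mountain Bike Track Routes layer
# (download it from the open-data portal as GeoJSON or shapefile first).
tracks = gpd.read_file("doc_mountain_bike_track_routes.geojson")

# A quick overview using fields from the list above.
cols = ["name", "region", "difficulty", "bikingTime", "status", "dogsAllowed"]
print(tracks[cols].head())

# Total mapped track length per region, in kilometres
# (reproject to NZTM, EPSG:2193, so lengths are in metres).
tracks_m = tracks.to_crs(epsg=2193)
length_km = tracks_m.geometry.length / 1000
print(length_km.groupby(tracks_m["region"]).sum().sort_values(ascending=False))
```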

  9. Data from: Past Track

    • prep-response-portal.napsgfoundation.org
    • cest-cusec.hub.arcgis.com
    • +4more
    Updated May 4, 2022
    + more versions
    Cite
    NOAA GeoPlatform (2022). Past Track [Dataset]. https://prep-response-portal.napsgfoundation.org/datasets/noaa::past-track-1
    Explore at:
    Dataset updated
    May 4, 2022
    Dataset provided by
    National Oceanic and Atmospheric Administration (http://www.noaa.gov/)
    Authors
    NOAA GeoPlatform
    License

    CC0 1.0 Universal Public Domain Dedication – https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Area covered
    Description

    The National Hurricane Center and Central Pacific Hurricane Center Tropical Weather Summary Web Service contains tropical cyclone data for storms that may affect the United States and its territories. Specific storms can be identified on this summary service by the storm's wallet; the wallet identifier is the leading three characters of the "idp_source" (alias "GIS Source") attribute field.

    This web service visually displays the potential impact of tropical cyclones on coastal communities. The wind, probability-of-flooding, surge inundation, watch/warning, and tidal mask layers offer critical information for emergency preparedness and response, helping residents, emergency managers, and policymakers understand the potential severity of coastal flooding and take appropriate precautions. The service covers a wide range of coastal areas prone to tropical cyclones, so stakeholders across different regions have access to essential tropical storm information.

    Understanding the full extent of risk, however, requires a comprehensive view of the affected areas. It is therefore recommended to complement this service with additional resources, such as the NHC Peak Storm Surge Web Service, that provide information about major roads, railways, landmarks, and areas likely to be flooded. Incorporating data on past flood levels can further enrich the analysis and aid in predicting future impacts. One such resource is the NWS National Viewer's Tropical Site, which offers supplementary information to enhance situational awareness and risk assessment. By integrating these complementary resources, stakeholders can gain a holistic understanding of the potential impacts of tropical cyclones and make more effective decisions to safeguard lives, property, and critical infrastructure.

    Layer Descriptions:

    • 2 Day Outlook – the 2-day Graphic Tropical Weather Outlook from the NHC.
    • 7 Day Outlook – the 7-day Graphic Tropical Weather Outlook from the NHC.
    • Forecast Points – the current position and forecast positions of the storm out to 120 hours.
    • Forecast Track – a line connecting the forecast points.
    • Forecast Cone – the forecast "cone of uncertainty".
    • Watch-Warning – a watch/warning line indicating which sections of the coastline are in a watch/warning state due to the storm.
    • Past Points – the "best" track of the storm to the current time.
    • Past Track – a line connecting the past points.
    • Best Wind Radii – how the size of the storm has changed and the areas potentially affected so far by sustained winds.
    • Surface Wind Field – the areas potentially being affected by sustained winds of tropical storm force (34 knot, 50 knot) and hurricane force (64 knot).
    • Forecast Wind Radii – the expected size of the storm and the areas potentially affected by sustained winds of tropical storm force (34 knot, 50 knot) and hurricane force (64 knot).
    • Arrival Time of TS Winds – the earliest reasonable or the most likely arrival time of tropical storm force winds.
    • Inundation – the total water level that occurs on normally dry ground as a result of the storm tide.
    • Tidal Mask – the total water level that occurs on normally dry ground as a result of the storm tide, plus intertidal zones/estuarine wetlands.
    • Probabilistic Winds – the probability of 34, 50 and 64 knot winds.

    Update Frequency: every 6 hours, and every 3 hours if the storm is approaching the shore.
    Link to graphical web page: https://www.nhc.noaa.gov
    Link to data download (shapefile): https://www.nhc.noaa.gov/gis
    Link to metadata
    Questions/concerns about the service: please contact the DISS GIS team.
    Time Information: this service is not time enabled.

  10. Pacific Turtle Tracks: Grupo Tortuguero

    • seamap.env.duke.edu
    • seamap4u-dev.env.duke.edu
    xml
    Updated Feb 29, 2024
    Cite
    Wallace Nichols; Wallace Nichols (2024). Pacific Turtle Tracks: Grupo Tortuguero [Dataset]. http://doi.org/10.15468/3d5b3s
    Explore at:
    Available download formats: xml
    Dataset updated
    Feb 29, 2024
    Dataset provided by
    OBIS-SEAMAP
    Authors
    Wallace Nichols; Wallace Nichols
    License

    https://seamap.env.duke.edu/content/license_permission

    Time period covered
    Aug 17, 1996 - Aug 29, 2001
    Area covered
    Description

    Original provider: Grupo Tortuguero Tracking Project

    Dataset credits: Data provider – Grupo Tortuguero Tracking Project; Originating data center – Satellite Tracking and Analysis Tool (STAT)

    Abstract: The oceanic movements of loggerhead turtles (Caretta caretta), green turtles (Chelonia mydas), and a Pacific ridley turtle (Lepidochelys olivacea) were monitored with satellite telemetry between 1996 and 2001 in the Pacific Ocean. During this time several turtles migrated across the Pacific Ocean, covering more than 11,500 km between Santa Rosaliita, Baja California, Mexico (28°40'N, 114°14'W), and Sendai Bay, Japan (37°54'N, 140°56'E). These findings are consistent with the hypothesis that loggerheads feeding in the eastern Pacific eventually return to nest on western Pacific beaches. Baja California loggerhead turtles have been shown, through molecular genetic analysis (Bowen et al. 1995) and flipper tag returns (Uchida and Teruya 1988, Resendiz et al. 1998), to be primarily of Japanese origin. We conclude that loggerhead turtles are capable of transpacific migrations and propose that the band of water between 25 and 30 degrees North latitude, the Subtropical Frontal Zone, may be an important transpacific migratory corridor. Recent findings (Polovina et al. 2000) indicate that juvenile loggerheads in the North Pacific move westward against weak (0.1-0.3 km/hr) eastward geostrophic currents, demonstrating that passive drift may not entirely explain the dispersal of loggerheads.

    Juvenile loggerhead turtles, Caretta caretta, in the 20 - 85 cm straight carapace length (SCL) size range have been observed in the offshore waters along the Pacific coast of California, USA, and the Baja California peninsula, Mexico (Pitman 1990, Nichols, in press). Bartlett (1989) suggested that these turtles might be of western Pacific origin, migrating 10,000 km and feeding on pelagic red crabs (Pleuroncodes planipes) along the Baja California coast. Subsequently, Pacific loggerheads appear to utilize the entire North Pacific during the course of development in a manner similar to Atlantic loggerheads' use of the Atlantic Ocean (Bolten et al. 1998). After a period of more than 10 years (Zug et al. 1995), mature turtles evidently cross the Pacific Ocean from pelagic waters and foraging areas along the Baja California coast to return to natal beaches, a journey of more than 12,000 km in each direction. This is the first effort to document pelagic movements of North Pacific loggerheads from feeding grounds to nesting areas using satellite telemetry. Previous telemetry studies of loggerhead turtles have documented post-reproductive movements (Stoneburner 1982), pelagic movements (Polovina et al. 2000), home ranges (Renaud and Carpenter 1994), navigational abilities (Papi et al. 1997) and homing behavior (Luschi 1996). However, few studies of sea turtles have documented pre-nesting movements from feeding grounds to breeding areas. Notably, Renaud and Landry (1996) documented movement of a Kemp's ridley turtle (Lepidochelys kempii) from feeding grounds in Louisiana, USA, to its successful nesting in Rancho Nuevo, Mexico. A unique opportunity to track the movements of an adult-sized loggerhead turtle, rarely encountered along the Baja California coast, emerged in 1996. The turtle had been raised in captivity and used in the initial genetic analysis of Baja California loggerhead turtles (Bowen et al., 1995). Its mature size (Kamezaki and Matsui, 1997), genetic affinities with Japanese turtles, and the existence of a previous tag return from Japanese waters of a captive-raised, Baja California loggerhead turtle (Resendiz et al., 1998) were the deciding factors in choosing this particular turtle for the study. This turtle is included in the dataset as series 7667, named Adelita. The objective of the study was to monitor the oceanic movement, using satellite telemetry, of a Pacific loggerhead turtle initially captured on feeding grounds along the Baja California coast. Movement data also were examined with respect to oceanographic and meteorological information in an effort to gain insight into the navigational cues that guide adult sea turtles and to identify possible transpacific movement corridors.
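
    The distance quoted in the abstract can be put in context with a great-circle (haversine) calculation between the two named end points; the sketch below converts the coordinates above to decimal degrees (the great-circle figure is the shortest path, not the turtles' actual swimming track).

```python
from math import asin, cos, radians, sin, sqrt


def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))


# Santa Rosaliita, Baja California (28°40'N, 114°14'W) and Sendai Bay, Japan (37°54'N, 140°56'E)
santa_rosaliita = (28 + 40 / 60, -(114 + 14 / 60))
sendai_bay = (37 + 54 / 60, 140 + 56 / 60)

d = haversine_km(*santa_rosaliita, *sendai_bay)
print(f"{d:,.0f} km")   # ~9,300 km shortest path; the >11,500 km above is the tracked route
```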

  11. Transforming Data Discovery Through Behavior Modeling and Recommendation - Google Analytics Trace Data

    • openicpsr.org
    Updated Oct 29, 2024
    Cite
    Sara Lafia; A.J. Million; Libby Hemphill (2024). Transforming Data Discovery Through Behavior Modeling and Recommendation - Google Analytics Trace Data [Dataset]. http://doi.org/10.3886/E209981V4
    Explore at:
    Dataset updated
    Oct 29, 2024
    Dataset provided by
    Inter-university Consortium for Political and Social Research (https://www.icpsr.umich.edu/web/pages/)
    National Opinion Research Center
    University of Michigan
    Authors
    Sara Lafia; A.J. Million; Libby Hemphill
    License

    Attribution 4.0 (CC BY 4.0) – https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Sep 1, 2012 - Sep 1, 2016
    Description

    This dataset contains trace data describing user interactions with the Inter-university Consortium for Political and Social Research website (ICPSR). We gathered site usage data from Google Analytics. We focused our analysis on user sessions, which are groups of interactions with resources (e.g., website pages) and events initiated by users. ICPSR tracks a subset of user interactions (i.e., other than page views) through event triggers. We analyzed sequences of interactions with resources, including the ICPSR data catalog, variable index, data citations collected in the ICPSR Bibliography of Data-related Literature, and topical information about project archives. As part of our analysis, we calculated the total number of unique sessions and page views in the study period. Data in our study period fell between September 1, 2012, and 2016. ICPSR's website was updated and relaunched in September 2012 with new search functionality, including a Social Science Variables Database (SSVD) tool. ICPSR then reorganized its website and changed its analytics collection procedures in 2016, marking this as the cutoff date for our analysis. Data are relevant for two reasons. First, updates to the ICPSR website during the study period focused only on front-end design rather than the website's search functionality. Second, the core features of the website over the period we examined (e.g., faceted and variable search, standardized metadata, the use of controlled vocabularies, and restricted data applications) are shared with other major data archives, making it likely that the trends in user behavior we report are generalizable.
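
    A minimal pandas sketch of the session-level summary described above; the file name and column names (session_id, hit_type, page_path, timestamp) are illustrative assumptions rather than the dataset's documented schema.

```python
import pandas as pd

# Hypothetical export of the Google Analytics trace data; column names are illustrative only.
hits = pd.read_csv("icpsr_ga_trace.csv", parse_dates=["timestamp"])

# Restrict to the study window described above (September 2012 through September 2016).
window = hits[(hits["timestamp"] >= "2012-09-01") & (hits["timestamp"] < "2016-09-01")]

# Totals reported in the description: unique sessions and page views.
print("sessions:", window["session_id"].nunique())
print("page views:", int((window["hit_type"] == "pageview").sum()))

# Sequences of page interactions within each session, ordered in time.
sequences = (window[window["hit_type"] == "pageview"]
             .sort_values("timestamp")
             .groupby("session_id")["page_path"]
             .apply(list))
print(sequences.head())
```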

  12. NOAA NHC - Irma Storm Track - Best Track + Advisories

    • search.dataone.org
    • hydroshare.org
    Updated Apr 15, 2022
    Cite
    NOAA National Hurricane Center (NHC) (2022). NOAA NHC - Irma Storm Track - Best Track + Advisories [Dataset]. http://doi.org/10.4211/hs.aa5c9982a4694a19be2fa9299b78e5ca
    Explore at:
    Dataset updated
    Apr 15, 2022
    Dataset provided by
    Hydroshare
    Authors
    NOAA National Hurricane Center (NHC)
    Area covered
    Description

    The NOAA National Hurricane Center (NHC) publishes advisory bulletins with named storm conditions and expectations, see [1]. We have also downloaded shapefiles for eighty-four 5-day forecasts (published from August 30 to September 11) of track line, predicted points, ensemble forecasts envelope, and affected shoreline where applicable [2]. NOAA also publishes the best track for major storms [3]. The "best track" is a smoothed version of the advisories track. Web services are also provided by NHC for the advisory points and lines [4] [5]. Another user has constructed the Irma track (shapefile) from the NHC advisory bulletins [6].

    FEMA also posts windfield data, including peak wind gust and contours [7]. See FEMA disaster webpage [8] for map and list of counties receiving disaster declarations (map pdf available for download from this page)

    References:
    [1] NOAA NHC - Irma storm advisories [http://www.nhc.noaa.gov/archive/2017/IRMA.shtml]
    [2] NOAA NHC - Irma 5-day forecasts [https://www.nhc.noaa.gov/gis/archive_forecast_results.php?id=al11&year=2017&name=Hurricane%20IRMA]
    [3] NOAA NHC - best tracks for 2017 storms [https://www.nhc.noaa.gov/data/tcr/index.php?season=2017&basin=atl]
    [4] NOAA NHC - Irma advisory points web service [https://services.arcgis.com/XSeYKQzfXnEgju9o/ArcGIS/rest/services/The_2017_Atlantic_Hurricane_season_(to_October_16th)/FeatureServer/1]
    [5] NOAA NHC - Irma advisory lines web service [https://services.arcgis.com/XSeYKQzfXnEgju9o/ArcGIS/rest/services/The_2017_Atlantic_Hurricane_season_(to_October_16th)/FeatureServer/6]
    [6] Irma Advisories Track, compiled by David Tarboton [https://www.hydroshare.org/resource/546fa3feeaf242fc8aabf9fe05ab454c/]
    [7] FEMA public download site for Hurricane Irma 2017 [https://data.femadata.com/NationalDisasters/HurricaneIrma/]
    [8] FEMA Disaster Declarations and related links [https://www.fema.gov/disaster/4337]
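
    The advisory points layer in reference [4] is exposed as an ArcGIS REST feature service, so it can be queried directly over HTTP. The sketch below uses the generic ArcGIS REST query endpoint and parameters; nothing in it is specific to this dataset beyond the URL quoted above.

```python
import requests

# Advisory points web service from reference [4] above.
layer_url = (
    "https://services.arcgis.com/XSeYKQzfXnEgju9o/ArcGIS/rest/services/"
    "The_2017_Atlantic_Hurricane_season_(to_October_16th)/FeatureServer/1"
)

# Standard ArcGIS REST query: all features, all fields, returned as GeoJSON.
params = {"where": "1=1", "outFields": "*", "f": "geojson"}
resp = requests.get(f"{layer_url}/query", params=params, timeout=60)
resp.raise_for_status()

features = resp.json().get("features", [])
print(f"{len(features)} advisory point features returned")
if features:
    print(features[0]["properties"])   # attributes of the first advisory point
```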

  13. Error Tracking System

    • catalog.data.gov
    • data.amerigeoss.org
    Updated Mar 16, 2024
    Cite
    U.S. EPA Office of Mission Support (OMS) (2024). Error Tracking System [Dataset]. https://catalog.data.gov/dataset/error-tracking-system
    Explore at:
    Dataset updated
    Mar 16, 2024
    Dataset provided by
    United States Environmental Protection Agency (http://www.epa.gov/)
    Description

    Error Tracking System is a database used to store and track error notifications sent by users of EPA's web site. ETS is managed by OIC/OEI. OECA's ECHO and OEI's Envirofacts use it, and error notifications submitted from the "Contact Us" link on EPA's home page also feed into it.

  14. Web Scraping Data | Key Customers Domain Name Data | Scanning Logos found on Websites | 248M+ Records

    • datarade.ai
    .json
    Updated Nov 23, 2023
    Cite
    PredictLeads (2023). Web Scraping Data | Key Customers Domain Name Data | Scanning Logos found on Websites | 248M+ Records [Dataset]. https://datarade.ai/data-categories/logo-data/datasets
    Explore at:
    Available download formats: .json
    Dataset updated
    Nov 23, 2023
    Dataset authored and provided by
    PredictLeads
    Area covered
    Marshall Islands, British Indian Ocean Territory, American Samoa, United Arab Emirates, New Zealand, Saint Barthélemy, Croatia, Azerbaijan, Brunei Darussalam, Slovenia
    Description

    PredictLeads Key Customers Data provides essential business intelligence by analyzing company relationships, uncovering vendor partnerships, client connections, and strategic affiliations through advanced web scraping and logo recognition. This dataset captures business interactions directly from company websites, offering valuable insights into market positioning, competitive landscapes, and growth opportunities.

    Use Cases:

    ✅ Account Profiling – Gain a 360-degree customer view by mapping company relationships and partnerships.
    ✅ Competitive Intelligence – Track vendor-client connections and business affiliations to identify key industry players.
    ✅ B2B Lead Targeting – Prioritize leads based on their business relationships, improving sales and marketing efficiency.
    ✅ CRM Data Enrichment – Enhance company records with detailed key customer data, ensuring data accuracy.
    ✅ Market Research – Identify emerging trends and industry networks to optimize strategic planning.

    Key API Attributes:

    • id (string, UUID) – Unique identifier for the company connection.
    • category (string) – Type of relationship (e.g., vendor, client, partner).
    • source_category (string) – Where the connection was detected (e.g., partner page, case study).
    • source_url (string, URL) – Website where the relationship was found.
    • individual_source_url (string, URL) – Specific page confirming the connection.
    • context (string) – Extracted description of the business relationship (e.g., "Company X - partners with Company Y to enhance payment processing").
    • first_seen_at (ISO 8601 date-time) – Date the connection was first detected.
    • last_seen_at (ISO 8601 date-time) – Most recent confirmation of the relationship.
    • company1 & company2 (objects) – Details of the two connected companies, including:
    • - domain (string) – Company website domain.
    • - company_name (string) – Official company name.
    • - ticker (string, nullable) – Stock ticker, if available.

    📌 PredictLeads Key Customers Data is an indispensable tool for B2B sales, marketing, and market intelligence teams, providing actionable relationship insights to drive targeted outreach, competitor tracking, and strategic decision-making.

    PredictLeads Docs: https://docs.predictleads.com/v3/guide/connections_dataset

  15. Website Accessibility Evaluation Tool Market Report | Global Forecast From 2025 To 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). Website Accessibility Evaluation Tool Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-website-accessibility-evaluation-tool-market
    Explore at:
    Available download formats: csv, pdf, pptx
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Website Accessibility Evaluation Tool Market Outlook


    The global website accessibility evaluation tool market size was valued at approximately USD 450 million in 2023 and is projected to reach around USD 900 million by 2032, growing at a CAGR of 8.2% from 2024 to 2032. The growing emphasis on inclusivity and regulatory compliance is a significant driver for this market.



    The rise in digital transformation initiatives across various industries has escalated the demand for website accessibility evaluation tools. Governments and organizations globally are increasingly recognizing the necessity of making websites accessible to individuals with disabilities. This awareness is primarily fueled by stringent regulations such as the Americans with Disabilities Act (ADA) in the U.S. and the Web Content Accessibility Guidelines (WCAG) 2.1 issued by the World Wide Web Consortium (W3C). Ensuring compliance with these regulations not only helps companies avoid legal repercussions but also enhances their brand reputation and user experience.



    Additionally, the increasing internet penetration and the proliferation of web-based services have made it imperative for businesses to ensure that their digital platforms are user-friendly and accessible to all. According to the International Telecommunication Union (ITU), by 2023, over 60% of the global population had access to the internet. This extensive internet usage has led to an influx of online content and services, necessitating the need for robust accessibility evaluation tools to cater to diverse user needs. Businesses are recognizing the potential market opportunities that come with accessible websites, driving the demand for these tools.



    The COVID-19 pandemic further accelerated the digital shift, necessitating businesses to adopt and optimize digital platforms to remain operational. The surge in online transactions, remote working, and virtual learning highlighted the critical need for accessible websites, as people with disabilities also had to rely heavily on digital services. Consequently, the market for website accessibility evaluation tools experienced a significant boost as companies sought to ensure their websites were accessible to all, further propelling market growth.



    The Internet Archive Tool has emerged as a valuable resource in the realm of digital accessibility. As businesses and organizations strive to ensure their websites are accessible to all users, the Internet Archive Tool provides a unique service by preserving web pages over time. This tool allows developers and accessibility experts to track changes and improvements in website accessibility, offering insights into how digital platforms have evolved to meet accessibility standards. By providing a historical record of web content, the Internet Archive Tool aids in identifying past accessibility issues and understanding the impact of updates and regulations on web accessibility. This capability is particularly beneficial for organizations aiming to maintain compliance with evolving accessibility guidelines and demonstrate their commitment to inclusivity.



    Regionally, North America is anticipated to dominate the market owing to stringent regulatory frameworks and high awareness levels regarding web accessibility. The presence of major technology players and the rapid adoption of advanced digital solutions further contribute to market growth in this region. Europe is expected to follow closely, driven by regulations like the European Accessibility Act and rising awareness about digital inclusivity. Asia Pacific is also predicted to witness substantial growth due to increasing internet penetration and government initiatives promoting digital accessibility.



    Component Analysis


    The website accessibility evaluation tool market is segmented by component into software and services. Software solutions form a significant portion of the market due to their widespread adoption for automated accessibility checks. These tools are designed to scan web pages and identify accessibility issues, offering solutions to rectify them. They cater to various types of disabilities, including visual, auditory, and motor impairments, ensuring comprehensive accessibility compliance. The market is witnessing continuous advancements in software capabilities, driven by artificial intelligence and machine learning technologies, which enhance the accuracy and efficiency of these tools.



    On the services front, the market includes consulting, integration, and s

  16. Mobile Web Analytics Report

    • marketreportanalytics.com
    doc, pdf, ppt
    Updated Apr 3, 2025
    Cite
    Market Report Analytics (2025). Mobile Web Analytics Report [Dataset]. https://www.marketreportanalytics.com/reports/mobile-web-analytics-56142
    Explore at:
    Available download formats: ppt, doc, pdf
    Dataset updated
    Apr 3, 2025
    Dataset authored and provided by
    Market Report Analytics
    License

    https://www.marketreportanalytics.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The mobile web analytics market, currently valued at $2756 million in 2025, is poised for significant growth, projected to expand at a compound annual growth rate (CAGR) of 8.6% from 2025 to 2033. This robust expansion is driven by several key factors. The increasing adoption of mobile devices globally fuels the demand for robust analytics solutions to understand user behavior and optimize website performance for mobile users. Furthermore, the rising sophistication of mobile web technologies, including the proliferation of progressive web apps (PWAs) and the increasing importance of mobile commerce (m-commerce), necessitates detailed analytics to track conversions, engagement metrics, and overall user experience. Competition among businesses to enhance their mobile presence and maximize return on investment (ROI) from mobile marketing efforts further drives market growth. The market segmentation, encompassing both Android and iOS platforms along with mobile app and mobile web page analytics, reflects the multifaceted nature of this sector and allows for specialized solutions tailored to individual client needs. Leading players such as Google, Facebook, Tencent, and others, leverage their existing technological infrastructure and vast user bases to dominate market share. The geographical distribution of the mobile web analytics market showcases a strong presence in North America and Europe, primarily due to the high level of digital maturity and technological adoption in these regions. However, substantial growth opportunities exist in rapidly developing economies across Asia-Pacific, particularly in countries like India and China, as smartphone penetration increases and businesses seek to capitalize on the expanding mobile user base. The market is also experiencing a growing need for advanced analytics capabilities, moving beyond basic website traffic data to encompass more sophisticated user segmentation, predictive analytics, and AI-driven insights. This shift towards data-driven decision-making will continue to shape the future of mobile web analytics and fuel further market expansion. The presence of established technology giants alongside innovative startups fosters competition and innovation, leading to a continuously evolving landscape of products and services.

  17. No Code Web Scraper Tool Market Report | Global Forecast From 2025 To 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). No Code Web Scraper Tool Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/no-code-web-scraper-tool-market
    Explore at:
    Available download formats: pptx, pdf, csv
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    No Code Web Scraper Tool Market Outlook



    As of 2023, the global market size for No Code Web Scraper Tools is valued at approximately USD 850 million and is projected to reach nearly USD 2.5 billion by 2032, growing at a compound annual growth rate (CAGR) of 12.5%. This growth is primarily driven by the increasing demand for simplified data extraction solutions that do not require extensive coding knowledge, enabling businesses of all sizes to efficiently collect and utilize web data.



    The burgeoning need for data-driven decision-making across industries is a significant growth factor for the No Code Web Scraper Tool market. Organizations are increasingly recognizing the value of web data in gaining competitive insights, making informed business decisions, and optimizing operations. The ability to scrape web data without needing advanced technical skills democratizes data access, allowing non-technical users to harness the power of web data extraction. This trend is further propelled by the rise of small and medium enterprises (SMEs) that require cost-effective and efficient tools to stay competitive.



    The growth of e-commerce and digital marketing also plays a pivotal role in the expansion of the No Code Web Scraper Tool market. As online retail continues to flourish, businesses are keen to monitor competitor pricing, track customer reviews, and gather market intelligence. No code web scraping tools provide an accessible and scalable solution for e-commerce platforms to automate these data collection processes. Furthermore, digital marketers utilize these tools to gather and analyze data from various online sources, supporting more targeted and effective marketing campaigns.



    Technological advancements and the integration of artificial intelligence (AI) and machine learning (ML) in web scraping tools are additional drivers of market growth. These advancements enhance the capabilities of web scraping tools, making them more efficient, accurate, and user-friendly. AI and ML technologies facilitate the automatic adaptation to changes in website structures, reducing the need for manual intervention and ensuring continuous data extraction. This technological evolution not only improves the functionality of these tools but also broadens their applicability across different sectors.



    In the realm of web scraping, a Hook Extractor is a crucial component that enhances the efficiency and accuracy of data extraction processes. This tool is designed to seamlessly integrate with existing web scraping frameworks, allowing users to capture specific data elements from complex web pages. By utilizing a Hook Extractor, businesses can streamline their data collection efforts, ensuring that they gather only the most relevant information for their needs. This is particularly beneficial in scenarios where web pages are dynamic and frequently updated, as the Hook Extractor can adapt to changes in the website structure without requiring manual reconfiguration. As a result, organizations can maintain a competitive edge by continuously accessing up-to-date data insights.



    Regionally, North America holds a significant share of the No Code Web Scraper Tool market, driven by the high adoption rate of advanced technologies and the presence of numerous tech-savvy enterprises. The Asia-Pacific region is expected to witness the highest growth rate during the forecast period, fueled by the rapid digitalization and increasing internet penetration. Europe also presents lucrative opportunities, supported by the growing awareness and adoption of data-driven strategies among businesses. Latin America, the Middle East, and Africa are gradually catching up, with increasing investments in IT infrastructure and digital transformation initiatives.



    Component Analysis



    The No Code Web Scraper Tool market by component is segmented into software and services. The software segment comprises the actual web scraping tools that businesses use to extract data from websites. These tools are designed to be user-friendly, often featuring drag-and-drop interfaces and pre-built templates that simplify the data extraction process. The software segment is expected to dominate the market, driven by continuous innovations and the development of more sophisticated, AI-powered scraping solutions. Additionally, the subscription-based pricing model for these tools makes them accessible to a wide range of users, from individual entrepreneurs to large enterprises.



    In contrast,

  18. Aurora Australis Voyage 6 2011/12 Track and Underway Data

    • cmr.earthdata.nasa.gov
    • researchdata.edu.au
    • +3more
    Updated Sep 20, 2019
    + more versions
    Cite
    (2019). Aurora Australis Voyage 6 2011/12 Track and Underway Data [Dataset]. https://cmr.earthdata.nasa.gov/search/concepts/C1214305611-AU_AADC
    Explore at:
    Dataset updated
    Sep 20, 2019
    Time period covered
    Apr 16, 2012 - Apr 30, 2012
    Area covered
    Description

    On every voyage of the Aurora Australis, approximately 50 onboard sensors collect data on average every 10 seconds. These data are known as the underway datasets. The types of data collected include water and air temperature, wind speeds, ship speed and location, humidity, fluorescence, salinity and so on. For the full list of available data types, see the website.

    These data are broadcast "live" (every 30 minutes) back to Australia and are available via the Australian Oceanographic Data Centre's portal (see the provided link). Once the ship returns to port, the data are transferred to Australian Antarctic Division servers, where they are made available via the Marine Science Data Search system (see the provided URL).

    This dataset contains the underway data collected during Voyage 6 of the Aurora Australis Voyage in the 2011/12 season.

    Purpose of voyage: Macquarie Island resupply

    Underway (meteorological) data are available online via the Australian Antarctic Division Data Centre web page (or via the Related URL section).
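
    To put the sampling rate above in perspective, one reading every 10 seconds is 8,640 readings per sensor per day, or roughly 432,000 per day across ~50 sensors. Below is a small pandas sketch of downsampling such a 10-second series to the 30-minute cadence at which it is broadcast; the file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical export of one voyage's underway data; column names are illustrative only.
underway = pd.read_csv("voyage6_underway.csv", parse_dates=["timestamp"], index_col="timestamp")

# 1 sample / 10 s  ->  8,640 samples per sensor per day, ~432,000 per day across ~50 sensors.
print(len(underway), "rows")

# Downsample the 10-second readings to the 30-minute cadence used for the "live" broadcast.
half_hourly = (underway[["air_temp", "water_temp", "wind_speed", "salinity"]]
               .resample("30min")
               .mean())
print(half_hourly.head())
```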

  19. Aurora Australis Voyage 1 2017/18 Track and Underway Data

    • data.gov.au
    html, wfs
    + more versions
    Cite
    Australian Antarctic Data Centre, Aurora Australis Voyage 1 2017/18 Track and Underway Data [Dataset]. https://data.gov.au/dataset/ds-aodn-201718010
    Explore at:
    Available download formats: html, wfs
    Dataset provided by
    Australian Antarctic Division (https://www.antarctica.gov.au/)
    Australian Antarctic Data Centre
    Description

    On every voyage of the Aurora Australis, approximately 50 onboard sensors collect data on average every 10 seconds. These data are known as the underway datasets. The types of data collected include water and air temperature, wind speeds, ship speed and location, humidity, fluorescence, salinity and so on. For the full list of available data types, see the website.

    These data are broadcast "live" (every 30 minutes) back to Australia and are available via the Australian Oceanographic Data Centre's portal (see the provided link). Once the ship returns to port, the data are transferred to Australian Antarctic Division servers, where they are made available via the Marine Science Data Search system (see the provided URL).

    This dataset contains the underway data collected during Voyage 1 of the Aurora Australis in the 2017/18 season.

    Purpose of voyage: Davis resupply – Davis over-ice resupply, refuelling and personnel deployment/retrieval; deploy helicopters to Davis station. Leader: Dr. Doug Thost. Deputy Leader: Mr. Andrew Cawthorn.

    Underway (meteorological) data are available online via the Australian Antarctic Division Data Centre web page (or via the Related URL section).

  20. Aurora Australis Voyage 3 2011/12 Track and Underway Data

    • data.gov.au
    html, wfs
    Updated Dec 3, 2011
    Cite
    Australian Antarctic Data Centre (2011). Aurora Australis Voyage 3 2011/12 Track and Underway Data [Dataset]. https://data.gov.au/dataset/ds-aodn-201112030
    Explore at:
    Available download formats: html, wfs
    Dataset updated
    Dec 3, 2011
    Dataset provided by
    Australian Antarctic Data Centre
    Description

    On every voyage of the Aurora Australis, approximately 50 onboard sensors collect data on average every 10 seconds. These data are known as the underway datasets. The types of data collected include water and air temperature, wind speeds, ship speed and location, humidity, fluorescence, salinity and so on. For the full list of available data types, see the website.

    These data are broadcast "live" (every 30 minutes) back to Australia and are available via the Australian Oceanographic Data Centre's portal (see the provided link). Once the ship returns to port, the data are transferred to Australian Antarctic Division servers, where they are made available via the Marine Science Data Search system (see the provided URL).

    This dataset contains the underway data collected during Voyage 3 of the Aurora Australis in the 2011/12 season.

    Purpose of voyage: Commonwealth Bay visit and marine science.

    Underway (meteorological) data are available online via the Australian Antarctic Division Data Centre web page (or via the Related URL section).
