100+ datasets found
  1. Technical Limits (SPEN_018) Data Quality Checks - Dataset - Datopian CKAN...

    • demo.dev.datopian.com
    Updated May 27, 2025
    + more versions
    Cite
    (2025). Technical Limits (SPEN_018) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_technical_limits
    Dataset updated
    May 27, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Technical Limits dataset. The quality assessment was carried out on 31 March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  2. Long Term Development Statement (SPEN_002) Data Quality Checks - Dataset -...

    • demo.dev.datopian.com
    Updated May 27, 2025
    + more versions
    Cite
    (2025). Long Term Development Statement (SPEN_002) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_ltds
    Dataset updated
    May 27, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Long Term Development Statement dataset. The quality assessment was carried out on 31 March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality; to demonstrate our progress we conduct annual assessments of our data quality in line with the dataset refresh rate. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  3. Cloud Data Quality Monitoring and Testing Market Report | Global Forecast...

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 5, 2024
    Cite
    Dataintelo (2024). Cloud Data Quality Monitoring and Testing Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-cloud-data-quality-monitoring-and-testing-market
    Explore at:
    pdf, pptx, csv (available download formats)
    Dataset updated
    Sep 5, 2024
    Authors
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Cloud Data Quality Monitoring and Testing Market Outlook



    The global cloud data quality monitoring and testing market size was valued at USD 1.5 billion in 2023 and is expected to reach USD 4.8 billion by 2032, growing at a compound annual growth rate (CAGR) of 13.8% during the forecast period. This robust growth is driven by increasing cloud adoption across various industries, coupled with the rising need for ensuring data quality and compliance.
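
    As a quick arithmetic check, the quoted 13.8% CAGR is consistent with the 2023 and 2032 figures above. The short Python sketch below recomputes it; treating 2023 to 2032 as a nine-year compounding window is an assumption, not something stated in the report.

```python
# Sanity check of the report's headline figures (values from the text above;
# the 9-year compounding window 2023 -> 2032 is an assumption).
def cagr(start, end, years):
    """Compound annual growth rate, returned as a fraction."""
    return (end / start) ** (1 / years) - 1

start_usd_bn = 1.5   # 2023 market size, USD billion
end_usd_bn = 4.8     # 2032 projection, USD billion
years = 2032 - 2023  # 9 compounding periods

rate = cagr(start_usd_bn, end_usd_bn, years)
print(f"implied CAGR: {rate:.1%}")  # ~13.8%, matching the report
```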



    One of the primary growth factors of the cloud data quality monitoring and testing market is the exponential increase in data generation and consumption. As organizations continue to integrate cloud solutions, the volume of data being processed and stored on the cloud has surged dramatically. This data influx necessitates stringent quality monitoring to ensure data integrity, accuracy, and consistency, thus driving the demand for advanced data quality solutions. Moreover, as businesses enhance their data-driven decision-making processes, the need for high-quality data becomes ever more critical, further propelling market growth.



    Another significant driver is the growing complexity of data architectures due to diverse data sources and types. The modern data environment is characterized by a mix of structured, semi-structured, and unstructured data originating from various sources like IoT devices, social media platforms, and enterprise applications. Ensuring the quality of such heterogeneous data sets requires sophisticated monitoring and testing tools that can seamlessly operate within cloud ecosystems. Consequently, organizations are increasingly investing in cloud data quality solutions to manage this complexity, thereby fueling market expansion.



    Compliance and regulatory requirements also play a pivotal role in the growth of the cloud data quality monitoring and testing market. Industries such as BFSI, healthcare, and government are subject to stringent data governance and privacy regulations that mandate regular auditing and validation of data quality. Failure to comply with these regulations can result in severe penalties and reputational damage. Hence, companies are compelled to adopt cloud data quality monitoring and testing solutions to ensure compliance and mitigate risks associated with data breaches and inaccuracies.



    From a regional perspective, North America dominates the market due to its advanced IT infrastructure and early adoption of cloud technologies. However, significant growth is also expected in the Asia Pacific region, driven by rapid digital transformation initiatives and increasing investments in cloud infrastructure by emerging economies like China and India. Europe also presents substantial growth opportunities, with industries embracing cloud solutions to enhance operational efficiency and innovation. The regional dynamics indicate a wide-ranging impact of cloud data quality monitoring and testing solutions across the globe.



    Component Analysis



    The cloud data quality monitoring and testing market is broadly segmented into software and services. The software segment encompasses various tools and platforms designed to automate and streamline data quality monitoring processes. These solutions include data profiling, data cleansing, data integration, and master data management software. The demand for such software is on the rise due to its ability to provide real-time insights into data quality issues, thereby enabling organizations to take proactive measures in addressing discrepancies. Advanced software solutions often leverage AI and machine learning algorithms to enhance data accuracy and predictive capabilities.



    The services segment is equally crucial, offering a gamut of professional and managed services to support the implementation and maintenance of data quality monitoring systems. Professional services include consulting, system integration, and training services, which help organizations in the seamless adoption of data quality tools and best practices. Managed services, on the other hand, provide ongoing support and maintenance, ensuring that data quality standards are consistently met. As organizations seek to optimize their cloud data environments, the demand for comprehensive service offerings is expected to rise, driving market growth.



    One of the key trends within the component segment is the increasing integration of software and services to offer holistic data quality solutions. Vendors are increasingly bundling their software products with complementary services, providing a one-stop solution that covers all aspects of data quality management.

  4. Historic Faults (SPEN_019) Data Quality Checks - Dataset - Datopian CKAN...

    • demo.dev.datopian.com
    Updated May 27, 2025
    + more versions
    Cite
    (2025). Historic Faults (SPEN_019) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_historic_faults
    Dataset updated
    May 27, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Historic Faults dataset. The quality assessment was carried out on 31 March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  5. Comprehensive assessment of research data management : practices and data...

    • researchdata.up.ac.za
    zip
    Updated Jul 19, 2025
    Cite
    Glenn Tshweu (2025). Comprehensive assessment of research data management : practices and data quality indicators in a social sciences organisation [Dataset]. http://doi.org/10.25403/UPresearchdata.26324230.v1
    Explore at:
    zip (available download formats)
    Dataset updated
    Jul 19, 2025
    Dataset provided by
    University of Pretoria
    Authors
    Glenn Tshweu
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset includes information on the quality control and data management practices of researchers and data curators from a social science organization. Four data curators and 24 researchers provided responses for the study. The main areas of focus are data collection techniques, data processing strategies, data storage and preservation, metadata standards, data sharing procedures, and the perceived significance of quality control and data quality assurance. The dataset aims to provide insight into the RDM procedures being used by a social science organization, as well as the difficulties that researchers and data curators encounter in upholding high standards of data quality. The goal of the study is to encourage further investigations aimed at enhancing the scientific community's data management practices and guidelines.

  6. Data quality assurance market size in South Korea 2010-2017

    • statista.com
    Updated Jun 26, 2024
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Statista (2024). Data quality assurance market size in South Korea 2010-2017 [Dataset]. https://www.statista.com/statistics/863273/south-korea-data-quality-assurance-market-size/
    Dataset updated
    Jun 26, 2024
    Dataset authored and provided by
    Statista (http://statista.com/)
    Area covered
    South Korea
    Description

    This statistic shows the size of the data quality assurance industry in South Korea from 2010 to 2016, with an estimate for 2017. The data quality assurance market in South Korea was estimated to be worth around 112.7 billion South Korean won in 2017.

  7. Operational Forecasting (SPEN_011) Data Quality Checks - Dataset - Datopian...

    • demo.dev.datopian.com
    Updated May 27, 2025
    + more versions
    Cite
    (2025). Operational Forecasting (SPEN_011) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_operational_forecasting
    Dataset updated
    May 27, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Operational Forecasting dataset. The quality assessment was carried out on 31 March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  8. Data_Sheet_1_Development and Application of a Statistically-Based Quality...

    • frontiersin.figshare.com
    pdf
    Updated Jun 4, 2023
    Cite
    Adrien Napoly; Tom Grassmann; Fred Meier; Daniel Fenner (2023). Data_Sheet_1_Development and Application of a Statistically-Based Quality Control for Crowdsourced Air Temperature Data.pdf [Dataset]. http://doi.org/10.3389/feart.2018.00118.s001
    Explore at:
    pdf (available download formats)
    Dataset updated
    Jun 4, 2023
    Dataset provided by
    Frontiers
    Authors
    Adrien Napoly; Tom Grassmann; Fred Meier; Daniel Fenner
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In urban areas, dense atmospheric observational networks with high-quality data are still a challenge due to high costs for installation and maintenance over time. Citizen weather stations (CWS) could be one answer to that issue. Since more and more owners of CWS share their measurement data publicly, crowdsourcing, i.e., the automated collection of large amounts of data from an undefined crowd of citizens, opens new pathways for atmospheric research. However, the most critical issue is found to be the quality of data from such networks. In this study, a statistically-based quality control (QC) is developed to identify suspicious air temperature (T) measurements from crowdsourced data sets. The newly developed QC exploits the combined knowledge of the dense network of CWS to statistically identify implausible measurements, independent of external reference data. The evaluation of the QC is performed using data from Netatmo CWS in Toulouse, France, and Berlin, Germany, over a 1-year period (July 2016 to June 2017), comparing the quality-controlled data with data from two networks of reference stations. The new QC efficiently identifies erroneous data due to solar exposition and siting issues, which are common error sources of CWS. Estimation of T is improved when averaging data from a group of stations within a restricted area rather than relying on data of individual CWS. However, a positive deviation in CWS data compared to reference data is identified, particularly for daily minimum T. To illustrate the transferability of the newly developed QC and the applicability of CWS data, a mapping of T is performed over the city of Paris, France, where spatial density of CWS is especially high.
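
    The QC described above flags implausible readings using only the crowd of stations itself, with no external reference data. A minimal sketch of that idea follows; the median/MAD-based modified z-score and the threshold are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch (not the paper's exact QC) of network-based outlier flagging:
# a station reading is suspicious if it deviates strongly from the distribution
# of all stations at the same time step.
import statistics

def flag_implausible(readings, z_max=3.5):
    """Return station ids whose reading is a robust outlier vs. the network.

    readings: dict of station_id -> air temperature (deg C) at one time step.
    Uses a median/MAD-based modified z-score, so no external reference
    stations are needed -- only the crowd itself.
    """
    values = list(readings.values())
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:  # degenerate case: nearly identical readings
        return [sid for sid, v in readings.items() if v != med]
    flagged = []
    for sid, v in readings.items():
        z = 0.6745 * (v - med) / mad  # modified z-score (Iglewicz & Hoaglin)
        if abs(z) > z_max:
            flagged.append(sid)
    return flagged

# A station exposed to direct sun reads far above its neighbours:
readings = {"cws1": 21.2, "cws2": 20.8, "cws3": 21.5, "cws4": 35.0, "cws5": 21.0}
print(flag_implausible(readings))  # → ['cws4']
```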

  9. Data Quality Assurance - Field Replicates

    • catalog.data.gov
    • s.cnmilf.com
    Updated Jul 6, 2024
    Cite
    U.S. Geological Survey (2024). Data Quality Assurance - Field Replicates [Dataset]. https://catalog.data.gov/dataset/data-quality-assurance-field-replicates
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Description

    This dataset contains replicate samples collected in the field by community technicians. No field replicates were collected in 2012. Replicate constituents with differences less than 10 percent are considered acceptable.
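
    The 10-percent acceptance rule above can be checked as a relative percent difference between a primary sample and its field replicate. The RPD formula used here is a common convention and an assumption on my part, not taken from the dataset documentation.

```python
# Illustrative check of the stated acceptance rule: a replicate pair is
# acceptable when the constituents differ by less than 10 percent.
# The relative-percent-difference formula is a common convention and an
# assumption here, not taken from the dataset documentation.
def relative_percent_difference(primary, replicate):
    """RPD between a primary sample and its field replicate, in percent."""
    mean = (primary + replicate) / 2
    return abs(primary - replicate) / mean * 100

def acceptable(primary, replicate, limit_pct=10.0):
    return relative_percent_difference(primary, replicate) < limit_pct

print(acceptable(4.2, 4.0))  # diff ~4.9% -> True
print(acceptable(4.2, 3.0))  # diff ~33%  -> False
```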

  10. Data quality indicators

    • ons.gov.uk
    • cy.ons.gov.uk
    xlsx
    Updated Feb 13, 2020
    Cite
    Office for National Statistics (2020). Data quality indicators [Dataset]. https://www.ons.gov.uk/peoplepopulationandcommunity/personalandhouseholdfinances/incomeandwealth/datasets/dataqualityindicators
    Explore at:
    xlsx (available download formats)
    Dataset updated
    Feb 13, 2020
    Dataset provided by
    Office for National Statistics (http://www.ons.gov.uk/)
    License

    Open Government Licence 3.0: http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/
    License information was derived automatically

    Description

    Metrics used to give an indication of data quality between our test groups. These include whether documentation was used and what proportion of respondents rounded their answers. Unit and item non-response are also reported.

  11. CalOES NG9-1-1 GIS Data Quality Control Plan April 18, 2022

    • data.lacounty.gov
    • geohub.lacity.org
    • +1more
    Updated Jul 19, 2022
    Cite
    County of Los Angeles (2022). CalOES NG9-1-1 GIS Data Quality Control Plan April 18, 2022 [Dataset]. https://data.lacounty.gov/documents/ddb6c8e2e41b4990ba38e0bbe93e343a
    Dataset updated
    Jul 19, 2022
    Dataset authored and provided by
    County of Los Angeles
    Description

    GIS quality control checks are intended to identify issues in the source data that may impact a variety of 9-1-1 end-use systems. The primary goal of the initial CalOES NG9-1-1 implementation is to facilitate 9-1-1 call routing. The secondary goal is to use the data for telephone record validation through the LVF and the GIS-derived MSAG. With these goals in mind, the GIS QC checks, and the impact of errors found by them, are categorized as follows in this document:
    • Provisioning Failure Errors: GIS data issues resulting in ingest failures (results in no provisioning of one or more layers)
    • Tier 1 Critical errors: impact on initial 9-1-1 call routing and discrepancy reporting
    • Tier 2 Critical errors: transition to the GIS-derived MSAG
    • Tier 3 Warning-level errors: impact on routing of call transfers
    • Tier 4 Other errors: impact on PSAP mapping and CAD systems
    GeoComm's GIS Data Hub is configurable to stop GIS data that exceeds certain quality control check error thresholds from provisioning to the SI (Spatial Interface) and ultimately to the ECRFs, LVFs, and the GIS-derived MSAG.
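
    The threshold-based provisioning gate described for GIS Data Hub can be sketched as follows. The category names mirror the tiers in the plan, but the numeric thresholds and function names are invented placeholders, not values from the document.

```python
# Hedged sketch of the threshold idea described above: provisioning is halted
# when error counts for a QC category exceed a configured limit. Category
# names mirror the document's tiers; the thresholds are invented examples.
THRESHOLDS = {
    "provisioning_failure": 0,   # any ingest failure blocks provisioning
    "tier1_critical": 0,         # 9-1-1 call routing impact
    "tier2_critical": 5,         # GIS-derived MSAG transition
    "tier3_warning": 50,         # call-transfer routing impact
    "tier4_other": 200,          # PSAP mapping / CAD impact
}

def provisioning_allowed(error_counts):
    """Return (ok, violations) for a set of QC error counts per category."""
    violations = {cat: n for cat, n in error_counts.items()
                  if n > THRESHOLDS.get(cat, 0)}
    return (not violations, violations)

ok, bad = provisioning_allowed(
    {"tier1_critical": 0, "tier2_critical": 7, "tier3_warning": 12})
print(ok, bad)  # False {'tier2_critical': 7}
```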

  12. Curtailment (SPEN_009) Data Quality Checks

    • spenergynetworks.opendatasoft.com
    Updated Mar 28, 2025
    + more versions
    Cite
    (2025). Curtailment (SPEN_009) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_curtailment/
    Dataset updated
    Mar 28, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Curtailment dataset. The quality assessment was carried out on 31 March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  13. Internal Quality checks database

    • data.europa.eu
    • cloud.csiss.gmu.edu
    • +1more
    Updated Sep 21, 2021
    Cite
    Rural Payments Agency (2021). Internal Quality checks database [Dataset]. https://data.europa.eu/data/datasets/internal-quality-checks-database
    Dataset updated
    Sep 21, 2021
    Dataset authored and provided by
    Rural Payments Agency
    Description

    Details of processing accuracy rates

  14. Data Quality Assurance - Instrument Detection Limits

    • catalog.data.gov
    • dataone.org
    Updated Jul 6, 2024
    Cite
    U.S. Geological Survey (2024). Data Quality Assurance - Instrument Detection Limits [Dataset]. https://catalog.data.gov/dataset/data-quality-assurance-instrument-detection-limits
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Description

    This dataset includes laboratory instrument detection limit data associated with laboratory instruments used in the analysis of surface water samples collected as part of the USGS - Yukon River Inter-Tribal Watershed Council collaborative water quality monitoring project.

  15. Cloud Data Quality Monitoring and Testing Market Report

    • imrmarketreports.com
    Updated Mar 2023
    Cite
    Swati Kalagate; Akshay Patil; Vishal Kumbhar (2023). Cloud Data Quality Monitoring and Testing Market Report [Dataset]. https://www.imrmarketreports.com/reports/cloud-data-quality-monitoring-and-testing-market
    Dataset updated
    Mar 2023
    Dataset provided by
    IMR Market Reports
    Authors
    Swati Kalagate; Akshay Patil; Vishal Kumbhar
    License

    https://www.imrmarketreports.com/privacy-policy/

    Description

    Global Cloud Data Quality Monitoring and Testing Market Report 2022 provides an extensive industry analysis of development components, patterns, flows, and sizes. The report also calculates present and past market values to forecast market potential through the forecast period between 2022 and 2028, and breaks the analysis down by geographic area, covering the competitive landscape and industry perspective of the market.

  16. Cloud Data Quality Monitoring and Testing Report

    • marketresearchforecast.com
    doc, pdf, ppt
    Updated Mar 22, 2025
    + more versions
    Cite
    Market Research Forecast (2025). Cloud Data Quality Monitoring and Testing Report [Dataset]. https://www.marketresearchforecast.com/reports/cloud-data-quality-monitoring-and-testing-47835
    Explore at:
    ppt, doc, pdf (available download formats)
    Dataset updated
    Mar 22, 2025
    Dataset authored and provided by
    Market Research Forecast
    License

    https://www.marketresearchforecast.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The Cloud Data Quality Monitoring and Testing market is experiencing robust growth, driven by the increasing reliance on cloud-based data storage and processing, the burgeoning volume of big data, and the stringent regulatory compliance requirements across various industries. The market's expansion is fueled by the need for real-time data quality assurance, proactive identification of data anomalies, and improved data governance. Businesses are increasingly adopting cloud-based solutions to enhance operational efficiency, reduce infrastructure costs, and improve scalability. This shift is particularly evident in large enterprises, which are investing heavily in advanced data quality management tools to support their complex data landscapes. The growth of SMEs adopting cloud-based solutions also contributes significantly to market expansion. While on-premises solutions still hold a market share, the cloud-based segment is demonstrating a significantly higher growth rate, projected to dominate the market within the forecast period (2025-2033). Despite the positive market outlook, certain challenges hinder growth. These include concerns regarding data security and privacy in cloud environments, the complexity of integrating data quality tools with existing IT infrastructure, and the lack of skilled professionals proficient in cloud data quality management. However, advancements in AI and machine learning are mitigating these challenges, enabling automated data quality checks and anomaly detection, thus streamlining the process and reducing the reliance on manual intervention. The market is segmented geographically, with North America and Europe currently holding significant market shares due to early adoption of cloud technologies and robust regulatory frameworks. However, the Asia Pacific region is projected to experience substantial growth in the coming years due to increasing digitalization and expanding cloud infrastructure investments. 
This competitive landscape with established players and emerging innovative companies is further shaping the market's evolution and expansion.

  17. Quality Assurance and Quality Control (QA/QC) of Meteorological Time Series...

    • osti.gov
    • knb.ecoinformatics.org
    • +1more
    Updated Jan 1, 2021
    Cite
    U.S. DOE > Office of Science > Biological and Environmental Research (BER) (2021). Quality Assurance and Quality Control (QA/QC) of Meteorological Time Series Data for Billy Barr, East River, Colorado USA [Dataset]. http://doi.org/10.15485/1823516
    Dataset updated
    Jan 1, 2021
    Dataset provided by
    U.S. DOE > Office of Science > Biological and Environmental Research (BER)
    Environmental System Science Data Infrastructure for a Virtual Ecosystem (ESS-DIVE) (United States)
    Area covered
    United States, Colorado, East River
    Description

    A comprehensive Quality Assurance (QA) and Quality Control (QC) statistical framework consists of three major phases: Phase 1—preliminary exploration of the raw data sets, including time formatting and combining datasets of different lengths and different time intervals; Phase 2—QA of the datasets, including detecting and flagging of duplicates, outliers, and extreme values; and Phase 3—the development of time series of a desired frequency, imputation of missing values, visualization, and a final statistical summary. The time series data collected at the Billy Barr meteorological station (East River Watershed, Colorado) were analyzed. The developed statistical framework is suitable for both real-time and post-data-collection QA/QC analysis of meteorological datasets. The files in this data package include one Excel file, converted to CSV format (Billy_Barr_raw_qaqc.csv), that contains the raw meteorological data, i.e., the input data used for the QA/QC analysis. The second CSV file (Billy_Barr_1hr.csv) contains the quality-controlled and flagged meteorological data, i.e., the output data from the QA/QC analysis. The last file (QAQC_Billy_Barr_2021-03-22.R) is a script written in R that implements the QA/QC and flagging process. The purpose of the CSV data files included in this package is to provide the input and output files used by the R script.
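
    The first two phases of the framework can be sketched in a few lines of Python; the package's actual implementation is the R script QAQC_Billy_Barr_2021-03-22.R, and the field names, physical bounds, and z-score threshold below are illustrative assumptions. Phase 3 (building a fixed-frequency series and imputing gaps) is omitted for brevity.

```python
# Condensed sketch of Phases 1-2 described above: parse and sort timestamps,
# then flag duplicates, extreme values, and statistical outliers.
# Bounds and z_max are illustrative assumptions, not values from the package.
from datetime import datetime
from statistics import mean, stdev

def qaqc(records, lower=-60.0, upper=50.0, z_max=3.0):
    """records: list of (iso_timestamp, value). Returns (ts, value, flags) rows."""
    # Phase 1: parse timestamps and sort chronologically.
    parsed = sorted((datetime.fromisoformat(t), v) for t, v in records)
    # Phase 2: flag duplicates, extreme values, and outliers.
    values = [v for _, v in parsed]
    mu, sd = mean(values), stdev(values)
    seen, flagged = set(), []
    for ts, v in parsed:
        flags = []
        if ts in seen:
            flags.append("duplicate")
        seen.add(ts)
        if not lower <= v <= upper:
            flags.append("extreme")       # outside plausible physical bounds
        elif sd > 0 and abs(v - mu) > z_max * sd:
            flags.append("outlier")       # statistically implausible
        flagged.append((ts, v, flags))
    return flagged

rows = [("2021-01-01T00:10", -5.2), ("2021-01-01T00:10", -5.2),
        ("2021-01-01T01:20", -4.8), ("2021-01-01T03:05", 999.0)]
for ts, v, flags in qaqc(rows):
    print(ts, v, flags)
```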

  18. Flexibility Market Prospectus (SPEN_014) Data Quality Checks

    • spenergynetworks.opendatasoft.com
    Updated Mar 28, 2025
    Cite
    (2025). Flexibility Market Prospectus (SPEN_014) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_flexibility/
    Explore at:
    Dataset updated
    Mar 28, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Flexibility Market Prospectus dataset. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  19.

    Data Quality Assurance - Laboratory duplicates

    • data.usgs.gov
    • dataone.org
    • +1more
    Updated Feb 24, 2024
    + more versions
    Cite
    Nicole Herman-Mercer (2024). Data Quality Assurance - Laboratory duplicates [Dataset]. http://doi.org/10.5066/F77D2S7B
    Explore at:
    Dataset updated
    Feb 24, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Authors
    Nicole Herman-Mercer
    License

    U.S. Government Works (https://www.usa.gov/government-works)
    License information was derived automatically

    Description

    This dataset includes data quality assurance information concerning the Relative Percent Difference (RPD) of laboratory duplicates. No laboratory duplicate information exists for 2010. The formula for calculating relative percent difference is: ABS(2*[(A-B)/(A+B)]). An RPD of less than 10% is considered acceptable.
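As a worked example, the RPD formula above can be written directly in Python. The function name is ours; the source documents only the formula and the 10% acceptance criterion.

```python
def relative_percent_difference(a, b):
    """RPD of a sample (a) and its laboratory duplicate (b), as a
    fraction: ABS(2*[(A-B)/(A+B)])."""
    return abs(2 * (a - b) / (a + b))

# A duplicate pair agreeing within the acceptance criterion:
# relative_percent_difference(10.2, 9.8) -> 0.04, i.e. a 4% RPD (< 10%, acceptable)
```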

  20.

    TagX Web Browsing clickstream Data - 300K Users North America, EU - GDPR -...

    • datarade.ai
    .json, .csv, .xls
    Updated Sep 16, 2024
    Cite
    TagX (2024). TagX Web Browsing clickstream Data - 300K Users North America, EU - GDPR - CCPA Compliant [Dataset]. https://datarade.ai/data-products/tagx-web-browsing-clickstream-data-300k-users-north-america-tagx
    Explore at:
    Available download formats: .json, .csv, .xls
    Dataset updated
    Sep 16, 2024
    Dataset authored and provided by
    TagX
    Area covered
    Macedonia (the former Yugoslav Republic of), Ireland, Luxembourg, Switzerland, Andorra, China, Holy See, Japan, United States of America, Finland
    Description

    TagX Web Browsing Clickstream Data: Unveiling Digital Behavior Across North America and the EU

    Unique Insights into Online User Behavior

    TagX Web Browsing clickstream Data offers an unparalleled window into the digital lives of 300K users across North America and the European Union. This comprehensive dataset stands out in the market due to its breadth, depth, and stringent compliance with data protection regulations. What Makes Our Data Unique?

    • Extensive Geographic Coverage: Spanning two major markets, our data provides a holistic view of web browsing patterns in developed economies.
    • Large User Base: With 300K active users, our dataset offers statistically significant insights across various demographics and user segments.
    • GDPR and CCPA Compliance: We prioritize user privacy and data protection, ensuring that our data collection and processing methods adhere to the strictest regulatory standards.
    • Real-time Updates: Our clickstream data is continuously refreshed, providing up-to-the-minute insights into evolving online trends and user behaviors.
    • Granular Data Points: We capture a wide array of metrics, including time spent on websites, click patterns, search queries, and user journey flows.

    Data Sourcing: Ethical and Transparent

    Our web browsing clickstream data is sourced through a network of partnered websites and applications. Users explicitly opt in to data collection, ensuring transparency and consent. We employ advanced anonymization techniques to protect individual privacy while maintaining the integrity and value of the aggregated data. Key aspects of our data sourcing process include:

    • Voluntary user participation through clear opt-in mechanisms
    • Regular audits of data collection methods to ensure ongoing compliance
    • Collaboration with privacy experts to implement best practices in data anonymization
    • Continuous monitoring of regulatory landscapes to adapt our processes as needed

    Primary Use Cases and Verticals

    TagX Web Browsing clickstream Data serves a multitude of industries and use cases, including but not limited to:

    Digital Marketing and Advertising:
    • Audience segmentation and targeting
    • Campaign performance optimization
    • Competitor analysis and benchmarking

    E-commerce and Retail:
    • Customer journey mapping
    • Product recommendation enhancements
    • Cart abandonment analysis

    Media and Entertainment:
    • Content consumption trends
    • Audience engagement metrics
    • Cross-platform user behavior analysis

    Financial Services:
    • Risk assessment based on online behavior
    • Fraud detection through anomaly identification
    • Investment trend analysis

    Technology and Software:
    • User experience optimization
    • Feature adoption tracking
    • Competitive intelligence

    Market Research and Consulting:
    • Consumer behavior studies
    • Industry trend analysis
    • Digital transformation strategies

    Integration with Broader Data Offering

    TagX Web Browsing clickstream Data is a cornerstone of our comprehensive digital intelligence suite. It seamlessly integrates with our other data products to provide a 360-degree view of online user behavior:

    • Social Media Engagement Data: Combine clickstream insights with social media interactions for a holistic understanding of digital footprints.
    • Mobile App Usage Data: Cross-reference web browsing patterns with mobile app usage to map the complete digital journey.
    • Purchase Intent Signals: Enrich clickstream data with purchase intent indicators to power predictive analytics and targeted marketing efforts.
    • Demographic Overlays: Enhance web browsing data with demographic information for more precise audience segmentation and targeting.

    By leveraging these complementary datasets, businesses can unlock deeper insights and drive more impactful strategies across their digital initiatives.

    Data Quality and Scale

    We pride ourselves on delivering high-quality, reliable data at scale:

    • Rigorous Data Cleaning: Advanced algorithms filter out bot traffic, VPNs, and other non-human interactions.
    • Regular Quality Checks: Our data science team conducts ongoing audits to ensure data accuracy and consistency.
    • Scalable Infrastructure: Our robust data processing pipeline can handle billions of daily events, ensuring comprehensive coverage.
    • Historical Data Availability: Access up to 24 months of historical data for trend analysis and longitudinal studies.
    • Customizable Data Feeds: Tailor the data delivery to your specific needs, from raw clickstream events to aggregated insights.

    Empowering Data-Driven Decision Making

    In today's digital-first world, understanding online user behavior is crucial for businesses across all sectors. TagX Web Browsing clickstream Data empowers organizations to make informed decisions, optimize their digital strategies, and stay ahead of the competition. Whether you're a marketer looking to refine your targeting, a product manager seeking to enhance user experience, or a researcher exploring digital trends, our cli...
