87 datasets found
  1. Curtailment (SPEN_009) Data Quality Checks

    • spenergynetworks.opendatasoft.com
    Updated Mar 28, 2025
    Cite
    (2025). Curtailment (SPEN_009) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_curtailment/
    Explore at:
    Dataset updated
    Mar 28, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Curtailment dataset. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

  2. Data Quality Tools Industry Report

    • marketreportanalytics.com
    doc, pdf, ppt
    Updated Apr 21, 2025
    Cite
    Market Report Analytics (2025). Data Quality Tools Industry Report [Dataset]. https://www.marketreportanalytics.com/reports/data-quality-tools-industry-89686
    Explore at:
    Available download formats: pdf, ppt, doc
    Dataset updated
    Apr 21, 2025
    Dataset authored and provided by
    Market Report Analytics
    License

    https://www.marketreportanalytics.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The Data Quality Tools market is experiencing robust growth, fueled by the increasing volume and complexity of data across diverse industries. The market, currently valued at an estimated $XX million in 2025 (assuming a logically derived value based on a 17.5% CAGR from a 2019 base year), is projected to reach $YY million by 2033. This substantial expansion is driven by several key factors. Firstly, the rising adoption of cloud-based solutions offers enhanced scalability, flexibility, and cost-effectiveness, attracting both small and medium enterprises (SMEs) and large enterprises. Secondly, the growing need for regulatory compliance (e.g., GDPR, CCPA) necessitates robust data quality management, pushing organizations to invest in advanced tools. Further, the increasing reliance on data-driven decision-making across sectors like BFSI, healthcare, and retail necessitates high-quality, reliable data, thus boosting market demand. The preference for software solutions over on-premise deployments and the substantial investments in services aimed at data integration and cleansing contribute to this growth.

    However, certain challenges restrain market expansion. High initial investment costs, the complexity of implementation, and the need for skilled professionals to manage these tools can act as barriers for some organizations, particularly SMEs. Furthermore, concerns related to data security and privacy continue to impact adoption rates. Despite these challenges, the long-term outlook for the Data Quality Tools market remains positive, driven by the ever-increasing importance of data quality in a rapidly digitalizing world. The market segmentation highlights significant opportunities across different deployment models, organizational sizes, and industry verticals, suggesting diverse avenues for growth and innovation in the coming years. Competition among established players like IBM, Informatica, and Oracle, alongside emerging players, is intensifying, driving innovation and providing diverse solutions to meet varied customer needs.

    Recent developments include: September 2022: MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) spin-off DataCebo announced the launch of a new tool, dubbed Synthetic Data (SD) Metrics, to help enterprises compare the quality of machine-generated synthetic data by pitching it against real data sets. May 2022: Pyramid Analytics, which developed its flagship platform, Pyramid Decision Intelligence, announced that it raised USD 120 million in a Series E round of funding. The Pyramid Decision Intelligence platform combines business analytics, data preparation, and data science capabilities with AI guidance functionality. It enables governed self-service analytics in a no-code environment. Key drivers for this market are: Increasing Use of External Data Sources Owing to Mobile Connectivity Growth. Potential restraints include: Increasing Use of External Data Sources Owing to Mobile Connectivity Growth. Notable trends are: Healthcare is Expected to Witness Significant Growth.

  3. Data from: Identifying prevalent quality issues in code changes by analyzing...

    • zenodo.org
    bin
    Updated Jul 7, 2024
    Cite
    Umar Iftikhar (2024). Identifying prevalent quality issues in code changes by analyzing reviewers' feedback [Dataset]. http://doi.org/10.5281/zenodo.10408930
    Explore at:
    Available download formats: bin
    Dataset updated
    Jul 7, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Umar Iftikhar
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The dataset contains the Python scripts (for data extraction and preprocessing) and the dataset for the paper "Identifying prevalent quality issues in code changes by analyzing reviewers' feedback", submitted to ENASE 2024.

  4. Short-form data quality indicators: Canada, provinces and territories,...

    • www150.statcan.gc.ca
    • ouvert.canada.ca
    • +1more
    Updated Aug 17, 2022
    Cite
    Government of Canada, Statistics Canada (2022). Short-form data quality indicators: Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions [Dataset]. http://doi.org/10.25318/9810016501-eng
    Explore at:
    Dataset updated
    Aug 17, 2022
    Dataset provided by
    Statistics Canada (https://statcan.gc.ca/en)
    Area covered
    Canada
    Description

    Data on short-form data quality indicators for 2021 Census, Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions.

  5. Canada’s 2018-2020 National Action Plan on Open Government – Federal...

    • open.canada.ca
    pdf
    Updated Nov 20, 2024
    Cite
    Natural Resources Canada (2024). Canada’s 2018-2020 National Action Plan on Open Government – Federal Geospatial Platform Data Quality Assessment: Results for 2018-2019 [Dataset]. https://open.canada.ca/data/en/dataset/316f1af5-f931-4006-a17e-efee8211cdcc
    Explore at:
    Available download formats: pdf
    Dataset updated
    Nov 20, 2024
    Dataset provided by
    Ministry of Natural Resources of Canada (https://www.nrcan.gc.ca/)
    License

    Open Government Licence - Canada 2.0: https://open.canada.ca/en/open-government-licence-canada
    License information was derived automatically

    Time period covered
    Jan 1, 2018 - Jun 24, 2020
    Area covered
    Canada
    Description

    Under the Open Government Action Plan, and related National Action Plan, the FGP is required to report on its commitments related to: supporting a user-friendly open government platform; improving the quality of open data available on open.canada.ca; and reviewing additional geospatial datasets to assess their quality. This report summarizes the FGP’s action on meeting these commitments.

  6. Trust barriers related AI data and privacy within companies in U.S. 2019

    • statista.com
    Updated Mar 17, 2022
    Cite
    Statista (2022). Trust barriers related AI data and privacy within companies in U.S. 2019 [Dataset]. https://www.statista.com/statistics/1045218/united-states-ai-trust-data-quality-privacy/
    Explore at:
    Dataset updated
    Mar 17, 2022
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Mar 2019
    Area covered
    United States
    Description

    According to a survey conducted at the EmTech Digital conference in March 2019, U.S. business leaders shared their opinions on trust issues with regard to AI data quality and privacy. Nearly half of respondents reported a lack of trust in the quality of AI data in their companies, showing that there is still a long way to go to get quality AI data.

  7. Embedded Capacity Register (SPEN_006) Data Quality Checks

    • spenergynetworks.opendatasoft.com
    Updated May 1, 2025
    Cite
    (2025). Embedded Capacity Register (SPEN_006) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_ecr/
    Explore at:
    Dataset updated
    May 1, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Embedded Capacity Register dataset. The quality assessment was carried out on the 30th of April. Please note, this assessment only covers 1MW and above data. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

    Disclaimer: The data quality assessment may not represent the quality of the current dataset that is published on the Open Data Portal. Please check the date of the latest quality assessment and compare it to the 'Modified' date of the corresponding dataset. The data quality assessments will be updated on either a quarterly or annual basis, dependent on the update frequency of the dataset. This information can be found in the dataset metadata, within the Information tab. If you require a more up-to-date quality assessment, please contact the Open Data Team at opendata@spenergynetworks.co.uk and a member of the team will be in contact.

  8. Additional file 8 of Some data quality issues at ClinicalTrials.gov

    • springernature.figshare.com
    xls
    Updated Jun 2, 2023
    Cite
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. S. Subramanya; Gayatri Saberwal (2023). Additional file 8 of Some data quality issues at ClinicalTrials.gov [Dataset]. http://doi.org/10.6084/m9.figshare.11981733.v1
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 2, 2023
    Dataset provided by
    figshare
    Authors
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. S. Subramanya; Gayatri Saberwal
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Records with a completion date and registered with an authority in the USA. The data (58,685 records from Additional file 6: Table S4) were sorted into a “USA_ComplDate” sheet for trials registered with at least one authority in the US, and a “USA_ComplDate_leftovers” sheet with the remaining records. The data are presented in the following six Recruitment Type categories: (1) Active, not recruiting (3350 selected records with 2121 leftovers), (2) Completed (21,030; 17,967), (3) Enrolling by invitation (166; 175), (4) Recruiting (3167; 4666), (5) Suspended (134; 99), and (6) Terminated (3986; 1824). The sheets for these categories are numbered 1–6, respectively. (XLS 6129 kb)
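    The split described above - selecting records registered with at least one US authority and then tabulating both groups by recruitment status - is straightforward to reproduce on any registry export. The following is a minimal pandas sketch only; the column names (recruitment_status, authorities) and file names are assumptions for illustration, not the schema of the additional file itself.

        # Illustrative only: split records into "selected" and "leftover" groups
        # per recruitment category and write one pair of sheets per category.
        import pandas as pd

        records = pd.read_csv("trial_records.csv")  # hypothetical registry export

        categories = [
            "Active, not recruiting", "Completed", "Enrolling by invitation",
            "Recruiting", "Suspended", "Terminated",
        ]

        # Assumed flag: does the record list at least one US authority?
        in_usa = records["authorities"].str.contains("United States", na=False)

        with pd.ExcelWriter("usa_compldate_split.xlsx") as writer:
            for i, status in enumerate(categories, start=1):
                mask = records["recruitment_status"] == status
                records[mask & in_usa].to_excel(writer, sheet_name=f"{i}_selected", index=False)
                records[mask & ~in_usa].to_excel(writer, sheet_name=f"{i}_leftovers", index=False)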

  9. Network Flow: Power, Current and Embedded Generation (SPEN_008) Data Quality...

    • spenergynetworks.opendatasoft.com
    Updated Mar 28, 2025
    Cite
    (2025). Network Flow: Power, Current and Embedded Generation (SPEN_008) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_network_flow/
    Explore at:
    Dataset updated
    Mar 28, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Network Flow: Power, Current and Embedded Generation dataset. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

    Disclaimer: The data quality assessment may not represent the quality of the current dataset that is published on the Open Data Portal. Please check the date of the latest quality assessment and compare it to the 'Modified' date of the corresponding dataset. The data quality assessments will be updated on either a quarterly or annual basis, dependent on the update frequency of the dataset. This information can be found in the dataset metadata, within the Information tab. If you require a more up-to-date quality assessment, please contact the Open Data Team at opendata@spenergynetworks.co.uk and a member of the team will be in contact.

  10. Voltage (SPEN_012) Data Quality Checks - Dataset - Datopian CKAN instance

    • demo.dev.datopian.com
    Updated May 27, 2025
    Cite
    (2025). Voltage (SPEN_012) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_voltage
    Explore at:
    Dataset updated
    May 27, 2025
    Description

    This dataset provides the detailed data quality assessment scores for the Voltage dataset. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To access our full suite of aggregated quality assessments and learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding our approach to data quality. Our Open Data team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the dataset schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the datasets with the results when available.

  11. Data Quality Tools Market Report

    • marketresearchforecast.com
    doc, pdf, ppt
    Updated Dec 20, 2024
    Cite
    Market Research Forecast (2024). Data Quality Tools Market Report [Dataset]. https://www.marketresearchforecast.com/reports/data-quality-tools-market-5240
    Explore at:
    Available download formats: doc, ppt, pdf
    Dataset updated
    Dec 20, 2024
    Dataset authored and provided by
    Market Research Forecast
    License

    https://www.marketresearchforecast.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The data quality tools market consists of the software and services used to assess and improve the quality and reliability of data drawn from varied sources and structures. These tools offer functionality such as data subsetting, data cleaning, data de-duplication, and data validation, helping organizations assess and rectify the quality of their data. Key business activity areas include data integration, migration, and governance, with decision-making, analytics, and compliance as the major use cases. Prominent sectors include finance, health and social care, retail and wholesale, manufacturing, and construction. Notable market themes include the application of machine learning and artificial intelligence to improve data quality, the adoption of cloud solutions for scalability and availability, and ongoing concerns about data privacy and regulation. Adoption has attracted growing attention given how critical data quality has become to day-to-day business and the rising demand for better data quality management. Key drivers for this market are: Increased Digitization and High Adoption of Automation to Propel Market Growth. Potential restraints include: Privacy and Security Issues to Hamper Market Growth. Notable trends are: Growing Implementation of Touch-based and Voice-based Infotainment Systems to Increase Adoption of Intelligent Cars.

  12. Tailored Site Data Quality Summaries.

    • figshare.com
    • datasetcatalog.nlm.nih.gov
    xls
    Updated Jun 27, 2024
    Cite
    Hanieh Razzaghi; Amy Goodwin Davies; Samuel Boss; H. Timothy Bunnell; Yong Chen; Elizabeth A. Chrischilles; Kimberley Dickinson; David Hanauer; Yungui Huang; K. T. Sandra Ilunga; Chryso Katsoufis; Harold Lehmann; Dominick J. Lemas; Kevin Matthews; Eneida A. Mendonca; Keith Morse; Daksha Ranade; Marc Rosenman; Bradley Taylor; Kellie Walters; Michelle R. Denburg; Christopher B. Forrest; L. Charles Bailey (2024). Tailored Site Data Quality Summaries. [Dataset]. http://doi.org/10.1371/journal.pdig.0000527.t004
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 27, 2024
    Dataset provided by
    PLOS Digital Health
    Authors
    Hanieh Razzaghi; Amy Goodwin Davies; Samuel Boss; H. Timothy Bunnell; Yong Chen; Elizabeth A. Chrischilles; Kimberley Dickinson; David Hanauer; Yungui Huang; K. T. Sandra Ilunga; Chryso Katsoufis; Harold Lehmann; Dominick J. Lemas; Kevin Matthews; Eneida A. Mendonca; Keith Morse; Daksha Ranade; Marc Rosenman; Bradley Taylor; Kellie Walters; Michelle R. Denburg; Christopher B. Forrest; L. Charles Bailey
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Study-specific data quality testing is an essential part of minimizing analytic errors, particularly for studies making secondary use of clinical data. We applied a systematic and reproducible approach for study-specific data quality testing to the analysis plan for PRESERVE, a 15-site, EHR-based observational study of chronic kidney disease in children. This approach integrated widely adopted data quality concepts with healthcare-specific evaluation methods. We implemented two rounds of data quality assessment. The first produced high-level evaluation using aggregate results from a distributed query, focused on cohort identification and main analytic requirements. The second focused on extended testing of row-level data centralized for analysis. We systematized reporting and cataloguing of data quality issues, providing institutional teams with prioritized issues for resolution. We tracked improvements and documented anomalous data for consideration during analyses. The checks we developed identified 115 and 157 data quality issues in the two rounds, involving completeness, data model conformance, cross-variable concordance, consistency, and plausibility, extending traditional data quality approaches to address more complex stratification and temporal patterns. Resolution efforts focused on higher priority issues, given finite study resources. In many cases, institutional teams were able to correct data extraction errors or obtain additional data, avoiding exclusion of 2 institutions entirely and resolving 123 other gaps. Other results identified complexities in measures of kidney function, bearing on the study’s outcome definition. Where limitations such as these are intrinsic to clinical data, the study team must account for them in conducting analyses. This study rigorously evaluated fitness of data for intended use. The framework is reusable and built on a strong theoretical underpinning. Significant data quality issues that would have otherwise delayed analyses or made data unusable were addressed. This study highlights the need for teams combining subject-matter and informatics expertise to address data quality when working with real world data.
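    As a rough illustration of the kind of row-level checks the study describes (completeness, plausibility, and cross-variable concordance), the sketch below flags issues in a pandas DataFrame. It is not the PRESERVE framework itself; the column names, thresholds, and issue-log format are assumptions.

        # Hypothetical row-level data quality checks in the spirit of the study above.
        import pandas as pd

        def run_basic_checks(df: pd.DataFrame) -> pd.DataFrame:
            issues = []

            # Completeness: flag variables with a large share of missing values.
            for col in ["serum_creatinine", "height_cm", "visit_date"]:
                missing = df[col].isna().mean()
                if missing > 0.05:  # assumed threshold
                    issues.append(("completeness", col, f"{missing:.1%} missing"))

            # Plausibility: values should fall in a clinically sensible range.
            out_of_range = df[(df["serum_creatinine"] <= 0) | (df["serum_creatinine"] > 20)]
            if len(out_of_range):
                issues.append(("plausibility", "serum_creatinine",
                               f"{len(out_of_range)} out-of-range rows"))

            # Cross-variable concordance: visits should not precede enrollment.
            disordered = df[df["visit_date"] < df["enrollment_date"]]
            if len(disordered):
                issues.append(("concordance", "visit_date vs enrollment_date",
                               f"{len(disordered)} rows"))

            return pd.DataFrame(issues, columns=["dimension", "variable", "finding"])

    Each run yields a small issue table that can be prioritized and handed back to the contributing site, mirroring the reporting-and-resolution loop described in the summary above.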

  13. Transmission Generation Heat Map (SPEN_017) Data Quality Checks

    • spenergynetworks.opendatasoft.com
    Updated Mar 28, 2025
    Cite
    (2025). Transmission Generation Heat Map (SPEN_017) Data Quality Checks [Dataset]. https://spenergynetworks.opendatasoft.com/explore/dataset/spen_data_quality_transmission_generation_heat_map/
    Explore at:
    Dataset updated
    Mar 28, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Transmission Generation Heat Map. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the dataset schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.

    Disclaimer: The data quality assessment may not represent the quality of the current dataset that is published on the Open Data Portal. Please check the date of the latest quality assessment and compare it to the 'Modified' date of the corresponding dataset. The data quality assessments will be updated on either a quarterly or annual basis, dependent on the update frequency of the dataset. This information can be found in the dataset metadata, within the Information tab. If you require a more up-to-date quality assessment, please contact the Open Data Team at opendata@spenergynetworks.co.uk and a member of the team will be in contact.

  14. Questions and responses to USGS-wide poll on quality assurance practices for...

    • s.cnmilf.com
    • data.usgs.gov
    • +2more
    Updated Jul 6, 2024
    Cite
    U.S. Geological Survey (2024). Questions and responses to USGS-wide poll on quality assurance practices for timeseries data, 2021 [Dataset]. https://s.cnmilf.com/user74170196/https/catalog.data.gov/dataset/questions-and-responses-to-usgs-wide-poll-on-quality-assurance-practices-for-timeseries-da
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Description

    This data record contains questions and responses to a USGS-wide survey conducted to identify issues and needs associated with quality assurance and quality control (QA/QC) of USGS timeseries data streams. This research was funded by the USGS Community for Data Integration as part of a project titled “From reactive- to condition-based maintenance: Artificial intelligence for anomaly predictions and operational decision-making”. The poll targeted monitoring network managers and technicians and asked questions about operational data streams and timeseries data collection in order to identify opportunities to streamline data access, expedite the response to data quality issues, improve QA/QC procedures, reduce operations costs, and uncover other maintenance needs. The poll was created using an online survey platform. It was sent to 2326 systematically selected USGS email addresses and received 175 responses in 11 days before it was closed to respondents. The poll contained 48 questions of various types including long answer, multiple choice, and ranking questions. The survey contained a mix of mandatory and optional questions. These distinctions, as well as full descriptions of the survey questions, are noted in the metadata.

  15. Maryland Counties Match Tool for Data Quality

    • catalog.data.gov
    • opendata.maryland.gov
    • +1more
    Updated Sep 15, 2023
    Cite
    opendata.maryland.gov (2023). Maryland Counties Match Tool for Data Quality [Dataset]. https://catalog.data.gov/dataset/maryland-counties-match-tool-for-data-quality
    Explore at:
    Dataset updated
    Sep 15, 2023
    Dataset provided by
    opendata.maryland.gov
    Area covered
    Maryland
    Description

    Data standardization is an important part of effective data management, but data from different sources often doesn't match. This dataset lists the different ways that county names might be written by different people. It can be used as a lookup table when you need County to be your unique identifier. For example, it allows you to match St. Mary's, St Marys, and Saint Mary's so that you can use it with disparate data from other data sets.
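    As a sketch of the lookup-table usage described above, the pandas snippet below left-joins an external table on the variant spelling to recover one standardized county name per record. The column names (variant, standard_county, county) and file names are assumptions, not the dataset's actual schema.

        # Hypothetical use of a county-name match table to standardize a join key.
        import pandas as pd

        lookup = pd.read_csv("maryland_counties_match_tool.csv")  # variant -> standard_county
        incidents = pd.read_csv("incidents.csv")                   # external data with a free-text county field

        standardized = incidents.merge(lookup, how="left",
                                       left_on="county", right_on="variant")

        # Records with no match keep their original spelling for manual review.
        standardized["standard_county"] = standardized["standard_county"].fillna(
            standardized["county"])

    Once every record carries the same standardized name, County can serve as the unique identifier for joining otherwise disparate datasets.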

  16. Description of Themes Identified from Data Quality Issues.

    • plos.figshare.com
    • datasetcatalog.nlm.nih.gov
    xls
    Updated Jun 27, 2024
    Cite
    Hanieh Razzaghi; Amy Goodwin Davies; Samuel Boss; H. Timothy Bunnell; Yong Chen; Elizabeth A. Chrischilles; Kimberley Dickinson; David Hanauer; Yungui Huang; K. T. Sandra Ilunga; Chryso Katsoufis; Harold Lehmann; Dominick J. Lemas; Kevin Matthews; Eneida A. Mendonca; Keith Morse; Daksha Ranade; Marc Rosenman; Bradley Taylor; Kellie Walters; Michelle R. Denburg; Christopher B. Forrest; L. Charles Bailey (2024). Description of Themes Identified from Data Quality Issues. [Dataset]. http://doi.org/10.1371/journal.pdig.0000527.t003
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 27, 2024
    Dataset provided by
    PLOS Digital Health
    Authors
    Hanieh Razzaghi; Amy Goodwin Davies; Samuel Boss; H. Timothy Bunnell; Yong Chen; Elizabeth A. Chrischilles; Kimberley Dickinson; David Hanauer; Yungui Huang; K. T. Sandra Ilunga; Chryso Katsoufis; Harold Lehmann; Dominick J. Lemas; Kevin Matthews; Eneida A. Mendonca; Keith Morse; Daksha Ranade; Marc Rosenman; Bradley Taylor; Kellie Walters; Michelle R. Denburg; Christopher B. Forrest; L. Charles Bailey
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Description of Themes Identified from Data Quality Issues.

  17. Additional file 18 of Some data quality issues at ClinicalTrials.gov

    • springernature.figshare.com
    • figshare.com
    xlsx
    Updated Jun 15, 2023
    Cite
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. S. Subramanya; Gayatri Saberwal (2023). Additional file 18 of Some data quality issues at ClinicalTrials.gov [Dataset]. http://doi.org/10.6084/m9.figshare.11981700.v1
    Explore at:
    Available download formats: xlsx
    Dataset updated
    Jun 15, 2023
    Dataset provided by
    figshare
    Authors
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. S. Subramanya; Gayatri Saberwal
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Processing of PI names to identify ambiguities—step 2. (XLSX 1406 kb)

  18. Assessing the Relationship Between Interviewer Effects and NSDUH Data...

    • catalog.data.gov
    • odgavaprod.ogopendata.com
    • +1more
    Updated Jul 31, 2025
    Cite
    Substance Abuse and Mental Health Services Administration (2025). Assessing the Relationship Between Interviewer Effects and NSDUH Data Quality [Dataset]. https://catalog.data.gov/dataset/assessing-the-relationship-between-interviewer-effects-and-nsduh-data-quality
    Explore at:
    Dataset updated
    Jul 31, 2025
    Dataset provided by
    Substance Abuse and Mental Health Services Administration (http://www.samhsa.gov/)
    Description

    This National Survey on Drug Use and Health (NSDUH) methodological report analyzes the relationships between several field interviewer characteristics and various survey outcomes, including response rates and respondent self-reports on substance use and mental health indicators.

  19. Additional file 6: of Some data quality issues at ClinicalTrials.gov

    • springernature.figshare.com
    ods
    Updated Jun 1, 2023
    Cite
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. Subramanya; Gayatri Saberwal (2023). Additional file 6: of Some data quality issues at ClinicalTrials.gov [Dataset]. http://doi.org/10.6084/m9.figshare.8319071.v1
    Explore at:
    Available download formats: ods
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Neha Chaturvedi; Bagish Mehrotra; Sangeeta Kumari; Saurabh Gupta; H. Subramanya; Gayatri Saberwal
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Records with a completion date. The data (64,496 records from Additional file 5: Table S3) were sorted into a “ComplDate” sheet for trials with completion dates, with the remaining records in a “ComplDate_leftovers” sheet. The data are presented in the following six Recruitment Type categories: (1) Active, not recruiting (5471 selected records with 1271 leftovers), (2) Completed (38,997; 2454), (3) Enrolling by invitation (341; 45), (4) Recruiting (7833; 1546), (5) Suspended (233; 106), and (6) Terminated (5810; 389). The sheets for these categories are numbered 1–6, respectively. The file is available at https://osf.io/jcb92 . (ODS 2300 kb)

  20. Embedded Generation by Type (SPEN_010) Data Quality Checks - Dataset -...

    • demo.dev.datopian.com
    Updated May 27, 2025
    Cite
    (2025). Embedded Generation by Type (SPEN_010) Data Quality Checks - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spen_data_quality_embedded_generation
    Explore at:
    Dataset updated
    May 27, 2025
    Description

    This data table provides the detailed data quality assessment scores for the Embedded Generation by Type dataset. The quality assessment was carried out on the 31st of March. At SPEN, we are dedicated to sharing high-quality data with our stakeholders and being transparent about its quality. This is why we openly share the results of our data quality assessments. We collaborate closely with Data Owners to address any identified issues and enhance our overall data quality. To demonstrate our progress, we conduct, at a minimum, bi-annual assessments of our data quality - for datasets that are refreshed more frequently than this, please note that the quality assessment may be based on an earlier version of the dataset. To learn more about our approach to how we assess data quality, visit Data Quality - SP Energy Networks. We welcome feedback and questions from our stakeholders regarding this process. Our Open Data Team is available to answer any enquiries or receive feedback on the assessments. You can contact them via our Open Data mailbox at opendata@spenergynetworks.co.uk. The first phase of our comprehensive data quality assessment measures the quality of our datasets across three dimensions. Please refer to the data table schema for the definitions of these dimensions. We are now in the process of expanding our quality assessments to include additional dimensions to provide a more comprehensive evaluation and will update the data tables with the results when available.
