100+ datasets found
  1. Measuring quality of routine primary care data

    • data.niaid.nih.gov
    • datasetcatalog.nlm.nih.gov
    • +1 more
    zip
    Updated Mar 12, 2021
    Cite
    Olga Kostopoulou; Brendan Delaney (2021). Measuring quality of routine primary care data [Dataset]. http://doi.org/10.5061/dryad.dncjsxkzh
    Explore at:
    Available download formats: zip
    Dataset updated
    Mar 12, 2021
    Dataset provided by
    Imperial College London
    Authors
    Olga Kostopoulou; Brendan Delaney
    License

    https://spdx.org/licenses/CC0-1.0.html

    Description

    Objective: Routine primary care data may be used for the derivation of clinical prediction rules and risk scores. We sought to measure the impact of a decision support system (DSS) on data completeness and freedom from bias.

    Materials and Methods: We used the clinical documentation of 34 UK General Practitioners who took part in a previous study evaluating the DSS. They consulted with 12 standardized patients. In addition to suggesting diagnoses, the DSS facilitates data coding. We compared the documentation from consultations with the electronic health record (EHR) (baseline consultations) vs. consultations with the EHR-integrated DSS (supported consultations). We measured the proportion of EHR data items related to the physician’s final diagnosis. We expected that in baseline consultations, physicians would document only or predominantly observations related to their diagnosis, while in supported consultations, they would also document other observations as a result of exploring more diagnoses and/or ease of coding.

    Results: Supported documentation contained significantly more codes (IRR=5.76 [4.31, 7.70] P<0.001) and less free text (IRR = 0.32 [0.27, 0.40] P<0.001) than baseline documentation. As expected, the proportion of diagnosis-related data was significantly lower (b=-0.08 [-0.11, -0.05] P<0.001) in the supported consultations, and this was the case for both codes and free text.

    Conclusions: We provide evidence that data entry in the EHR is incomplete and reflects physicians’ cognitive biases. This has serious implications for epidemiological research that uses routine data. A DSS that facilitates and motivates data entry during the consultation can improve routine documentation.
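    As an illustration of how incidence rate ratios (IRRs) like those reported in the Results are commonly estimated, the sketch below fits a Poisson count model of coded items per consultation with an indicator for supported vs. baseline consultations. The data frame, column names, and model family are illustrative assumptions, not the authors' code; the original analysis may also have accounted for clustering by GP and patient.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative counts of coded items per consultation (placeholder values, not study data).
df = pd.DataFrame({
    "n_codes":   [3, 2, 4, 1, 15, 18, 12, 20],
    "supported": [0, 0, 0, 0, 1, 1, 1, 1],   # 0 = baseline EHR, 1 = EHR-integrated DSS
})

# Poisson regression of counts on consultation type; exponentiated
# coefficients are incidence rate ratios (IRRs).
fit = smf.glm("n_codes ~ supported", data=df, family=sm.families.Poisson()).fit()
print("IRR:", np.exp(fit.params["supported"]))
print("95% CI:", np.exp(fit.conf_int().loc["supported"]).values)
```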

  2. Data from: Photometric Completeness Modelled With Neural Networks

    • data.niaid.nih.gov
    • zenodo.org
    Updated Sep 2, 2023
    + more versions
    Cite
    Speagle, Joshua S (2023). Photometric Completeness Modelled With Neural Networks [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_8306413
    Explore at:
    Dataset updated
    Sep 2, 2023
    Dataset provided by
    Harris, William E
    Speagle, Joshua S
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Neural networks associated with the paper "Photometric Completeness Modelled With Neural Networks" (Harris & Speagle 2023).

    Neural networks (nn_clf_[...].joblib) are included for all possible parameter combinations and trained over various numbers of artificial star tests (ngc[...].dat). See the example notebook (nn_example.ipynb) for detailed explanations of the files, their contents, and some usage examples.
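    A minimal sketch of how one of the bundled classifiers might be loaded and queried is shown below, assuming the .joblib files are scikit-learn estimators; the file name and feature values are placeholders, and nn_example.ipynb in the dataset documents the actual inputs and usage.

```python
import numpy as np
from joblib import load

# Load one of the bundled classifiers (placeholder file name; see nn_example.ipynb
# for the actual nn_clf_[...].joblib names and the parameter combination each covers).
clf = load("nn_clf_example.joblib")

# Placeholder photometric inputs; the real feature set is documented in the notebook.
X = np.array([[24.5, 0.8]])

# For a binary recovered/not-recovered classifier, the positive-class probability can be
# read as the estimated completeness (recovery probability) at these photometric values.
print(clf.predict_proba(X)[:, 1])
```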

  3. DataSheet1_Continuity and Completeness of Electronic Health Record Data for...

    • frontiersin.figshare.com
    docx
    Updated Jun 12, 2023
    Cite
    Chien-Ning Hsu; Kelly Huang; Fang-Ju Lin; Huang-Tz Ou; Ling-Ya Huang; Hsiao-Ching Kuo; Chi-Chuan Wang; Sengwee Toh (2023). DataSheet1_Continuity and Completeness of Electronic Health Record Data for Patients Treated With Oral Hypoglycemic Agents: Findings From Healthcare Delivery Systems in Taiwan.docx [Dataset]. http://doi.org/10.3389/fphar.2022.845949.s001
    Explore at:
    Available download formats: docx
    Dataset updated
    Jun 12, 2023
    Dataset provided by
    Frontiers
    Authors
    Chien-Ning Hsu; Kelly Huang; Fang-Ju Lin; Huang-Tz Ou; Ling-Ya Huang; Hsiao-Ching Kuo; Chi-Chuan Wang; Sengwee Toh
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Taiwan
    Description

    Objective: To evaluate the continuity and completeness of electronic health record (EHR) data, and the concordance of select clinical outcomes and baseline comorbidities between EHR and linked claims data, from three healthcare delivery systems in Taiwan.

    Methods: We identified oral hypoglycemic agent (OHA) users from the Integrated Medical Database of National Taiwan University Hospital (NTUH-iMD), which was linked to the National Health Insurance Research Database (NHIRD), from June 2011 to December 2016. A secondary evaluation involved two additional EHR databases. We created consecutive 90-day periods before and after the first recorded OHA prescription and defined patients as having continuous EHR data if there was at least one encounter or prescription in each 90-day interval. EHR data completeness was measured by dividing the number of encounters in the NTUH-iMD by the number of encounters in the NHIRD. We assessed the concordance between EHR and claims data on three clinical outcomes (cardiovascular events, nephropathy-related events, and heart failure admission). We used the individual comorbidities that comprise the Charlson comorbidity index to examine the concordance of select baseline comorbidities between EHRs and claims.

    Results: We identified 39,268 OHA users in the NTUH-iMD. Thirty-one percent (n = 12,296) of these users contributed to the analysis that examined data continuity during the 6-month baseline and 24-month follow-up period; 31% (n = 3,845) of the 12,296 users had continuous data during this 30-month period, and EHR data completeness was 52%. The concordance of major cardiovascular events, nephropathy-related events, and heart failure admission was moderate, with the NTUH-iMD capturing 49–55% of the outcome events recorded in the NHIRD. The concordance of comorbidities was considerably different between the NTUH-iMD and NHIRD, with an absolute standardized difference >0.1 for most comorbidities examined. Across the three EHR databases studied, 29–55% of the OHA users had continuous records during the 6-month baseline and 24-month follow-up period.

    Conclusion: EHR data continuity and data completeness may be suboptimal. A thorough evaluation of data continuity and completeness is recommended before conducting clinical and translational research using EHR data in Taiwan.
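    The continuity and completeness definitions above can be expressed compactly. The sketch below is an assumed reimplementation (hypothetical names and window counts), not the authors' code: a patient is "continuous" if every 90-day window across the 6-month baseline and 24-month follow-up contains at least one encounter or prescription, and completeness is the ratio of EHR-captured encounters to claims-recorded encounters.

```python
import pandas as pd

def is_continuous(event_dates: pd.Series, index_date: pd.Timestamp,
                  pre_windows: int = 2, post_windows: int = 8) -> bool:
    """True if every 90-day window (2 pre-index windows ~ 6-month baseline,
    8 post-index windows ~ 24-month follow-up) contains at least one
    encounter or prescription date."""
    days = (event_dates - index_date).dt.days
    hit = set(days // 90)                       # 90-day window index of each event
    return all(w in hit for w in range(-pre_windows, post_windows))

def ehr_completeness(n_ehr_encounters: int, n_claims_encounters: int) -> float:
    """EHR data completeness: encounters captured in the EHR divided by
    encounters recorded in the linked claims database."""
    return n_ehr_encounters / n_claims_encounters if n_claims_encounters else float("nan")
```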

  4. The degrees of data completeness for the response variables investigated.

    • plos.figshare.com
    xls
    Updated May 30, 2023
    Cite
    Jianghua Liu; Anna Rotkirch; Virpi Lummaa (2023). The degrees of data completeness for the response variables investigated. [Dataset]. http://doi.org/10.1371/journal.pone.0034898.t002
    Explore at:
    Available download formats: xls
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Jianghua Liu; Anna Rotkirch; Virpi Lummaa
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Note:
    • Complete: the variable value can be accurately determined (e.g., for 86.60% of the mothers under study, the survival status (survival vs. death) at age 15 of all offspring produced can be accurately determined).
    • Incomplete: the variable value was estimated from the records available for only some of the offspring (e.g., for 11.45% of the mothers, survival status data were available for at least one, but not all, of their offspring).
    • Missing: there was no way to estimate the variable value, and the affected mothers had to be excluded from the analyses (e.g., for 1.95% of the mothers, survival status data were missing for all of their offspring).
    Abbreviations: M-fertility, maternal lifetime fertility; O-survival, offspring survival rate at age 15; O-breeding, offspring breeding probability; M-LRS, maternal lifetime reproductive success; M-RBF, maternal risk of breeding failure.
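    A minimal sketch of the three completeness categories defined above (an assumed reimplementation for illustration, not the authors' code), based on how many of a mother's offspring have usable records:

```python
def completeness_category(n_offspring: int, n_with_records: int) -> str:
    """Classify a mother's data for a response variable by offspring record availability."""
    if n_offspring > 0 and n_with_records == n_offspring:
        return "Complete"      # value can be accurately determined from all offspring
    if n_with_records > 0:
        return "Incomplete"    # estimated from records of some, but not all, offspring
    return "Missing"           # no usable records; mother excluded from the analyses
```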

  5. Data Quality Tools Market Report | Global Forecast From 2025 To 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). Data Quality Tools Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-data-quality-tools-market
    Explore at:
    Available download formats: pptx, pdf, csv
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Data Quality Tools Market Outlook



    The global data quality tools market size was valued at $1.8 billion in 2023 and is projected to reach $4.2 billion by 2032, growing at a compound annual growth rate (CAGR) of 8.9% during the forecast period. The growth of this market is driven by the increasing importance of data accuracy and consistency in business operations and decision-making processes.



    One of the key growth factors is the exponential increase in data generation across industries, fueled by digital transformation and the proliferation of connected devices. Organizations are increasingly recognizing the value of high-quality data in driving business insights, improving customer experiences, and maintaining regulatory compliance. As a result, the demand for robust data quality tools that can cleanse, profile, and enrich data is on the rise. Additionally, the integration of advanced technologies such as AI and machine learning in data quality tools is enhancing their capabilities, making them more effective in identifying and rectifying data anomalies.



    Another significant driver is the stringent regulatory landscape that requires organizations to maintain accurate and reliable data records. Regulations such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States necessitate high standards of data quality to avoid legal repercussions and financial penalties. This has led organizations to invest heavily in data quality tools to ensure compliance. Furthermore, the competitive business environment is pushing companies to leverage high-quality data for improved decision-making, operational efficiency, and competitive advantage, thus further propelling the market growth.



    The increasing adoption of cloud-based solutions is also contributing significantly to the market expansion. Cloud platforms offer scalable, flexible, and cost-effective solutions for data management, making them an attractive option for organizations of all sizes. The ease of integration with various data sources and the ability to handle large volumes of data in real-time are some of the advantages driving the preference for cloud-based data quality tools. Moreover, the COVID-19 pandemic has accelerated the digital transformation journey for many organizations, further boosting the demand for data quality tools as companies seek to harness the power of data for strategic decision-making in a rapidly changing environment.



    Data Wrangling is becoming an increasingly vital process in the realm of data quality tools. As organizations continue to generate vast amounts of data, the need to transform and prepare this data for analysis is paramount. Data wrangling involves cleaning, structuring, and enriching raw data into a desired format, making it ready for decision-making processes. This process is essential for ensuring that data is accurate, consistent, and reliable, which are critical components of data quality. With the integration of AI and machine learning, data wrangling tools are becoming more sophisticated, allowing for automated data preparation and reducing the time and effort required by data analysts. As businesses strive to leverage data for competitive advantage, the role of data wrangling in enhancing data quality cannot be overstated.



    On a regional level, North America currently holds the largest market share due to the presence of major technology companies and a high adoption rate of advanced data management solutions. However, the Asia Pacific region is expected to witness the highest growth rate during the forecast period. The increasing digitization across industries, coupled with government initiatives to promote digital economies in countries like China and India, is driving the demand for data quality tools in this region. Additionally, Europe remains a significant market, driven by stringent data protection regulations and a strong emphasis on data governance.



    Component Analysis



    The data quality tools market is segmented into software and services. The software segment includes various tools and applications designed to improve the accuracy, consistency, and reliability of data. These tools encompass data profiling, data cleansing, data enrichment, data matching, and data monitoring, among others. The software segment dominates the market, accounting for a substantial share due to the increasing need for automated data management solutions. The integration of AI and machine learning into these too

  6. Ozone (O3) (Data Completeness Report) - 2 - Catalogue - Canadian Urban Data...

    • data.urbandatacentre.ca
    Updated Sep 18, 2023
    + more versions
    Cite
    (2023). Ozone (O3) (Data Completeness Report) - 2 - Catalogue - Canadian Urban Data Catalogue (CUDC) [Dataset]. https://data.urbandatacentre.ca/dataset/ozone-o3-data-completeness-report-2
    Explore at:
    Dataset updated
    Sep 18, 2023
    Description

    Hourly ground-level ozone (O3) concentrations were estimated with the CHRONOS (Canadian Hemispherical Regional Ozone and NOx System) model from 2002 to 2009, and with the GEM-MACH (Global Environmental Multi-scale Modelling Air Quality and Chemistry) model from 2010 to 2015, by Environment and Climate Change Canada staff. Estimates incorporate ground-level observation data. Please note that Environment and Climate Change Canada (ECCC) provides air quality data directly; see the ECCC End Use Licence.pdf file referenced above under Supporting Documentation. These datasets were used by CANUE staff to calculate monthly O3 concentrations for all postal codes in Canada for each year from 2002 to 2015 (DMTI Spatial, 2015). Values are reported only when data completeness thresholds are met; see Data Completeness.pdf in Supporting Documentation.

  7. Data Quality Rules Engines for Health Data Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Oct 7, 2025
    Cite
    The citation is currently not available for this dataset.
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Oct 7, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Data Quality Rules Engines for Health Data Market Outlook



    According to our latest research, the global Data Quality Rules Engines for Health Data market size reached USD 1.42 billion in 2024, reflecting the rapid adoption of advanced data management solutions across the healthcare sector. The market is expected to grow at a robust CAGR of 16.1% from 2025 to 2033, reaching a forecasted value of USD 5.12 billion by 2033. This growth is primarily driven by the increasing demand for accurate, reliable, and regulatory-compliant health data to support decision-making and operational efficiency across various healthcare stakeholders.




    The surge in the Data Quality Rules Engines for Health Data market is fundamentally propelled by the exponential growth in healthcare data volume and complexity. With the proliferation of electronic health records (EHRs), digital claims, and patient management systems, healthcare providers and payers face mounting challenges in ensuring the integrity, accuracy, and consistency of their data assets. Data quality rules engines are increasingly being deployed to automate validation, standardization, and error detection processes, thereby reducing manual intervention, minimizing costly errors, and supporting seamless interoperability across disparate health IT systems. Furthermore, the growing trend of value-based care models and data-driven clinical research underscores the strategic importance of high-quality health data, further fueling market demand.




    Another significant growth factor is the tightening regulatory landscape surrounding health data privacy, security, and reporting requirements. Regulatory frameworks such as HIPAA in the United States, GDPR in Europe, and various local data protection laws globally, mandate stringent data governance and auditability. Data quality rules engines help healthcare organizations proactively comply with these regulations by embedding automated rules that enforce data accuracy, completeness, and traceability. This not only mitigates compliance risks but also enhances organizational reputation and patient trust. Additionally, the increasing adoption of cloud-based health IT solutions is making advanced data quality management tools more accessible to organizations of all sizes, further expanding the addressable market.




    Technological advancements in artificial intelligence (AI), machine learning (ML), and natural language processing (NLP) are also transforming the capabilities of data quality rules engines. Modern solutions are leveraging these technologies to intelligently identify data anomalies, suggest rule optimizations, and adapt to evolving data standards. This level of automation and adaptability is particularly critical in the healthcare domain, where data sources are highly heterogeneous and prone to frequent updates. The integration of AI-driven data quality engines with clinical decision support systems, population health analytics, and regulatory reporting platforms is creating new avenues for innovation and efficiency. Such advancements are expected to further accelerate market growth over the forecast period.




    Regionally, North America continues to dominate the Data Quality Rules Engines for Health Data market, owing to its mature healthcare IT infrastructure, high regulatory compliance standards, and significant investments in digital health transformation. However, the Asia Pacific region is emerging as the fastest-growing market, driven by large-scale healthcare digitization initiatives, increasing healthcare expenditure, and a rising focus on data-driven healthcare delivery. Europe also holds a substantial market share, supported by strong regulatory frameworks and widespread adoption of electronic health records. Meanwhile, Latin America and the Middle East & Africa are witnessing steady growth as healthcare providers in these regions increasingly recognize the value of data quality management in improving patient outcomes and operational efficiency.





    Component Analysis



    The Component

  8. Data Quality Management Report

    • archivemarketresearch.com
    doc, pdf, ppt
    Updated Feb 25, 2025
    Cite
    Archive Market Research (2025). Data Quality Management Report [Dataset]. https://www.archivemarketresearch.com/reports/data-quality-management-46975
    Explore at:
    Available download formats: pdf, doc, ppt
    Dataset updated
    Feb 25, 2025
    Dataset authored and provided by
    Archive Market Research
    License

    https://www.archivemarketresearch.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    Market Size and Growth Projections: The global Data Quality Management (DQM) market is projected to reach a market size of USD XX million by 2033, with a CAGR of XX% over the forecast period (2025-2033). The market growth is attributed to the increasing demand for accurate and reliable data for business decision-making, regulatory compliance, and customer satisfaction.

    Key Drivers and Trends: The key drivers of the DQM market include the proliferation of big data, the need to improve data accuracy and quality, and the adoption of data governance and compliance regulations. Trends shaping the market include the shift to cloud-based DQM solutions, the integration of artificial intelligence (AI) and machine learning (ML), and the increasing focus on data privacy. The market is segmented by type (on-premises and SaaS), application (BFSI, healthcare, retail, etc.), and region (North America, Asia Pacific, Europe, etc.).

    Data Quality Management: A Comprehensive Overview
    Data quality management (DQM) refers to the active process of ensuring the accuracy, completeness, and consistency of data. In today's data-driven world, reliable data is crucial for effective decision-making and business success. This report provides a comprehensive analysis of the DQM industry, exploring its key characteristics, market segments, regional insights, and future trends.

  9. Comparison of data set completeness.

    • datasetcatalog.nlm.nih.gov
    Updated Aug 14, 2015
    Cite
    Wilke, Claus O.; Marcotte, Edward M.; Boutz, Daniel R.; Sridhara, Viswanadham; Carroll, Sean M.; Houser, John R.; Sydykova, Dariya K.; Dasgupta, Aurko; Barnhart, Craig; Marx, Christopher J.; Barrick, Jeffrey E.; Trent, M. Stephen; Needham, Brittany D.; Papoulas, Ophelia; Michener, Joshua K. (2015). Comparison of data set completeness. [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001878402
    Explore at:
    Dataset updated
    Aug 14, 2015
    Authors
    Wilke, Claus O.; Marcotte, Edward M.; Boutz, Daniel R.; Sridhara, Viswanadham; Carroll, Sean M.; Houser, John R.; Sydykova, Dariya K.; Dasgupta, Aurko; Barnhart, Craig; Marx, Christopher J.; Barrick, Jeffrey E.; Trent, M. Stephen; Needham, Brittany D.; Papoulas, Ophelia; Michener, Joshua K.
    Description

    *We counted proteins as observed if they appeared in at least 1 of 3 biological repeats, whereas Ref. [4] counted proteins that appeared in at least 1 of 2 biological repeats.
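    The counting rule in the note above amounts to the following trivial sketch (an illustration, not code from the dataset), with the detection threshold as a parameter so the Ref. [4] convention can also be reproduced:

```python
def protein_observed(detected_in_repeats: list[bool], min_repeats: int = 1) -> bool:
    """A protein counts as observed if it was detected in at least `min_repeats`
    biological repeats (this study: at least 1 of 3; Ref. [4]: at least 1 of 2)."""
    return sum(detected_in_repeats) >= min_repeats
```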

  10. Analytical Standards Market Analysis, Size, and Forecast 2025-2029: North...

    • technavio.com
    pdf
    Updated Jul 24, 2025
    Cite
    Technavio (2025). Analytical Standards Market Analysis, Size, and Forecast 2025-2029: North America (US and Canada), Europe (France, Germany, Italy, and UK), APAC (China, India, Japan, and South Korea), and Rest of World (ROW) [Dataset]. https://www.technavio.com/report/analytical-standards-market-industry-analysis
    Explore at:
    Available download formats: pdf
    Dataset updated
    Jul 24, 2025
    Dataset provided by
    TechNavio
    Authors
    Technavio
    License

    https://www.technavio.com/content/privacy-notice

    Time period covered
    2025 - 2029
    Area covered
    Japan, United Kingdom, Germany, United States, Canada
    Description


    Analytical Standards Market Size 2025-2029

    The analytical standards market size is expected to increase by USD 734.1 million at a CAGR of 7.1% from 2024 to 2029. Rapid growth in the life sciences industry will drive the analytical standards market.

    Market Insights

    North America dominated the market and is expected to account for 50% of the market's growth during 2025-2029.
    By Type - Chromatography segment was valued at USD 509.10 million in 2023
    By Application - Food and beverages segment accounted for the largest market revenue share in 2023
    

    Market Size & Forecast

    Market Opportunities: USD 63.57 million 
    Market Future Opportunities 2024: USD 734.10 million
    CAGR from 2024 to 2029: 7.1%
    

    Market Summary

    The market is experiencing significant growth, driven primarily by the expanding life sciences industry. These standards play a crucial role in ensuring the accuracy and consistency of analytical results, making them indispensable in various sectors such as pharmaceuticals, food and beverage, and environmental testing. The increasing adoption of customized analytical standards caters to the unique requirements of specific applications, further fueling market expansion. However, the market faces challenges, including the limited shelf life of analytical standards, which necessitates frequent replenishment. In a real-world business scenario, a global supply chain for a pharmaceutical company relies on a steady supply of analytical standards to maintain operational efficiency and ensure compliance with regulatory standards.
    Ensuring a consistent supply of high-quality standards is essential for the company's success, as any deviation could lead to costly delays or even product recalls. To address these challenges, market participants focus on innovation, such as developing stable, long-lasting standards, and improving supply chain management strategies.
    

    What will be the size of the Analytical Standards Market during the forecast period?


    The market is a dynamic and ever-evolving industry, driven by the increasing demand for accurate and reliable data in various sectors. According to recent studies, the market is witnessing significant growth, with an estimated 12% increase in demand for analytical standards in the pharmaceutical industry alone. This trend is attributed to the stringent regulatory requirements and the need for compliance with Good Laboratory Practice (GLP) and Good Manufacturing Practice (GMP) guidelines. Moreover, the adoption of advanced technologies such as data management systems, precision limits, and calibration intervals is transforming the way analytical standards are used in laboratories. For instance, virtual assistants and automation tools are increasingly being used to streamline analytical workflows and improve system performance.
    The integration of statistical software and data analysis tools is also enabling more efficient data management and risk assessment procedures. In addition, method comparison studies and performance verification are crucial for ensuring accuracy and reducing measurement error. ISO standards and quality system elements are essential for maintaining data integrity and ensuring that analytical results meet the required accuracy criteria. Instrument maintenance and quality assurance are also critical for ensuring the reliability and consistency of analytical results. Overall, the market is poised for continued growth, driven by the need for accurate and reliable data in various industries, and the increasing adoption of advanced technologies to improve analytical workflows and ensure regulatory compliance.
    

    Unpacking the Analytical Standards Market Landscape

    In the realm of business operations, precision measurement plays a pivotal role in ensuring consistency and accuracy. The adoption of validation protocols and reference materials has led to a significant reduction in errors, with a reported 30% decrease in system suitability testing failures. Quality control metrics, such as precision evaluation and error analysis, have been instrumental in enhancing regulatory compliance and aligning with quality management systems. Laboratories employing calibration procedures and traceability standards have demonstrated a 25% improvement in instrument performance, leading to substantial cost savings. Analytical techniques, statistical process control, and performance indicators are integral to data integrity management and audit trails, enabling method validation studies and sample preparation methods to yield reliable results. Instrument calibration, method development, and documentation control are essential components of quality assurance systems, ensuring the accuracy of data processing software and uncertainty estimation. Ultimately, these practices contribute to the reproducibility of results and the effectiveness of quality control chart

  11. Unemployment Insurance Benefit Accuracy Measurement (BAM) Data

    • catalog.data.gov
    • s.cnmilf.com
    Updated Apr 18, 2024
    + more versions
    Cite
    Employment and Training Administration (2024). Unemployment Insurance Benefit Accuracy Measurement (BAM) Data [Dataset]. https://catalog.data.gov/dataset/unemployment-insurance-benefit-accuracy-measurement-bam-data
    Explore at:
    Dataset updated
    Apr 18, 2024
    Dataset provided by
    Employment and Training Administration (https://www.dol.gov/agencies/eta)
    Description

    This dataset includes the historical series of sample Unemployment Insurance (UI) data collected through the Benefit Accuracy Measurement (BAM) program. BAM is a statistical survey used to identify and support resolution of deficiencies in state UI systems, as well as to estimate state UI improper payments to be reported to DOL as required by the Improper Payments Information Act (IPIA) and the Improper Payments Elimination and Recovery Act (IPERA). BAM is also used to identify the root causes of improper payments and supports other analyses conducted by DOL to highlight improper payment prevention strategies and measure progress in meeting improper payment reduction targets.

  12. Data Quality Management Tool Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Sep 21, 2025
    + more versions
    Cite
    Data Insights Market (2025). Data Quality Management Tool Report [Dataset]. https://www.datainsightsmarket.com/reports/data-quality-management-tool-1426872
    Explore at:
    Available download formats: pdf, doc, ppt
    Dataset updated
    Sep 21, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The global Data Quality Management (DQM) tool market is poised for steady growth, projected to reach approximately $694.1 million by 2025, with a Compound Annual Growth Rate (CAGR) of 3.4% expected to continue through 2033. This expansion is fueled by the escalating need for reliable and accurate data across all business functions. Organizations are increasingly recognizing that poor data quality directly impacts decision-making, operational efficiency, customer satisfaction, and regulatory compliance. As businesses generate and process ever-larger volumes of data from diverse sources, the imperative to cleanse, standardize, enrich, and monitor this data becomes paramount. The market is witnessing a significant surge in demand for DQM solutions that can handle complex data integration challenges and provide robust profiling and governance capabilities.

    The DQM market is being shaped by several key trends and drivers. A primary driver is the growing adoption of Big Data analytics and Artificial Intelligence (AI)/Machine Learning (ML), which heavily rely on high-quality data for accurate insights and predictive modeling. Furthermore, stringent data privacy regulations such as GDPR and CCPA are compelling organizations to invest in DQM tools to ensure data accuracy and compliance. The shift towards cloud-based solutions is another significant trend, offering scalability, flexibility, and cost-effectiveness. While on-premise solutions still hold a share, cloud adoption is rapidly gaining momentum. The market is segmented by application, with both Small and Medium-sized Enterprises (SMEs) and Large Enterprises demonstrating a growing need for effective DQM. Companies are increasingly investing in DQM as a strategic imperative rather than a purely tactical solution, underscoring its importance in the digital transformation journey.

    This report provides an in-depth analysis of the global Data Quality Management (DQM) Tool market, a critical segment of the data management landscape. The study encompasses a comprehensive historical period from 2019 to 2024, with the base year set for 2025 and an estimated year also in 2025. The forecast period extends from 2025 to 2033, offering valuable insights into future market trajectories. The DQM tool market is projected to witness significant expansion, with the global market size estimated to reach $12,500 million by 2025 and potentially exceeding $25,000 million by 2033. This growth is fueled by the increasing recognition of data as a strategic asset and the imperative for organizations to ensure data accuracy, completeness, and consistency for informed decision-making, regulatory compliance, and enhanced customer experiences.

  13. Data from: The effect of network size and sampling completeness in...

    • datasetcatalog.nlm.nih.gov
    • figshare.com
    Updated Oct 12, 2018
    Cite
    Chown, Steven L.; Chapple, David G.; Henriksen, Marie Vestergaard; McGeoch, Melodie (2018). Data from: The effect of network size and sampling completeness in depauperate networks [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000675394
    Explore at:
    Dataset updated
    Oct 12, 2018
    Authors
    Chown, Steven L.; Chapple, David G.; Henriksen, Marie Vestergaard; McGeoch, Melodie
    Description

    The data consist of gall wasps and natural enemies that emerged from galls sampled across thirteen subwebs. Full detail is given in Henriksen, M.V., Chapple, D.G., Chown, S.L., & McGeoch, M.A. (2018): "The effect of network size and sampling completeness in depauperate networks". Journal of Animal Ecology, https://doi.org/10.1111/1365-2656.12912

  14. Data from: Reporting of measures of accuracy in systematic reviews of...

    • catalog.data.gov
    • odgavaprod.ogopendata.com
    Updated Sep 7, 2025
    Cite
    National Institutes of Health (2025). Reporting of measures of accuracy in systematic reviews of diagnostic literature [Dataset]. https://catalog.data.gov/dataset/reporting-of-measures-of-accuracy-in-systematic-reviews-of-diagnostic-literature
    Explore at:
    Dataset updated
    Sep 7, 2025
    Dataset provided by
    National Institutes of Health
    Description

    Background: There are a variety of ways in which accuracy of clinical tests can be summarised in systematic reviews. Variation in reporting of summary measures has only been assessed in a small survey restricted to meta-analyses of screening studies found in a single database. Therefore, we performed this study to assess the measures of accuracy used for reporting results of primary studies as well as their meta-analysis in systematic reviews of test accuracy studies.

    Methods: Relevant reviews on test accuracy were selected from the Database of Abstracts of Reviews of Effectiveness (1994–2000), which electronically searches seven bibliographic databases and manually searches key resources. The structured abstracts of these reviews were screened and information on accuracy measures was extracted from the full texts of 90 relevant reviews, 60 of which used meta-analysis.

    Results: Sensitivity or specificity was used for reporting the results of primary studies in 65/90 (72%) reviews, predictive values in 26/90 (28%), and likelihood ratios in 20/90 (22%). For meta-analysis, pooled sensitivity or specificity was used in 35/60 (58%) reviews, pooled predictive values in 11/60 (18%), pooled likelihood ratios in 13/60 (22%), and pooled diagnostic odds ratio in 5/60 (8%). Summary ROC was used in 44/60 (73%) of the meta-analyses. There were no significant differences in measures of test accuracy among reviews published earlier (1994–97) and those published later (1998–2000).

    Conclusions: There is considerable variation in ways of reporting and summarising results of test accuracy studies in systematic reviews. There is a need for consensus about the best ways of reporting results of test accuracy studies in reviews.
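    For reference, the accuracy measures named in this review (sensitivity, specificity, predictive values, likelihood ratios, and the diagnostic odds ratio) can all be derived from a single primary study's 2x2 table; the sketch below is a generic illustration, not code from the dataset.

```python
def accuracy_measures(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard test-accuracy summaries from a single study's 2x2 table
    (tp/fp/fn/tn = true/false positives/negatives)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),                  # positive predictive value
        "NPV": tn / (tn + fn),                  # negative predictive value
        "LR+": sens / (1 - spec),               # positive likelihood ratio
        "LR-": (1 - sens) / spec,               # negative likelihood ratio
        "DOR": (tp * tn) / (fp * fn),           # diagnostic odds ratio
    }
```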

  15. Development Economics Data Group - Completeness of birth registration,...

    • gimi9.com
    Updated Jan 15, 2004
    + more versions
    Cite
    (2004). Development Economics Data Group - Completeness of birth registration, female (%) | gimi9.com [Dataset]. https://gimi9.com/dataset/worldbank_wb_wdi_sp_reg_brth_fe_zs/
    Explore at:
    Dataset updated
    Jan 15, 2004
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Completeness of birth registration is the percentage of children under age 5 whose births were registered at the time of the survey. The numerator of completeness of birth registration includes children whose birth certificate was seen by the interviewer or whose mother or caretaker says the birth has been registered.
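    As a worked illustration of the indicator definition above (function and variable names are hypothetical):

```python
def birth_registration_completeness(n_registered_under5: int, n_under5_surveyed: int) -> float:
    """Percentage of surveyed children under age 5 whose births were registered
    (certificate seen by the interviewer, or registration reported by the
    mother or caretaker)."""
    return 100.0 * n_registered_under5 / n_under5_surveyed

# Example: 450 of 500 surveyed under-fives registered -> 90.0
print(birth_registration_completeness(450, 500))
```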

  16. Network KPI Data Quality Platform Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 30, 2025
    Cite
    Dataintelo (2025). Network KPI Data Quality Platform Market Research Report 2033 [Dataset]. https://dataintelo.com/report/network-kpi-data-quality-platform-market
    Explore at:
    Available download formats: pptx, pdf, csv
    Dataset updated
    Sep 30, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Network KPI Data Quality Platform Market Outlook



    According to our latest research, the global Network KPI Data Quality Platform market size reached USD 1.92 billion in 2024, demonstrating strong momentum as digital transformation accelerates across industries. The market is projected to expand at a robust CAGR of 14.6% from 2025 to 2033, reaching a forecasted value of USD 6.16 billion by 2033. This remarkable growth is primarily driven by the rising demand for reliable network performance data, the proliferation of IoT and 5G networks, and the critical need for real-time analytics to optimize network operations and customer experiences.




    A key growth factor for the Network KPI Data Quality Platform market is the exponential increase in network complexity, spurred by the widespread adoption of cloud computing, edge devices, and remote work infrastructures. Organizations are facing unprecedented challenges in monitoring and maintaining network performance, making high-quality KPI data essential for troubleshooting, capacity planning, and proactive management. As networks become more dynamic and distributed, the accuracy, timeliness, and completeness of KPI data is pivotal for ensuring service reliability, reducing downtime, and meeting stringent SLAs. This has led to a surge in investments in advanced data quality platforms that can automate anomaly detection, data cleansing, and enrichment processes, thereby empowering IT and network teams to make data-driven decisions.




    Another significant driver is the integration of artificial intelligence and machine learning into Network KPI Data Quality Platforms. AI-powered solutions are enabling organizations to identify patterns, predict failures, and optimize resource allocation with unprecedented precision. The ability to process vast volumes of network data in real time and generate actionable insights is transforming network management practices across sectors such as telecommunications, BFSI, and healthcare. Furthermore, regulatory requirements around data integrity and compliance are compelling enterprises to adopt robust data quality frameworks, further propelling market growth. The convergence of AI, big data analytics, and automation is expected to accelerate the adoption of these platforms, especially among large enterprises and managed service providers.




    The rapid deployment of 5G networks and the ongoing digitalization of industries are also fueling the growth of the Network KPI Data Quality Platform market. Telecom operators are under mounting pressure to deliver ultra-reliable, low-latency services, which necessitates real-time monitoring and validation of network KPIs. Similarly, sectors such as manufacturing and healthcare are leveraging industrial IoT and connected devices, which require continuous performance monitoring to ensure operational efficiency and compliance. As a result, the demand for scalable, cloud-based data quality platforms that can seamlessly integrate with existing network management systems is witnessing a notable uptick. Vendors are responding by enhancing their offerings with advanced analytics, visualization, and API integration capabilities.




    Regionally, North America currently dominates the Network KPI Data Quality Platform market, accounting for the largest share in 2024, driven by a mature telecom infrastructure, high cloud adoption, and significant investments in network automation. Europe follows closely, with strong growth in sectors such as BFSI and manufacturing. The Asia Pacific region is expected to exhibit the fastest CAGR through 2033, fueled by rapid digitalization, burgeoning mobile subscriber bases, and government initiatives to modernize network infrastructure. Latin America and the Middle East & Africa are also emerging as promising markets, supported by increasing investments in telecom modernization and enterprise IT transformation.



    Component Analysis



    The Component segment of the Network KPI Data Quality Platform market is bifurcated into Software and Services, each playing a crucial role in the delivery of end-to-end data quality solutions. The software segment dominates the market, accounting for a majority share in 2024, as enterprises prioritize automation, scalability, and advanced analytics capabilities. Modern data quality software platforms offer features such as real-time data validation, anomaly detection, and customizable KPI dashboards, enabling organizati

  17. Data Quality Management Service Market Report | Global Forecast From 2025 To...

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 23, 2024
    Cite
    Dataintelo (2024). Data Quality Management Service Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-data-quality-management-service-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Sep 23, 2024
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Data Quality Management Service Market Outlook



    The global data quality management service market size was valued at approximately USD 1.8 billion in 2023 and is projected to reach USD 5.9 billion by 2032, growing at a compound annual growth rate (CAGR) of 14.1% during the forecast period. The primary growth factor driving this market is the increasing volume of data being generated across various industries, necessitating robust data quality management solutions to maintain data accuracy, reliability, and relevance.



    One of the key growth drivers for the data quality management service market is the exponential increase in data generation due to the proliferation of digital technologies such as IoT, big data analytics, and AI. Organizations are increasingly recognizing the importance of maintaining high data quality to derive actionable insights and make informed business decisions. Poor data quality can lead to significant financial losses, inefficiencies, and missed opportunities, thereby driving the demand for comprehensive data quality management services.



    Another significant growth factor is the rising regulatory and compliance requirements across various industry verticals such as BFSI, healthcare, and government. Regulations like the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA) necessitate organizations to maintain accurate and high-quality data. Non-compliance with these regulations can result in severe penalties and damage to the organization’s reputation, thus propelling the adoption of data quality management solutions.



    Additionally, the increasing adoption of cloud-based solutions is further fueling the growth of the data quality management service market. Cloud-based data quality management solutions offer scalability, flexibility, and cost-effectiveness, making them an attractive option for organizations of all sizes. The availability of advanced data quality management tools that integrate seamlessly with existing IT infrastructure and cloud platforms is encouraging enterprises to invest in these services to enhance their data management capabilities.



    From a regional perspective, North America is expected to hold the largest share of the data quality management service market, driven by the early adoption of advanced technologies and the presence of key market players. However, the Asia Pacific region is anticipated to witness the highest growth rate during the forecast period, owing to the rapid digital transformation, increasing investments in IT infrastructure, and growing awareness about the importance of data quality management in enhancing business operations and decision-making processes.



    Component Analysis



    The data quality management service market is segmented by component into software and services. The software segment encompasses various data quality tools and platforms that help organizations assess, improve, and maintain the quality of their data. These tools include data profiling, data cleansing, data enrichment, and data monitoring solutions. The increasing complexity of data environments and the need for real-time data quality monitoring are driving the demand for sophisticated data quality software solutions.



    Services, on the other hand, include consulting, implementation, and support services provided by data quality management service vendors. Consulting services assist organizations in identifying data quality issues, developing data governance frameworks, and implementing best practices for data quality management. Implementation services involve the deployment and integration of data quality tools with existing IT systems, while support services provide ongoing maintenance and troubleshooting assistance. The growing need for expert guidance and support in managing data quality is contributing to the growth of the services segment.



    The software segment is expected to dominate the market due to the continuous advancements in data quality management tools and the increasing adoption of AI and machine learning technologies for automated data quality processes. Organizations are increasingly investing in advanced data quality software to streamline their data management operations, reduce manual intervention, and ensure data accuracy and consistency across various data sources.



    Moreover, the services segment is anticipated to witness significant growth during the forecast period, driven by the increasing demand for professional services that can help organizations address complex dat

  18. Cloud Data Quality Monitoring and Testing Market Report | Global Forecast...

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 5, 2024
    Cite
    Dataintelo (2024). Cloud Data Quality Monitoring and Testing Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-cloud-data-quality-monitoring-and-testing-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Sep 5, 2024
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Cloud Data Quality Monitoring and Testing Market Outlook



    The global cloud data quality monitoring and testing market size was valued at USD 1.5 billion in 2023 and is expected to reach USD 4.8 billion by 2032, growing at a compound annual growth rate (CAGR) of 13.8% during the forecast period. This robust growth is driven by increasing cloud adoption across various industries, coupled with the rising need for ensuring data quality and compliance.



    One of the primary growth factors of the cloud data quality monitoring and testing market is the exponential increase in data generation and consumption. As organizations continue to integrate cloud solutions, the volume of data being processed and stored on the cloud has surged dramatically. This data influx necessitates stringent quality monitoring to ensure data integrity, accuracy, and consistency, thus driving the demand for advanced data quality solutions. Moreover, as businesses enhance their data-driven decision-making processes, the need for high-quality data becomes ever more critical, further propelling market growth.



    Another significant driver is the growing complexity of data architectures due to diverse data sources and types. The modern data environment is characterized by a mix of structured, semi-structured, and unstructured data originating from various sources like IoT devices, social media platforms, and enterprise applications. Ensuring the quality of such heterogeneous data sets requires sophisticated monitoring and testing tools that can seamlessly operate within cloud ecosystems. Consequently, organizations are increasingly investing in cloud data quality solutions to manage this complexity, thereby fueling market expansion.



    Compliance and regulatory requirements also play a pivotal role in the growth of the cloud data quality monitoring and testing market. Industries such as BFSI, healthcare, and government are subject to stringent data governance and privacy regulations that mandate regular auditing and validation of data quality. Failure to comply with these regulations can result in severe penalties and reputational damage. Hence, companies are compelled to adopt cloud data quality monitoring and testing solutions to ensure compliance and mitigate risks associated with data breaches and inaccuracies.



    From a regional perspective, North America dominates the market due to its advanced IT infrastructure and early adoption of cloud technologies. However, significant growth is also expected in the Asia Pacific region, driven by rapid digital transformation initiatives and increasing investments in cloud infrastructure by emerging economies like China and India. Europe also presents substantial growth opportunities, with industries embracing cloud solutions to enhance operational efficiency and innovation. The regional dynamics indicate a wide-ranging impact of cloud data quality monitoring and testing solutions across the globe.



    Component Analysis



    The cloud data quality monitoring and testing market is broadly segmented into software and services. The software segment encompasses various tools and platforms designed to automate and streamline data quality monitoring processes. These solutions include data profiling, data cleansing, data integration, and master data management software. The demand for such software is on the rise due to its ability to provide real-time insights into data quality issues, thereby enabling organizations to take proactive measures in addressing discrepancies. Advanced software solutions often leverage AI and machine learning algorithms to enhance data accuracy and predictive capabilities.



    The services segment is equally crucial, offering a gamut of professional and managed services to support the implementation and maintenance of data quality monitoring systems. Professional services include consulting, system integration, and training services, which help organizations in the seamless adoption of data quality tools and best practices. Managed services, on the other hand, provide ongoing support and maintenance, ensuring that data quality standards are consistently met. As organizations seek to optimize their cloud data environments, the demand for comprehensive service offerings is expected to rise, driving market growth.



    One of the key trends within the component segment is the increasing integration of software and services to offer holistic data quality solutions. Vendors are increasingly bundling their software products with complementary services, providing a one-stop solution that covers all aspects of data quality managem

  19. Data Quality Tools Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Aug 4, 2025
    Cite
    Growth Market Reports (2025). Data Quality Tools Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/data-quality-tools-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Aug 4, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Data Quality Tools Market Outlook



    According to our latest research, the global Data Quality Tools market size reached USD 2.65 billion in 2024, reflecting robust demand across industries for solutions that ensure data accuracy, consistency, and reliability. The market is poised to expand at a CAGR of 17.6% from 2025 to 2033, driven by increasing digital transformation initiatives, regulatory compliance requirements, and the exponential growth of enterprise data. By 2033, the Data Quality Tools market is forecasted to attain a value of USD 12.06 billion, as organizations worldwide continue to prioritize data-driven decision-making and invest in advanced data management solutions.




    A key growth factor propelling the Data Quality Tools market is the proliferation of data across diverse business ecosystems. Enterprises are increasingly leveraging big data analytics, artificial intelligence, and cloud computing, all of which demand high-quality data as a foundational element. The surge in unstructured and structured data from various sources such as customer interactions, IoT devices, and business operations has made data quality management a strategic imperative. Organizations recognize that poor data quality can lead to erroneous insights, operational inefficiencies, and compliance risks. As a result, the adoption of comprehensive Data Quality Tools for data profiling, cleansing, and enrichment is accelerating, particularly among industries with high data sensitivity like BFSI, healthcare, and retail.




    Another significant driver for the Data Quality Tools market is the intensifying regulatory landscape. Data privacy laws such as the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and other country-specific mandates require organizations to maintain high standards of data integrity and traceability. Non-compliance can result in substantial financial penalties and reputational damage. Consequently, businesses are investing in sophisticated Data Quality Tools that provide automated monitoring, data lineage, and audit trails to ensure regulatory adherence. This regulatory push is particularly prominent in sectors like finance, healthcare, and government, where the stakes for data accuracy and security are exceptionally high.




    Advancements in cloud technology and the growing trend of digital transformation across enterprises are also fueling market growth. Cloud-based Data Quality Tools offer scalability, flexibility, and cost-efficiency, enabling organizations to manage data quality processes remotely and in real-time. The shift towards Software-as-a-Service (SaaS) models has lowered the entry barrier for small and medium enterprises (SMEs), allowing them to implement enterprise-grade data quality solutions without substantial upfront investments. Furthermore, the integration of machine learning and artificial intelligence capabilities into data quality platforms is enhancing automation, reducing manual intervention, and improving the overall accuracy and efficiency of data management processes.




    From a regional perspective, North America continues to dominate the Data Quality Tools market due to its early adoption of advanced technologies, a mature IT infrastructure, and the presence of leading market players. However, the Asia Pacific region is emerging as a high-growth market, driven by rapid digitalization, increasing investments in IT, and a burgeoning SME sector. Europe maintains a strong position owing to stringent data privacy regulations and widespread enterprise adoption of data management solutions. Latin America and the Middle East & Africa, while relatively nascent, are witnessing growing awareness and adoption, particularly in the banking, government, and telecommunications sectors.





    Component Analysis



    The Component segment of the Data Quality Tools market is bifurcated into software and services. Software dominates the segment, accounting for a significant share of the global market revenue in 2024. This dominance is

  20. CheckM v1 reference data

    • zenodo.org
    • data.niaid.nih.gov
    application/gzip
    Updated Dec 6, 2022
    Cite
    Donovan Parks; Donovan Parks (2022). CheckM v1 reference data [Dataset]. http://doi.org/10.5281/zenodo.7401545
    Explore at:
    Available download formats: application/gzip
    Dataset updated
    Dec 6, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Donovan Parks; Donovan Parks
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Precalculated data files for CheckM v1.

    Manuscript:

    CheckM: assessing the quality of microbial genomes recovered from isolates, single cells, and metagenomes.
    https://genome.cshlp.org/content/early/2015/05/14/gr.186072.114

    Installation instructions:
    https://github.com/Ecogenomics/CheckM/wiki/Installation#how-to-install-checkm
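    A minimal sketch of how this reference bundle might be unpacked and registered with CheckM v1 is given below; the archive name and destination path are placeholders, and the linked installation wiki documents the exact, version-specific steps (newer CheckM v1 releases read the CHECKM_DATA_PATH environment variable, while older ones use `checkm data setRoot`).

```python
import os
import subprocess
import tarfile

data_dir = "/path/to/checkm_data"           # placeholder destination directory
archive = "checkm_reference_data.tar.gz"    # placeholder name for the downloaded bundle

os.makedirs(data_dir, exist_ok=True)
with tarfile.open(archive) as tf:
    tf.extractall(data_dir)                 # unpack the precalculated data files

# Point CheckM at the unpacked data: newer v1 releases read CHECKM_DATA_PATH,
# while older releases register the location via `checkm data setRoot`.
os.environ["CHECKM_DATA_PATH"] = data_dir
subprocess.run(["checkm", "data", "setRoot", data_dir], check=True)
```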
