77 datasets found
  1. Data from: Statistical Process Control as a Tool for Quality Improvement A...

    • figshare.com
    docx
    Updated Feb 23, 2023
    Cite
    Canberk Elmalı; Özge Ural (2023). Statistical Process Control as a Tool for Quality Improvement A Case Study in Denim Pant Production [Dataset]. http://doi.org/10.6084/m9.figshare.22147508.v2
    Available download formats: docx
    Dataset updated
    Feb 23, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Canberk Elmalı; Özge Ural
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this paper, the concept of Statistical Process Control tools is thoroughly examined and the definitions of quality control concepts are presented. This is significant because it is anticipated that the study will contribute to the literature as an exemplary application demonstrating the role of statistical process control (SPC) tools in quality improvement during the evaluation and decision-making phase.

    The aim of this study is to investigate applications of quality control, to clarify statistical control methods and problem-solving procedures, to generate proposals for problem-solving approaches, and to disseminate improvement studies in the ready-to-wear industry. Using basic Statistical Process Control tools, the most frequently recurring faults were detected and divided into sub-headings for more detailed analysis; in this way, the repetition of faults was prevented by tracing every detected fault back to its root causes. With this different perspective, the study is expected to contribute to other fields as well.
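
    As a hedged illustration of this kind of fault analysis (not the authors' actual data or tooling), the sketch below tallies hypothetical fault categories and computes the cumulative percentages used in a Pareto-style ranking; every category name and count is invented.

        # Minimal Pareto-style fault analysis sketch (hypothetical categories and counts).
        from collections import Counter

        # Hypothetical inspection records: one fault label per defective denim pant.
        faults = ["broken stitch"] * 42 + ["skipped stitch"] * 27 + ["stain"] * 15 + \
                 ["open seam"] * 9 + ["shade variation"] * 7

        counts = Counter(faults).most_common()          # sort categories by frequency
        total = sum(n for _, n in counts)

        cumulative = 0
        for category, n in counts:
            cumulative += n
            print(f"{category:17s} {n:4d}  {100 * cumulative / total:5.1f}% cumulative")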

    We give consent for the publication of identifiable details, which can include photograph(s) and case history and details within the text (“Material”), to be published in the Journal of Quality Technology. We confirm that we have seen and been given the opportunity to read both the Material and the Article (as attached) to be published by Taylor & Francis.

  2. Statistical Process Control Benchmark Dataset

    • data.niaid.nih.gov
    Updated Aug 15, 2023
    Cite
    Schulze, Tobias; Louis, Huebser (2023). Statistical Process Control Benchmark Dataset [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_8246620
    Dataset updated
    Aug 15, 2023
    Dataset provided by
    Laboratory for Machine Tools and Production Engineering (WZL), RWTH Aachen University
    Authors
    Schulze, Tobias; Louis, Huebser
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Data for benchmarking SPC against other process monitoring methods. The data consist of a one-dimensional time series of floats (x.csv). Additionally, information on whether the data are within the specifications is provided as another time series (y.csv). The data are generated by solving an optimization problem for each time step so as to produce a mixture of different probability distributions; one record is then sampled per time step. Inputs to the optimization problem are the given probability distributions, the lower and upper limits of the tolerance interval, and the desired median of the data. Additionally, weights of the different probability distributions can be given as boundary conditions for the different time steps. Metadata generated during the solving are stored in k_matrix.csv (weights at each time step) and distribs (probability distribution objects). The data consist of phases drawn from a stable mixture distribution and phases drawn from a mixture distribution that does not fulfill the stability criteria.
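
    A minimal sketch of how such a dataset might be loaded and screened with a basic Shewhart-style individuals chart is shown below. File names follow the description (x.csv, y.csv), but the exact column layout, the absence of header rows, and the choice of reference window are assumptions.

        # Sketch: load the benchmark time series and flag points outside 3-sigma limits.
        # Assumes x.csv / y.csv each hold a single column without headers (layout assumed).
        import pandas as pd

        x = pd.read_csv("x.csv", header=None).iloc[:, 0]   # measured values
        y = pd.read_csv("y.csv", header=None).iloc[:, 0]   # within-specification labels

        # Estimate limits from an assumed stable reference window (first 200 points).
        ref = x.iloc[:200]
        center, sigma = ref.mean(), ref.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        out_of_control = (x > ucl) | (x < lcl)
        print(f"{out_of_control.sum()} of {len(x)} points fall outside the 3-sigma limits")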

  3. Data from: A change-point–based control chart for detecting sparse mean...

    • tandf.figshare.com
    txt
    Updated Jan 17, 2024
    Cite
    Zezhong Wang; Inez Maria Zwetsloot (2024). A change-point–based control chart for detecting sparse mean changes in high-dimensional heteroscedastic data [Dataset]. http://doi.org/10.6084/m9.figshare.24441804.v1
    Available download formats: txt
    Dataset updated
    Jan 17, 2024
    Dataset provided by
    Taylor & Francis
    Authors
    Zezhong Wang; Inez Maria Zwetsloot
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Because of the “curse of dimensionality,” high-dimensional processes present challenges to traditional multivariate statistical process monitoring (SPM) techniques. In addition, the unknown underlying distribution of and complicated dependency among variables, such as heteroscedasticity, increase the uncertainty of estimated parameters and decrease the effectiveness of control charts. Moreover, the requirement of sufficient reference samples limits the application of traditional charts in high-dimension, low-sample-size scenarios (small n, large p). More difficulties appear when detecting and diagnosing abnormal behaviors caused by a small set of variables (i.e., sparse changes). In this article, we propose two change-point–based control charts to detect sparse shifts in the mean vector of high-dimensional heteroscedastic processes. Our proposed methods can start monitoring when the number of observations is much smaller than the dimensionality. The simulation results show that the proposed methods are robust to nonnormality and heteroscedasticity. Two real data examples are used to illustrate the effectiveness of the proposed control charts in high-dimensional applications. The R codes are provided online.

  4. Data from: An Intrinsic Geometrical Approach for Statistical Process Control...

    • datasetcatalog.nlm.nih.gov
    • tandf.figshare.com
    Updated Sep 15, 2021
    Cite
    Zhao, Xueqi; del Castillo, Enrique (2021). An Intrinsic Geometrical Approach for Statistical Process Control of Surface and Manifold Data [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000779368
    Dataset updated
    Sep 15, 2021
    Authors
    Zhao, Xueqi; del Castillo, Enrique
    Description

    We present a new method for statistical process control (SPC) of a discrete part manufacturing system based on intrinsic geometrical properties of the parts, estimated from three-dimensional sensor data. An intrinsic method has the computational advantage of avoiding the difficult part registration problem, necessary in previous SPC approaches of three-dimensional geometrical data, but inadequate if noncontact sensors are used. The approach estimates the spectrum of the Laplace–Beltrami (LB) operator of the scanned parts and uses a multivariate nonparametric control chart for online process control. Our proposal brings SPC closer to computer vision and computer graphics methods aimed to detect large differences in shape (but not in size). However, the SPC problem differs in that small changes in either shape or size of the parts need to be detected, keeping a controllable false alarm rate and without completely filtering noise. An online or “Phase II” method and a scheme for starting up in the absence of prior data (“Phase I”) are presented. Comparison with earlier approaches that require registration shows the LB spectrum method to be more sensitive to rapidly detect small changes in shape and size, including the practical case when the sequence of part datasets is in the form of large, unequal size meshes. A post-alarm diagnostic method to investigate the location of defects on the surface of a part is also presented. While we focus in this article on surface (triangulation) data, the methods can also be applied to point cloud and voxel metrology data.

  5. Statistical Process Control Benchmark Dataset

    • zenodo.org
    zip
    Updated Apr 15, 2024
    Cite
    Tobias Schulze; Huebser Louis (2024). Statistical Process Control Benchmark Dataset [Dataset]. http://doi.org/10.5281/zenodo.8249487
    Available download formats: zip
    Dataset updated
    Apr 15, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Tobias Schulze; Huebser Louis
    License

    MIT License, https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    Datasets accompanying the planned publication "Generalized Statistical Process Control via 1D-ResNet Pretraining" by Tobias Schulze, Louis Huebser, Sebastian Beckschulte and Robert H. Schmitt (Chair for Intelligence in Quality Sensing, Laboratory for Machine Tools and Production Engineering, WZL of RWTH Aachen University).

    Data for benchmarking SPC against other process monitoring methods. The data consist of a one-dimensional time series of floats (x.csv). Additionally, information on whether the data are within the specifications is provided as another time series (y.csv). The data are generated by solving an optimization problem for each time step so as to produce a mixture of different probability distributions; one record is then sampled per time step. Inputs to the optimization problem are the given probability distributions, the lower and upper limits of the tolerance interval, and the desired median of the data. Additionally, weights of the different probability distributions can be given as boundary conditions for the different time steps. Metadata generated during the solving are stored in k_matrix.csv (weights at each time step) and distribs (probability distribution objects according to https://doi.org/10.5281/zenodo.8249487). The data consist of phases drawn from a stable mixture distribution and phases drawn from a mixture distribution that does not fulfill the stability criteria.

    The training data were used to train the G-SPC model; the test data were used for benchmarking purposes.

    Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy – EXC-2023 Internet of Production – 390621612.

  6. Data from: Transparent Sequential Learning for Statistical Process Control...

    • tandf.figshare.com
    zip
    Updated Nov 28, 2023
    Cite
    Peihua Qiu; Xiulin Xie (2023). Transparent Sequential Learning for Statistical Process Control of Serially Correlated Data [Dataset]. http://doi.org/10.6084/m9.figshare.14599925.v1
    Available download formats: zip
    Dataset updated
    Nov 28, 2023
    Dataset provided by
    Taylor & Francis (https://taylorandfrancis.com/)
    Authors
    Peihua Qiu; Xiulin Xie
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Machine learning methods have been widely used in different applications, including process control and monitoring. For handling statistical process control (SPC) problems, conventional supervised machine learning methods (e.g., artificial neural networks and support vector machines) would have some difficulties. For instance, a training dataset containing both in-control and out-of-control (OC) process observations is required by a supervised machine learning method, but it is rarely available in SPC applications. Furthermore, many machine learning methods work like black boxes. It is often difficult to interpret their learning mechanisms and the resulting decision rules in the context of an application. In the SPC literature, there have been some existing discussions on how to handle the lack of OC observations in the training data, using the one-class classification, artificial contrast, real-time contrast, and some other novel ideas. However, these approaches have their own limitations to handle SPC problems. In this article, we extend the self-starting process monitoring idea that has been employed widely in modern SPC research to a general learning framework for monitoring processes with serially correlated data. Under the new framework, process characteristics to learn are well specified in advance, and process learning is sequential in the sense that the learned process characteristics keep being updated during process monitoring. The learned process characteristics are then incorporated into a control chart for detecting process distributional shift based on all available data by the current observation time. Numerical studies show that process monitoring based on the new learning framework is more reliable and effective than some representative existing machine learning SPC approaches.
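
    The self-starting idea referenced here can be illustrated with a simple sketch (a generic textbook-style illustration, not the authors' proposed framework): each new observation is standardized using parameter estimates computed only from earlier observations, and the estimates are then updated with that observation.

        # Generic self-starting monitoring sketch: standardize each observation with
        # running estimates from earlier data, then update the estimates.
        import math

        def self_starting_scores(data, warmup=10):
            scores = []
            n, mean, m2 = 0, 0.0, 0.0          # Welford running mean / sum of squares
            for x in data:
                if n >= warmup:
                    sd = math.sqrt(m2 / (n - 1))
                    scores.append((x - mean) / sd)   # score based on past data only
                n += 1
                delta = x - mean
                mean += delta / n
                m2 += delta * (x - mean)
            return scores

        print(self_starting_scores([10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.3, 10.0, 9.9, 12.5]))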

  7. Data from: Dripper testing: Application of statistical quality control for...

    • figshare.com
    jpeg
    Updated Jun 2, 2023
    Cite
    Hermes S. da Rocha; Patricia A. A. Marques; Antonio P. de Camargo; Douglas L. dos Reis; Eric A. da Silva; José A. Frizzone (2023). Dripper testing: Application of statistical quality control for measurement system analysis [Dataset]. http://doi.org/10.6084/m9.figshare.5670937.v1
    Available download formats: jpeg
    Dataset updated
    Jun 2, 2023
    Dataset provided by
    SciELO journals
    Authors
    Hermes S. da Rocha; Patricia A. A. Marques; Antonio P. de Camargo; Douglas L. dos Reis; Eric A. da Silva; José A. Frizzone
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Laboratory tests for technical evaluation or testing of irrigation material involve the measurement of many variables, as well as monitoring and control of test conditions. This study, carried out in 2016, aimed at using statistical quality control techniques to evaluate the results of dripper tests. Exponentially weighted moving average control charts and capability indices were elaborated for the measurement of the test pressure and water temperature, and a repeatability and reproducibility study (Gage R&R) of the flow measurement system was conducted using 10 replicates, in three work shifts (morning, afternoon and evening), with 25 emitters. Both the test pressure and the water temperature remained stable, with “excellent” performance for the pressure adjustment process governed by a proportional-integral-derivative controller. The variability between emitters was the component with the highest contribution to the total variance of the flow measurements, with 96.77% of the total variance due to the variability between parts. The measurement system was classified as “acceptable” or “approved” by the Gage R&R study, and no non-random causes of significant variability were identified in the test routine.
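
    For reference, the exponentially weighted moving average statistic mentioned in the description follows the standard recursion z_t = λ x_t + (1 − λ) z_{t−1}. The sketch below is a generic EWMA chart; the smoothing constant, target, sigma, and pressure readings are all assumed for illustration and are not taken from this dataset.

        # Generic EWMA control chart sketch (target, sigma, lambda and data are illustrative).
        import math

        def ewma_chart(data, target, sigma, lam=0.2, L=3.0):
            z = target
            signals = []
            for t, x in enumerate(data, start=1):
                z = lam * x + (1 - lam) * z
                half_width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
                signals.append(abs(z - target) > half_width)
            return signals

        pressure = [98.2, 99.1, 98.7, 99.4, 98.9, 101.8, 102.3, 102.9]  # hypothetical kPa readings
        print(ewma_chart(pressure, target=99.0, sigma=0.8))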

  8. Global Statistical Process Control Software Market Research Report: By...

    • wiseguyreports.com
    Updated Aug 15, 2025
    + more versions
    Cite
    (2025). Global Statistical Process Control Software Market Research Report: By Application (Manufacturing, Healthcare, Pharmaceuticals, Automotive, Food and Beverage), By Deployment Model (On-Premises, Cloud-Based, Hybrid), By End Use (Large Enterprises, Small and Medium Enterprises, Government), By Features (Data Analysis, Real-Time Monitoring, Reporting, Dashboarding) and By Regional (North America, Europe, South America, Asia Pacific, Middle East and Africa) - Forecast to 2035 [Dataset]. https://www.wiseguyreports.com/reports/statistical-process-control-software-market
    Dataset updated
    Aug 15, 2025
    License

    https://www.wiseguyreports.com/pages/privacy-policy

    Time period covered
    Aug 25, 2025
    Area covered
    Global
    Description
    BASE YEAR: 2024
    HISTORICAL DATA: 2019 - 2023
    REGIONS COVERED: North America, Europe, APAC, South America, MEA
    REPORT COVERAGE: Revenue Forecast, Competitive Landscape, Growth Factors, and Trends
    MARKET SIZE 2024: 3.96 (USD Billion)
    MARKET SIZE 2025: 4.25 (USD Billion)
    MARKET SIZE 2035: 8.5 (USD Billion)
    SEGMENTS COVERED: Application, Deployment Model, End Use, Features, Regional
    COUNTRIES COVERED: US, Canada, Germany, UK, France, Russia, Italy, Spain, Rest of Europe, China, India, Japan, South Korea, Malaysia, Thailand, Indonesia, Rest of APAC, Brazil, Mexico, Argentina, Rest of South America, GCC, South Africa, Rest of MEA
    KEY MARKET DYNAMICS: Increasing demand for quality control, Adoption of automation technologies, Growing manufacturing sector, Regulatory compliance requirements, Rising need for data-driven decisions
    MARKET FORECAST UNITS: USD Billion
    KEY COMPANIES PROFILED: Rockwell Automation, Tableau, Minitab, InfinityQS, Alteryx, Oracle, PI System, Statgraphics, SAP, Cisco, Hexagon, SAS, Siemens, Qualityze, MathWorks, IBM
    MARKET FORECAST PERIOD: 2025 - 2035
    KEY MARKET OPPORTUNITIES: Increased demand for quality management, Integration with IoT devices, Growth in manufacturing automation, Adoption of machine learning techniques, Expansion in emerging markets
    COMPOUND ANNUAL GROWTH RATE (CAGR): 7.2% (2025 - 2035)
  9. Global Statistical Process Control System (SPC) Market Industry Best...

    • statsndata.org
    excel, pdf
    Updated Oct 2025
    Cite
    Stats N Data (2025). Global Statistical Process Control System (SPC) Market Industry Best Practices 2025-2032 [Dataset]. https://www.statsndata.org/report/statistical-process-control-system-spc-market-70426
    Available download formats: excel, pdf
    Dataset updated
    Oct 2025
    Dataset authored and provided by
    Stats N Data
    License

    https://www.statsndata.org/how-to-order

    Area covered
    Global
    Description

    The Statistical Process Control System (SPC) market has emerged as a critical component in quality management and process optimization across various industries, significantly enhancing operational efficiency and product quality. SPC utilizes statistical methods and tools to monitor and control manufacturing process

  10. Practical implementation of an End-to-end methodology for SPC of 3-D part...

    • tandf.figshare.com
    txt
    Updated Sep 9, 2025
    Cite
    Yulin An; Xueqi Zhao; Enrique del Castillo (2025). Practical implementation of an End-to-end methodology for SPC of 3-D part geometry: A case study [Dataset]. http://doi.org/10.6084/m9.figshare.29630811.v1
    Available download formats: txt
    Dataset updated
    Sep 9, 2025
    Dataset provided by
    Taylor & Francis
    Authors
    Yulin An; Xueqi Zhao; Enrique del Castillo
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Del Castillo and Zhao have recently proposed a new methodology for the Statistical Process Control (SPC) of discrete parts whose 3-dimensional (3D) geometrical data are acquired with non-contact sensors. The approach is based on monitoring the spectrum of the Laplace–Beltrami (LB) operator of each scanned part estimated using finite element methods (FEM). The spectrum of the LB operator is an intrinsic summary of the geometry of a part, independent of the ambient space. Hence, registration of scanned parts is unnecessary when comparing them. The primary goal of this case study paper is to demonstrate the practical implementation of the spectral SPC methodology through multiple examples using real scanned parts acquired with an industrial-grade laser scanner, including 3D-printed parts and commercial parts. We discuss the scanned mesh preprocessing needed in practice, including the type of remeshing found to be most beneficial for the FEM computations. For each part type, both the “phase I” and “phase II” stages of the spectral SPC methodology are showcased. In addition, we provide a new principled method to determine the number of eigenvalues of the LB operator to consider for efficient SPC of a given part geometry, and present an improved algorithm to automatically define a region of interest, particularly useful for large meshes. Computer codes that implement every method discussed in this paper, as well as all scanned part datasets used in the case studies, are made available and explained in the supplementary materials.

  11. Global Statistical Process Control Software Market Economic and Social...

    • statsndata.org
    excel, pdf
    Updated Oct 2025
    Cite
    Stats N Data (2025). Global Statistical Process Control Software Market Economic and Social Impact 2025-2032 [Dataset]. https://www.statsndata.org/report/statistical-process-control-software-market-117582
    Available download formats: pdf, excel
    Dataset updated
    Oct 2025
    Dataset authored and provided by
    Stats N Data
    License

    https://www.statsndata.org/how-to-order

    Area covered
    Global
    Description

    The Statistical Process Control (SPC) Software market has increasingly become a cornerstone in quality management across various industries, including manufacturing, pharmaceuticals, and food processing. This specialized software leverages statistical methods to monitor and control production processes, ensuring tha

  12. Simulation for data-driven sensor delay estimation in industrial processes...

    • data.mendeley.com
    Updated Jan 16, 2024
    + more versions
    Cite
    Tim Offermans (2024). Simulation for data-driven sensor delay estimation in industrial processes using multivariate projection methods [Dataset]. http://doi.org/10.17632/32hv69mnj6.4
    Dataset updated
    Jan 16, 2024
    Authors
    Tim Offermans
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Source code and results for exactly reproducing the data and results of the simulation study reported in (to be published). Industrial sensor data are simulated and subjected to several data-driven methods for estimating temporal delays between the sensors. A new method is proposed that estimates these delays by optimizing multivariate correlations, and it is shown to be more accurate than the currently more standard method, which optimizes bivariate correlations.

    Updated on November 6th, 2023 after internal review by authors.
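
    The bivariate baseline against which the new method is compared can be sketched as follows: shift one sensor's series over a range of candidate lags and keep the lag that maximizes the pairwise correlation. This is a generic illustration, not the dataset's own code; the signal names, the simulated delay, and the lag range are invented.

        # Sketch of bivariate delay estimation: pick the lag maximizing the correlation
        # between two sensor signals (all data here are simulated for illustration).
        import numpy as np

        def estimate_delay(x, y, max_lag=20):
            best_lag, best_corr = 0, -np.inf
            for lag in range(1, max_lag + 1):
                corr = np.corrcoef(x[:-lag], y[lag:])[0, 1]   # y delayed by `lag` samples
                if corr > best_corr:
                    best_lag, best_corr = lag, corr
            return best_lag

        rng = np.random.default_rng(0)
        upstream = rng.normal(size=500).cumsum()                              # simulated upstream sensor
        downstream = np.roll(upstream, 7) + rng.normal(scale=0.1, size=500)   # delayed copy with noise
        print(estimate_delay(upstream, downstream))                           # expected: about 7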

  13. Data from: A New Process Control Chart for Monitoring Short-Range Serially...

    • tandf.figshare.com
    txt
    Updated Feb 15, 2024
    Cite
    Peihua Qiu; Wendong Li; Jun Li (2024). A New Process Control Chart for Monitoring Short-Range Serially Correlated Data [Dataset]. http://doi.org/10.6084/m9.figshare.7624409.v1
    Available download formats: txt
    Dataset updated
    Feb 15, 2024
    Dataset provided by
    Taylor & Francis (https://taylorandfrancis.com/)
    Authors
    Peihua Qiu; Wendong Li; Jun Li
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Statistical process control (SPC) charts are critically important for quality control and management in manufacturing industries, environmental monitoring, disease surveillance, and many other applications. Conventional SPC charts are designed for cases when process observations are independent at different observation times. In practice, however, serial data correlation almost always exists in sequential data. It has been well demonstrated in the literature that control charts designed for independent data are unstable for monitoring serially correlated data. Thus, it is important to develop control charts specifically for monitoring serially correlated data. To this end, there is some existing discussion in the SPC literature. Most existing methods are based on parametric time series modeling and residual monitoring, where the data are often assumed to be normally distributed. In applications, however, the assumed parametric time series model with a given order and the normality assumption are often invalid, resulting in unstable process monitoring. Although there is some nice discussion on robust design of such residual monitoring control charts, the suggested designs can only handle certain special cases well. In this article, we try to make another effort by proposing a novel control chart that makes use of the restarting mechanism of a CUSUM chart and the related spring length concept. Our proposed chart uses observations within the spring length of the current time point and ignores all history data that are beyond the spring length. It does not require any parametric time series model and/or a parametric process distribution. It only requires the assumption that process observation at a given time point is associated with nearby observations and independent of observations that are far away in observation times, which should be reasonable for many applications. Numerical studies show that it performs well in different cases.
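
    The restarting mechanism of a standard CUSUM chart referenced above can be written as C_t^+ = max(0, C_{t−1}^+ + (x_t − μ0) − k): the statistic resets toward zero whenever the evidence for an upward shift disappears. The sketch below shows only this standard mechanism; it is not the authors' new chart, and the reference value, threshold, and data are assumptions.

        # Standard upper one-sided CUSUM with its restarting (reset-to-zero) mechanism.
        # Target mean, reference value k, threshold h, and data are all illustrative.
        def cusum_upper(data, mu0, k, h):
            c_plus, alarms = 0.0, []
            for t, x in enumerate(data):
                c_plus = max(0.0, c_plus + (x - mu0) - k)   # restart at 0 when evidence vanishes
                if c_plus > h:
                    alarms.append(t)
                    c_plus = 0.0                            # restart after a signal
            return alarms

        obs = [0.1, -0.3, 0.2, 0.0, 1.4, 1.1, 1.6, 1.2, 0.9]
        print(cusum_upper(obs, mu0=0.0, k=0.5, h=2.0))      # alarm at index 6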

  14. Additional file 2 of Prospective monitoring of imaging guideline adherence...

    • datasetcatalog.nlm.nih.gov
    • springernature.figshare.com
    Updated May 14, 2020
    Cite
    Kleer, Eduardo; Linsell, Susan; Qi, Ji; Denton, Brian; Inadomi, Michael; Dunn, Rodney; Singh, Karandeep; Montie, James; Hurley, Patrick; Ghani, Khurshid R. (2020). Additional file 2 of Prospective monitoring of imaging guideline adherence by physicians in a surgical collaborative: comparison of statistical process control methods for detecting outlying performance [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000593655
    Dataset updated
    May 14, 2020
    Authors
    Kleer, Eduardo; Linsell, Susan; Qi, Ji; Denton, Brian; Inadomi, Michael; Dunn, Rodney; Singh, Karandeep; Montie, James; Hurley, Patrick; Ghani, Khurshid R.
    Description

    Additional file 2. Ethics Statement Supplement. Contains details of MUSIC practices’ IRB status.

  15. Data from: Comparison of statistical methods and the use of quality control...

    • datasetcatalog.nlm.nih.gov
    • plos.figshare.com
    Updated Aug 30, 2018
    Cite
    Portier, Chris; Espín-Pérez, Almudena; de Kok, Theo M. C. M.; van Veldhoven, Karin; Chadeau-Hyam, Marc; Kleinjans, Jos C. S. (2018). Comparison of statistical methods and the use of quality control samples for batch effect correction in human transcriptome data [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000664759
    Dataset updated
    Aug 30, 2018
    Authors
    Portier, Chris; Espín-Pérez, Almudena; de Kok, Theo M. C. M.; van Veldhoven, Karin; Chadeau-Hyam, Marc; Kleinjans, Jos C. S.
    Description

    Batch effects are technical sources of variation introduced by the necessity of conducting gene expression analyses on different dates due to the large number of biological samples in population-based studies. The aim of this study is to evaluate the performances of linear mixed models (LMM) and Combat in batch effect removal. We also assessed the utility of adding quality control samples in the study design as technical replicates. In order to do so, we simulated gene expression data by adding “treatment” and batch effects to a real gene expression dataset. The performances of LMM and Combat, with and without quality control samples, are assessed in terms of sensitivity and specificity while correcting for the batch effect using a wide range of effect sizes, statistical noise, sample sizes and level of balanced/unbalanced designs. The simulations showed small differences among LMM and Combat. LMM identifies stronger relationships between big effect sizes and gene expression than Combat, while Combat identifies in general more true and false positives than LMM. However, these small differences can still be relevant depending on the research goal. When any of these methods are applied, quality control samples did not reduce the batch effect, showing no added value for including them in the study design.
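
    As a hedged sketch of the linear mixed model side of this comparison (not the authors' pipeline; the data frame, column names, and simulated effect sizes are invented), one gene's expression can be modeled with a fixed treatment effect and a random batch intercept, for example with statsmodels.

        # Sketch: test a treatment effect on one gene while absorbing batch as a random effect.
        # The data frame and column names are hypothetical; only the modeling pattern matters.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 120
        batch = np.repeat(["b1", "b2", "b3"], n // 3)
        treatment = np.tile([0, 1], n // 2)
        batch_shift = {"b1": 0.0, "b2": 0.6, "b3": -0.4}
        expression = 5.0 + 0.8 * treatment + np.array([batch_shift[b] for b in batch]) \
                     + rng.normal(scale=0.5, size=n)

        df = pd.DataFrame({"expression": expression, "treatment": treatment, "batch": batch})
        fit = smf.mixedlm("expression ~ treatment", df, groups=df["batch"]).fit()
        print(fit.params["treatment"])   # should recover roughly the simulated 0.8 effect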

  16. Data from: Phase I Distribution-Free Analysis of Multivariate Data

    • tandf.figshare.com
    pdf
    Updated Jun 1, 2023
    Cite
    Giovanna Capizzi; Guido Masarotto (2023). Phase I Distribution-Free Analysis of Multivariate Data [Dataset]. http://doi.org/10.6084/m9.figshare.4519223
    Available download formats: pdf
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Taylor & Francis (https://taylorandfrancis.com/)
    Authors
    Giovanna Capizzi; Guido Masarotto
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this study, a new distribution-free Phase I control chart for retrospectively monitoring multivariate data is developed. The suggested approach, based on the multivariate signed ranks, can be applied to individual or subgrouped data for detection of location shifts with an arbitrary pattern (e.g., isolated, transitory, sustained, progressive, etc.). The procedure is complemented with a LASSO-based post-signal diagnostic method for identification of the shifted variables. A simulation study shows that the method compares favorably with parametric control charts when the process is normally distributed, and largely outperforms other multivariate nonparametric control charts when the process distribution is skewed or heavy-tailed. An R package can be found in the supplementary material.

  17. Diabetes quality improvement in rural Guatemala

    • dataverse.harvard.edu
    • search.dataone.org
    Updated May 5, 2017
    Cite
    David Flood (2017). Diabetes quality improvement in rural Guatemala [Dataset]. http://doi.org/10.7910/DVN/SZBDIE
    Available download formats: Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    May 5, 2017
    Dataset provided by
    Harvard Dataverse
    Authors
    David Flood
    License

    CC0 1.0 Universal Public Domain Dedication, https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Area covered
    Guatemala
    Description

    Replication dataset for "A quality improvement project using statistical process control methods for Type 2 diabetes control in a resource-limited setting" (doi: 10.1093/intqhc/mzx051).

  18. Monte Carlo and SBPNN-based critical values for Data Snooping

    • data.mendeley.com
    Updated Nov 8, 2021
    + more versions
    Cite
    Vinicius Rofatto (2021). Monte Carlo and SBPNN-based critical values for Data Snooping [Dataset]. http://doi.org/10.17632/77sfpx9b74.6
    Dataset updated
    Nov 8, 2021
    Authors
    Vinicius Rofatto
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Monte Carlo
    Description

    corrMatrix.m: This MATLAB function computes the correlation matrix of the w-test statistics.

    KMC.m: This MATLAB function computes the critical values for the max-w test statistic using the Monte Carlo method. corrMatrix.m must be run before using it. (A simplified Python illustration of the Monte Carlo idea is sketched after this file list.)

    kNN.m: This MATLAB function, based on neural networks, allows the user to obtain the desired critical value with good control of the type I error. To use it, download the file SBPNN.mat and save it in your working folder. corrMatrix.m must be run before using it.

    SBPNN.mat: MATLAB's flexible network object (called SBPNN.mat) that allows the user to obtain the desired critical value with good control of the type I error.

    Examples.txt: File containing examples of both design and covariance matrices for adjustment problems of geodetic networks.

    rawMC.txt: Monte-Carlo-based critical values for the following significance levels: α′ = 0.001, α′ = 0.01, α′ = 0.05, α′ = 0.1 and α′ = 0.5. The number of observations (n) was fixed for each α′, from n = 5 to n = 100 in increments of 5. For each n, the correlation between the w-tests (ρwi,wj) was also fixed, from ρwi,wj = 0.00 to ρwi,wj = 1.00 in increments of 0.1, also taking into account the correlation ρwi,wj = 0.999. For each combination of α′, n and ρwi,wj, m = 5,000,000 Monte Carlo experiments were run.
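
    A minimal Python illustration of the Monte Carlo idea behind KMC.m is given below (it is not the authors' MATLAB code): draw correlated standard-normal w-test statistics, take the maximum absolute value in each replication, and read the critical value off the empirical (1 − α′) quantile. The equicorrelated correlation matrix, n, α′, and the number of replications are assumptions chosen for illustration.

        # Monte Carlo critical value for the max-|w| test statistic (illustrative sketch).
        # Assumes an equicorrelated matrix of w-tests; alpha, n, rho and m are assumptions.
        import numpy as np

        def max_w_critical_value(n, rho, alpha=0.001, m=200_000, seed=0):
            rng = np.random.default_rng(seed)
            corr = np.full((n, n), rho)
            np.fill_diagonal(corr, 1.0)
            w = rng.multivariate_normal(np.zeros(n), corr, size=m)  # correlated w-test statistics
            max_abs_w = np.abs(w).max(axis=1)
            return np.quantile(max_abs_w, 1.0 - alpha)

        print(max_w_critical_value(n=20, rho=0.3, alpha=0.001))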

  19. Global Production Quality Control of In Vitro Diagnostic Reagent Market...

    • wiseguyreports.com
    Updated Sep 15, 2025
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    (2025). Global Production Quality Control of In Vitro Diagnostic Reagent Market Research Report: By Application (Clinical Diagnostics, Research Laboratories, Blood Banks, Public Health), By Product Type (Biochemical Reagents, Immunoassay Reagents, Nucleic Acid Reagents, Microbiology Reagents), By Quality Control Method (Statistical Process Control, Chemical Analysis, Real-Time PCR, Enzyme-linked Immunosorbent Assay), By End User (Hospitals, Diagnostic Laboratories, Pharmaceutical Companies, Research Institutions) and By Regional (North America, Europe, South America, Asia Pacific, Middle East and Africa) - Forecast to 2035 [Dataset]. https://www.wiseguyreports.com/reports/production-quality-control-of-in-vitro-diagnostic-reagent-market
    Dataset updated
    Sep 15, 2025
    License

    https://www.wiseguyreports.com/pages/privacy-policy

    Time period covered
    Sep 25, 2025
    Area covered
    Global
    Description
    BASE YEAR: 2024
    HISTORICAL DATA: 2019 - 2023
    REGIONS COVERED: North America, Europe, APAC, South America, MEA
    REPORT COVERAGE: Revenue Forecast, Competitive Landscape, Growth Factors, and Trends
    MARKET SIZE 2024: 3.64 (USD Billion)
    MARKET SIZE 2025: 3.84 (USD Billion)
    MARKET SIZE 2035: 6.5 (USD Billion)
    SEGMENTS COVERED: Application, Product Type, Quality Control Method, End User, Regional
    COUNTRIES COVERED: US, Canada, Germany, UK, France, Russia, Italy, Spain, Rest of Europe, China, India, Japan, South Korea, Malaysia, Thailand, Indonesia, Rest of APAC, Brazil, Mexico, Argentina, Rest of South America, GCC, South Africa, Rest of MEA
    KEY MARKET DYNAMICS: Regulatory compliance requirements, Rising prevalence of diseases, Technological advancements in diagnostics, Increasing demand for accuracy, Growing awareness of quality assurance
    MARKET FORECAST UNITS: USD Billion
    KEY COMPANIES PROFILED: Siemens Healthineers, Hologic, Thermo Fisher Scientific, Danaher Corporation, bioMérieux, Becton Dickinson, PerkinElmer, F. Hoffmann-La Roche, Ortho Clinical Diagnostics, Sysmex Corporation, Roche Diagnostics, Quidel Corporation, Abbott Laboratories, Agilent Technologies, DiaSorin
    MARKET FORECAST PERIOD: 2025 - 2035
    KEY MARKET OPPORTUNITIES: Regulatory compliance advancements, Increasing prevalence of chronic diseases, Rising demand for personalized medicine, Innovation in quality control technologies, Growth of point-of-care testing solutions
    COMPOUND ANNUAL GROWTH RATE (CAGR): 5.4% (2025 - 2035)
  20. Data from: Alternative parameter learning schemes for monitoring process...

    • researchdata.cab.unipd.it
    • tandf.figshare.com
    Updated 2023
    Cite
    Daniele Zago; Giovanna Capizzi (2023). Alternative parameter learning schemes for monitoring process stability [Dataset]. http://doi.org/10.6084/m9.figshare.24171168.v1
    Dataset updated
    2023
    Dataset provided by
    Taylor & Francis
    Authors
    Daniele Zago; Giovanna Capizzi
    Description

    In statistical process control, accurately estimating in-control (IC) parameters is crucial for effective monitoring. This typically requires a Phase I analysis to obtain estimates before monitoring commences. The traditional “fixed” estimate (FE) approach uses these estimates exclusively, while the “adaptive” estimate (AE) approach updates the estimates with each new observation. Such extreme criteria reflect the traditional bias-variance tradeoff in the framework of the sequential parameter learning schemes. This paper proposes an intermediate update rule that generalizes two ad hoc criteria for monitoring univariate Gaussian data, by giving a lower probability to parameter updates when an out-of-control (OC) situation is likely, therefore updating more frequently when there is no evidence of an OC scenario. The simulation study shows that this approach improves the detection power for small and early shifts, which are commonly regarded as a weakness of control charts based on fully online adaptive estimation. The paper also shows that the proposed method performs similarly to the fully adaptive procedure for larger or later shifts. The proposed method is illustrated by monitoring the sudden increase in ICU counts during the 2020 COVID outbreak in New York.
