Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In this paper, the concept of Statistical Process Control (SPC) tools is thoroughly examined and the definitions of quality control concepts are presented. It is anticipated that this study will contribute to the literature as an exemplary application demonstrating the role of SPC tools in quality improvement during the evaluation and decision-making phase.
The aim of this study is to investigate applications of quality control, to clarify statistical control methods and problem-solving procedures, to generate proposals for problem-solving approaches, and to disseminate improvement studies in the ready-to-wear industry. Using the basic SPC tools, the most frequently recurring faults were detected and divided into sub-headings for more detailed analysis. In this way, the repetition of faults was prevented by tracing each detected fault down to its root causes. With this perspective, the study is expected to contribute to other fields as well.
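As a rough illustration of this kind of fault analysis, the sketch below builds a Pareto chart from hypothetical fault counts; the categories and numbers are invented and are not the study's data:

```python
# Illustrative Pareto analysis of hypothetical garment faults.
import matplotlib.pyplot as plt

faults = {"stitching": 120, "fabric defect": 85, "stain": 40,
          "measurement": 25, "other": 10}          # made-up counts
labels, counts = zip(*sorted(faults.items(), key=lambda kv: kv[1], reverse=True))
total = sum(counts)
cumulative = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)                            # fault frequencies, descending
ax1.set_ylabel("Fault count")
ax2 = ax1.twinx()
ax2.plot(labels, cumulative, marker="o", color="tab:red")  # cumulative share
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)
plt.title("Pareto chart of detected faults")
plt.show()
```

The chart makes the "vital few" fault categories visually obvious, which is the usual starting point before drilling down into root causes.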
We give consent for the publication of identifiable details, which can include photograph(s), case history, and details within the text (“Material”), to be published in the Journal of Quality Technology. We confirm that we have seen and been given the opportunity to read both the Material and the Article (as attached) to be published by Taylor & Francis.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Data for benchmarking SPC against other process monitoring methods. The data consist of a one-dimensional time series of floats (x.csv). Additionally, information on whether the data are within the specifications is provided as another time series (y.csv). The data are generated by solving an optimization problem for each time step to generate a mixture distribution of different probability distributions. Then, for each time step, one record is sampled. Inputs for the optimization problem are the given probability distributions, the lower and upper limits of the tolerance interval, and the desired median of the data. Additionally, weights of the different probability distributions can be given as boundary conditions for the different time steps. Metadata generated from the solving are stored in k_matrix.csv (weights at each time step) and distribs (probability distribution objects). The data consist of phases with data from a stable mixture distribution and phases with data from a mixture distribution that does not fulfill the stability criteria.
We present a new method for statistical process control (SPC) of a discrete part manufacturing system based on intrinsic geometrical properties of the parts, estimated from three-dimensional sensor data. An intrinsic method has the computational advantage of avoiding the difficult part registration problem, necessary in previous SPC approaches to three-dimensional geometrical data but inadequate if noncontact sensors are used. The approach estimates the spectrum of the Laplace–Beltrami (LB) operator of the scanned parts and uses a multivariate nonparametric control chart for online process control. Our proposal brings SPC closer to computer vision and computer graphics methods aimed at detecting large differences in shape (but not in size). However, the SPC problem differs in that small changes in either shape or size of the parts need to be detected, keeping a controllable false alarm rate and without completely filtering noise. An online or “Phase II” method and a scheme for starting up in the absence of prior data (“Phase I”) are presented. Comparison with earlier approaches that require registration shows the LB spectrum method to be more sensitive, rapidly detecting small changes in shape and size, including the practical case when the sequence of part datasets is in the form of large meshes of unequal size. A post-alarm diagnostic method to investigate the location of defects on the surface of a part is also presented. While we focus in this article on surface (triangulation) data, the methods can also be applied to point cloud and voxel metrology data.
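To give a flavor of the intrinsic idea, the sketch below computes the eigenvalues of a simple graph Laplacian built from a toy triangulated mesh. This is a crude stand-in for the FEM-estimated LB spectrum used in the article, and the octahedron mesh is purely illustrative:

```python
# Sketch: Laplacian eigenvalues of a triangulated surface as an intrinsic,
# registration-free shape signature. A uniform graph Laplacian is used here
# in place of the article's FEM estimate of the Laplace-Beltrami operator.
import numpy as np
from scipy.sparse import coo_matrix

# Toy mesh: the 8 triangular faces of an octahedron (6 vertices).
faces = np.array([[0, 2, 4], [2, 1, 4], [1, 3, 4], [3, 0, 4],
                  [2, 0, 5], [1, 2, 5], [3, 1, 5], [0, 3, 5]])
n = faces.max() + 1

# Vertex adjacency derived from the face list.
edges = np.vstack([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [2, 0]]])
A = coo_matrix((np.ones(len(edges)), (edges[:, 0], edges[:, 1])), shape=(n, n))
A = ((A + A.T) > 0).astype(float).toarray()   # symmetric 0/1 adjacency
L = np.diag(A.sum(axis=1)) - A                # uniform graph Laplacian

# The sorted eigenvalues (dropping the trivial zero) form the spectral
# signature that would be tracked on a multivariate control chart.
eigvals = np.linalg.eigvalsh(L)
print(np.round(eigvals[1:], 3))
```

For realistic meshes one would use sparse eigensolvers and FEM weights; the point here is only that the spectrum depends on the part's geometry, not on its pose, so no registration is needed before charting.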
MIT License https://opensource.org/licenses/MIT
License information was derived automatically
Datasets accompanying the planned publication "Generalized Statistical Process Control via 1D-ResNet Pretraining" by Tobias Schulze, Louis Huebser, Sebastian Beckschulte and Robert H. Schmitt (Chair for Intelligence in Quality Sensing, Laboratory for Machine Tools and Production Engineering, WZL of RWTH Aachen University).
Data for benchmarking SPC against other process monitoring methods. The data consist of a one-dimensional time series of floats (x.csv). Additionally, information on whether the data are within the specifications is provided as another time series (y.csv). The data are generated by solving an optimization problem for each time step to generate a mixture distribution of different probability distributions. Then, for each time step, one record is sampled. Inputs for the optimization problem are the given probability distributions, the lower and upper limits of the tolerance interval, and the desired median of the data. Additionally, weights of the different probability distributions can be given as boundary conditions for the different time steps. Metadata generated from the solving are stored in k_matrix.csv (weights at each time step) and distribs (probability distribution objects according to https://doi.org/10.5281/zenodo.8249487). The data consist of phases with data from a stable mixture distribution and phases with data from a mixture distribution that does not fulfill the stability criteria.
The training data were used to train the G-SPC model; the test data were used for benchmarking purposes.
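The following minimal sketch mimics the dataset's structure rather than the authors' optimization-based generator: it samples one value per time step from a two-component Gaussian mixture whose weights change halfway through, and writes x.csv and y.csv in the same spirit. The tolerance limits, components, and weights are all invented:

```python
# Hedged sketch of the dataset's structure (not the authors' generator).
import csv
import numpy as np

rng = np.random.default_rng(0)
T = 500
lsl, usl = -3.0, 3.0                      # hypothetical tolerance limits
components = [(0.0, 1.0), (2.5, 0.5)]     # (mean, std) of two distributions

x, y = [], []
for t in range(T):
    # Stable phase, then a phase whose mixture violates stability.
    w = np.array([1.0, 0.0]) if t < T // 2 else np.array([0.6, 0.4])
    mu, sd = components[rng.choice(len(components), p=w)]
    xt = rng.normal(mu, sd)               # one record per time step
    x.append(xt)
    y.append(int(lsl <= xt <= usl))       # 1 = within specification

with open("x.csv", "w", newline="") as f:
    csv.writer(f).writerows([[v] for v in x])
with open("y.csv", "w", newline="") as f:
    csv.writer(f).writerows([[v] for v in y])
```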
Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy – EXC-2023 Internet of Production – 390621612.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Machine learning methods have been widely used in different applications, including process control and monitoring. For handling statistical process control (SPC) problems, conventional supervised machine learning methods (e.g., artificial neural networks and support vector machines) would have some difficulties. For instance, a training dataset containing both in-control and out-of-control (OC) process observations is required by a supervised machine learning method, but it is rarely available in SPC applications. Furthermore, many machine learning methods work like black boxes. It is often difficult to interpret their learning mechanisms and the resulting decision rules in the context of an application. In the SPC literature, there have been some existing discussions on how to handle the lack of OC observations in the training data, using the one-class classification, artificial contrast, real-time contrast, and some other novel ideas. However, these approaches have their own limitations to handle SPC problems. In this article, we extend the self-starting process monitoring idea that has been employed widely in modern SPC research to a general learning framework for monitoring processes with serially correlated data. Under the new framework, process characteristics to learn are well specified in advance, and process learning is sequential in the sense that the learned process characteristics keep being updated during process monitoring. The learned process characteristics are then incorporated into a control chart for detecting process distributional shift based on all available data by the current observation time. Numerical studies show that process monitoring based on the new learning framework is more reliable and effective than some representative existing machine learning SPC approaches.
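A minimal sketch of the self-starting idea follows; it is not the article's full framework (which also handles serial correlation). Each observation is standardized using estimates computed only from earlier data, the estimates keep being updated during monitoring, and the standardized values feed a CUSUM. All tuning constants are illustrative:

```python
# Self-starting CUSUM sketch: learned process characteristics (mean,
# variance) are updated sequentially and used to standardize new data.
import numpy as np

def self_starting_cusum(x, k=0.5, h=5.0, warmup=10):
    """Return the first alarm time, or None. Parameters are illustrative."""
    mean, m2, n = 0.0, 0.0, 0
    cp = cm = 0.0
    for t, xt in enumerate(x):
        if n >= warmup:
            z = (xt - mean) / np.sqrt(m2 / (n - 1))  # standardize with past data
            cp = max(0.0, cp + z - k)                # upper CUSUM
            cm = max(0.0, cm - z - k)                # lower CUSUM
            if cp > h or cm > h:
                return t
        n += 1                                       # Welford update of the
        d = xt - mean                                # learned characteristics
        mean += d / n
        m2 += d * (xt - mean)
    return None

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 100)])
print(self_starting_cusum(data))                     # alarms shortly after t=200
```

No out-of-control training data are needed: the chart learns only in-control behavior as it goes, which is exactly the gap the article identifies in conventional supervised approaches.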
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Because of the “curse of dimensionality,” high-dimensional processes present challenges to traditional multivariate statistical process monitoring (SPM) techniques. In addition, the unknown underlying distribution of, and complicated dependency among, variables (such as heteroscedasticity) increase the uncertainty of estimated parameters and decrease the effectiveness of control charts. Moreover, the requirement of sufficient reference samples limits the application of traditional charts in high-dimension, low-sample-size scenarios (small n, large p). More difficulties appear when detecting and diagnosing abnormal behaviors caused by a small set of variables (i.e., sparse changes). In this article, we propose two change-point–based control charts to detect sparse shifts in the mean vector of high-dimensional heteroscedastic processes. Our proposed methods can start monitoring when the number of observations is far smaller than the dimensionality. The simulation results show that the proposed methods are robust to nonnormality and heteroscedasticity. Two real data examples are used to illustrate the effectiveness of the proposed control charts in high-dimensional applications. The R codes are provided online.
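As a hedged illustration of the sparse-shift setting (not the authors' change-point charts), the sketch below standardizes each coordinate with a small reference sample and flags when the largest coordinate-wise EWMA crosses an ad hoc, uncalibrated limit:

```python
# Max-type detection of a sparse mean shift when p >> n ("large p, small n").
import numpy as np

rng = np.random.default_rng(2)
p, n_ref = 200, 30                      # dimension far exceeds reference size
ref = rng.normal(size=(n_ref, p))
mu0, sd0 = ref.mean(axis=0), ref.std(axis=0, ddof=1)

lam, limit = 0.2, 1.2                   # illustrative, uncalibrated constants
ewma = np.zeros(p)
for t in range(100):
    xt = rng.normal(size=p)
    if t >= 50:
        xt[:5] += 2.0                   # sparse shift: only 5 of 200 variables
    z = (xt - mu0) / sd0
    ewma = (1 - lam) * ewma + lam * z   # coordinate-wise EWMA
    if np.max(np.abs(ewma)) > limit:
        print(f"alarm at t={t}, variables:", np.where(np.abs(ewma) > limit)[0])
        break
```

The max over coordinates both detects the shift and points at the responsible variables, which mirrors the detection-plus-diagnosis goal described above, though real charts would calibrate the limit to a target false alarm rate.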
According to our latest research, the global Statistical Process Control (SPC) for Aerospace Manufacturing market size reached USD 1.43 billion in 2024, reflecting the increasing adoption of advanced quality management solutions across the aerospace sector. The market is projected to expand at a robust CAGR of 8.7% from 2025 to 2033, culminating in a forecasted market value of USD 3.09 billion by 2033. This growth is primarily driven by the escalating need for precision, regulatory compliance, and operational efficiency in aerospace manufacturing environments, as companies seek to minimize defects, reduce costs, and enhance product reliability.
The growth trajectory of the SPC for Aerospace Manufacturing market is significantly influenced by the aerospace industry’s relentless pursuit of quality and safety. As aircraft components become increasingly complex and regulatory bodies enforce stricter standards, manufacturers are compelled to implement robust process control methodologies. Statistical Process Control enables real-time monitoring and analysis of manufacturing processes, allowing for immediate identification and correction of deviations. This proactive approach reduces the risk of costly recalls and ensures that products consistently meet both customer and regulatory expectations. The integration of SPC with Industry 4.0 technologies, such as the Industrial Internet of Things (IIoT) and artificial intelligence, further enhances its value proposition by providing predictive insights and automating quality assurance tasks.
Another critical growth factor is the rising adoption of digital transformation initiatives across aerospace manufacturing facilities. Companies are investing heavily in digital SPC solutions to streamline data collection, facilitate advanced analytics, and enable remote monitoring. This digital shift is not only improving process visibility and traceability but is also fostering a culture of continuous improvement. As the aerospace sector faces mounting pressure to accelerate production cycles and reduce time-to-market, the ability to quickly identify process inefficiencies and implement corrective actions becomes a key competitive differentiator. In addition, the growing prevalence of multi-site manufacturing operations necessitates standardized quality control systems, further fueling demand for scalable SPC platforms.
The market’s expansion is also supported by the increasing complexity of aerospace supply chains. With the proliferation of global sourcing and the involvement of numerous suppliers, maintaining consistent quality standards has become more challenging. OEMs and Tier 1 suppliers are mandating the use of SPC tools among their supply chain partners to ensure uniformity and compliance with stringent aerospace standards, such as AS9100 and ISO 9001. This trend is particularly pronounced in regions with rapidly growing aerospace sectors, such as Asia Pacific and Europe, where local manufacturers are striving to meet international benchmarks. Furthermore, the ongoing advancements in SPC software, including cloud-based deployment and real-time data integration, are making these solutions more accessible and cost-effective for organizations of all sizes.
Regionally, North America continues to dominate the SPC for Aerospace Manufacturing market, owing to the presence of major aerospace OEMs, a mature regulatory environment, and early adoption of advanced manufacturing technologies. However, Asia Pacific is emerging as the fastest-growing region, driven by substantial investments in aerospace infrastructure, expanding manufacturing capabilities, and increasing focus on quality management. European manufacturers are also prioritizing SPC adoption to maintain their competitive edge and comply with evolving regulatory frameworks. As the global aerospace industry becomes more interconnected, cross-regional collaborations and harmonization of quality standards are expected to further accelerate the adoption of SPC solutions worldwide.
https://www.statsndata.org/how-to-order
The Statistical Process Control System (SPC) market has emerged as a critical component in quality management and process optimization across various industries, significantly enhancing operational efficiency and product quality. SPC utilizes statistical methods and tools to monitor and control manufacturing processes.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
ABSTRACT For the release of pharmaceutical products onto the drug market, most pharmaceutical companies depend on acceptance criteria that are set internally, by regulators, and/or by pharmacopoeias. However, statistical process control monitoring is underestimated in most quality control cases, although it is important not only for assessing process stability and efficiency but also for compliance with all appropriate pharmaceutical practices, such as good manufacturing practice and good laboratory practice, known collectively as GXP. The current work investigates two tablet inspection characteristics monitored during in-process control, viz. tablet average weight and hardness. Both properties were assessed during the compression phase of the tablet, before the coating stage. Data gathering was performed by the Quality Assurance Team, and the data were processed with commercial statistical software packages. Screening of the collected results of 31 batches of a fluoroquinolone-based antibacterial tablet showed that all the tested lots met the release specifications, although the process mean was unstable, as is strongly evident in the variables control chart. Accordingly, the two inspected processes were not in a state of control and require strong corrective actions to address the non-compliance with GXP. What is not controlled cannot be predicted, and thus capability analysis would be of no value except to show process capability retrospectively. Setting rules for the application of Statistical Process Control (SPC) should be mandated by Regulatory Agencies.
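For readers unfamiliar with the chart type involved, here is a generic individuals/moving-range (I-MR) computation on made-up batch-average weights; the study itself used commercial statistical software on its own data:

```python
# Generic I-MR control limits for batch-average tablet weight (values in mg
# are hypothetical). Sigma is estimated from the mean moving range.
import numpy as np

weights = np.array([501, 499, 503, 498, 502, 507, 505, 509, 511, 508],
                   dtype=float)            # hypothetical batch averages, mg
mr = np.abs(np.diff(weights))              # moving ranges of size 2
center = weights.mean()
sigma_hat = mr.mean() / 1.128              # d2 constant for subgroup size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
print(f"CL={center:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
for i, w in enumerate(weights):
    if not (lcl <= w <= ucl):
        print(f"batch {i}: {w} outside control limits")
```

The point the abstract makes can be reproduced with such a chart: batches can all sit inside specification limits while still drifting outside the much tighter control limits, i.e., meeting specs does not imply statistical control.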
https://www.wiseguyreports.com/pages/privacy-policy
| REPORT ATTRIBUTE | DETAILS |
| --- | --- |
| BASE YEAR | 2024 |
| HISTORICAL DATA | 2019 - 2023 |
| REGIONS COVERED | North America, Europe, APAC, South America, MEA |
| REPORT COVERAGE | Revenue Forecast, Competitive Landscape, Growth Factors, and Trends |
| MARKET SIZE 2024 | 3.96 (USD Billion) |
| MARKET SIZE 2025 | 4.25 (USD Billion) |
| MARKET SIZE 2035 | 8.5 (USD Billion) |
| SEGMENTS COVERED | Application, Deployment Model, End Use, Features, Regional |
| COUNTRIES COVERED | US, Canada, Germany, UK, France, Russia, Italy, Spain, Rest of Europe, China, India, Japan, South Korea, Malaysia, Thailand, Indonesia, Rest of APAC, Brazil, Mexico, Argentina, Rest of South America, GCC, South Africa, Rest of MEA |
| KEY MARKET DYNAMICS | Increasing demand for quality control, Adoption of automation technologies, Growing manufacturing sector, Regulatory compliance requirements, Rising need for data-driven decisions |
| MARKET FORECAST UNITS | USD Billion |
| KEY COMPANIES PROFILED | Rockwell Automation, Tableau, Minitab, InfinityQS, Alteryx, Oracle, PI System, Statgraphics, SAP, Cisco, Hexagon, SAS, Siemens, Qualityze, MathWorks, IBM |
| MARKET FORECAST PERIOD | 2025 - 2035 |
| KEY MARKET OPPORTUNITIES | Increased demand for quality management, Integration with IoT devices, Growth in manufacturing automation, Adoption of machine learning techniques, Expansion in emerging markets |
| COMPOUND ANNUAL GROWTH RATE (CAGR) | 7.2% (2025 - 2035) |
Synchronization of variable trajectories from batch process data is a delicate operation that can induce artifacts in the definition of multivariate statistical process control (MSPC) models for real-time monitoring of batch processes. The current paper introduces a new synchronization-free approach for online batch MSPC. This approach is based on the use of local MSPC models that cover a normal operating conditions (NOC) trajectory defined from principal component analysis (PCA) modeling of non-synchronized historical batches. The rationale behind this is that, although non-synchronized NOC batches are used, an overall NOC trajectory with a consistent evolution pattern can be described, even if batch-to-batch natural delays and differences between process starting and end points exist. Afterwards, the local MSPC models are used to monitor the evolution of new batches and derive the related MSPC chart. During the real-time monitoring of a new batch, this strategy allows testing whether each new observation follows the NOC trajectory. For a NOC observation, an additional indication of batch process progress is provided, based on the identification of the local MSPC model that yields the lowest residuals. When an observation deviates from NOC behavior, contribution plots based on the projection of the observation onto the best local MSPC model identified at the last NOC observation are used to diagnose the variables related to the fault. This methodology is illustrated using two real examples of NIR-monitored batch processes: a fluidized bed drying process and a batch distillation of gasoline blends with ethanol.
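A condensed sketch of the local-model idea follows, under strong simplifications: only the squared prediction error (SPE) statistic is used, and synthetic Gaussian "phases" stand in for NIR spectra. One PCA model is fitted per segment of the NOC trajectory, and a new observation is assigned to the local model with the lowest SPE:

```python
# Local PCA models along a NOC trajectory; a new observation is monitored
# via its lowest SPE across models (toy version of the paper's strategy).
import numpy as np

def fit_local_pca(X, ncomp=2):
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:ncomp]                 # mean and orthonormal loadings

def spe(x, model):
    mu, P = model
    r = (x - mu) - P.T @ (P @ (x - mu))   # residual after PCA projection
    return float(r @ r)

rng = np.random.default_rng(3)
# Hypothetical NOC trajectory: an early-phase and a late-phase data cloud.
noc_phases = [rng.normal(0, 1, (100, 6)), rng.normal(2, 1, (100, 6))]
local_models = [fit_local_pca(X) for X in noc_phases]

x_new = rng.normal(2, 1, 6)               # incoming observation
spes = [spe(x_new, m) for m in local_models]
best = int(np.argmin(spes))               # best local model ~ process progress
print(f"closest local model: {best}, SPE = {spes[best]:.2f}")
```

If the lowest SPE exceeds a control limit, the observation deviates from the NOC trajectory; otherwise the index of the best local model doubles as an indicator of how far the batch has progressed, with no synchronization needed.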
https://www.statsndata.org/how-to-order
The Statistical Process Control (SPC) Software market has increasingly become a cornerstone in quality management across various industries, including manufacturing, pharmaceuticals, and food processing. This specialized software leverages statistical methods to monitor and control production processes, ensuring that quality standards are consistently met.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Del Castillo and Zhao have recently proposed a new methodology for the Statistical Process Control (SPC) of discrete parts whose 3-dimensional (3D) geometrical data are acquired with non-contact sensors. The approach is based on monitoring the spectrum of the Laplace–Beltrami (LB) operator of each scanned part estimated using finite element methods (FEM). The spectrum of the LB operator is an intrinsic summary of the geometry of a part, independent of the ambient space. Hence, registration of scanned parts is unnecessary when comparing them. The primary goal of this case study paper is to demonstrate the practical implementation of the spectral SPC methodology through multiple examples using real scanned parts acquired with an industrial-grade laser scanner, including 3D-printed parts and commercial parts. We discuss the scanned mesh preprocessing needed in practice, including the type of remeshing found to be most beneficial for the FEM computations. For each part type, both the “phase I” and “phase II” stages of the spectral SPC methodology are showcased. In addition, we provide a new principled method to determine the number of eigenvalues of the LB operator to consider for efficient SPC of a given part geometry, and present an improved algorithm to automatically define a region of interest, particularly useful for large meshes. Computer codes that implement every method discussed in this paper, as well as all scanned part datasets used in the case studies, are made available and explained in the supplementary materials.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Source code and results for exactly regenerating the data and results of the simulation study reported in (to be published). Industrial sensor data are simulated and subjected to several data-driven methods for estimating temporal delays between the sensors. A new method is proposed that estimates these delays by optimizing multivariate correlations, and it is shown to be more accurate than the currently more standard method that optimizes bivariate correlations.
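For orientation, this sketch shows the standard bivariate baseline the study compares against: estimating the delay between two sensors by maximizing their lagged correlation. The signal, noise level, and true delay are all synthetic:

```python
# Bivariate delay estimation: pick the lag that maximizes correlation.
import numpy as np

rng = np.random.default_rng(4)
n, true_delay = 1000, 7
s = rng.normal(size=n + true_delay)        # latent process
a = s[true_delay:]                         # sensor A (leads)
b = s[:n] + 0.1 * rng.normal(size=n)       # sensor B lags A by true_delay

max_lag = 20
def corr_at(lag):
    """Correlation between A and B shifted by `lag` steps."""
    if lag >= 0:
        return np.corrcoef(a[: n - lag], b[lag:])[0, 1]
    return np.corrcoef(a[-lag:], b[: n + lag])[0, 1]

est = max(range(-max_lag, max_lag + 1), key=corr_at)
print("estimated delay:", est)             # should recover true_delay = 7
```

The proposed multivariate method generalizes this by optimizing correlations across all sensor pairs jointly, which, per the study, is more accurate than treating each pair in isolation.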
Updated on November 6th, 2023 after internal review by authors.
Additional file 2. Ethics Statement Supplement. Contains details of MUSIC practices’ IRB status.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Statistical process control (SPC) charts are critically important for quality control and management in manufacturing industries, environmental monitoring, disease surveillance, and many other applications. Conventional SPC charts are designed for cases when process observations are independent at different observation times. In practice, however, serial data correlation almost always exists in sequential data. It has been well demonstrated in the literature that control charts designed for independent data are unstable for monitoring serially correlated data. Thus, it is important to develop control charts specifically for monitoring serially correlated data. To this end, there is some existing discussion in the SPC literature. Most existing methods are based on parametric time series modeling and residual monitoring, where the data are often assumed to be normally distributed. In applications, however, the assumed parametric time series model with a given order and the normality assumption are often invalid, resulting in unstable process monitoring. Although there is some nice discussion on robust design of such residual monitoring control charts, the suggested designs can only handle certain special cases well. In this article, we try to make another effort by proposing a novel control chart that makes use of the restarting mechanism of a CUSUM chart and the related spring length concept. Our proposed chart uses observations within the spring length of the current time point and ignores all history data that are beyond the spring length. It does not require any parametric time series model and/or a parametric process distribution. It only requires the assumption that process observation at a given time point is associated with nearby observations and independent of observations that are far away in observation times, which should be reasonable for many applications. Numerical studies show that it performs well in different cases.
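A minimal sketch of the restarting mechanism and the spring-length concept (the number of steps since the statistic last returned to zero) follows. The proposed chart additionally uses only the observations within the spring length for estimation, which this toy omits:

```python
# Upper CUSUM with explicit tracking of the spring length: the chart
# restarts at zero, and only data since the last restart "matter".
import numpy as np

def cusum_with_spring(x, k=0.5, h=4.0):
    """Return (alarm time, spring length at alarm) pairs; constants are illustrative."""
    c, spring, alarms = 0.0, 0, []
    for t, xt in enumerate(np.asarray(x)):
        c = max(0.0, c + xt - k)                # CUSUM update with restart at 0
        spring = 0 if c == 0.0 else spring + 1  # steps since the last restart
        if c > h:
            alarms.append((t, spring))
            c, spring = 0.0, 0                  # restart after the alarm
    return alarms

rng = np.random.default_rng(7)
series = np.concatenate([rng.normal(0, 1, 300), rng.normal(1, 1, 50)])
print(cusum_with_spring(series))                # alarms shortly after t=300
```

The appeal under serial correlation is that history beyond the spring length never enters the statistic, so distant, possibly differently-distributed data cannot contaminate the current monitoring decision.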
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Replication dataset for "A quality improvement project using statistical process control methods for Type 2 diabetes control in a resource-limited setting" (doi: 10.1093/intqhc/mzx051).
Batch effects are technical sources of variation introduced by the necessity of conducting gene expression analyses on different dates due to the large number of biological samples in population-based studies. The aim of this study is to evaluate the performance of linear mixed models (LMM) and ComBat in batch effect removal. We also assessed the utility of adding quality control samples to the study design as technical replicates. To do so, we simulated gene expression data by adding “treatment” and batch effects to a real gene expression dataset. The performance of LMM and ComBat, with and without quality control samples, is assessed in terms of sensitivity and specificity while correcting for the batch effect, using a wide range of effect sizes, statistical noise, sample sizes, and levels of balanced/unbalanced designs. The simulations showed small differences between LMM and ComBat. LMM identifies stronger relationships between big effect sizes and gene expression than ComBat, while ComBat identifies in general more true and false positives than LMM. However, these small differences can still be relevant depending on the research goal. When either of these methods is applied, quality control samples did not reduce the batch effect, showing no added value of including them in the study design.
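As a toy illustration only, the sketch below applies a location-only per-batch mean adjustment. ComBat additionally shrinks batch effects via empirical Bayes, and an LMM would instead model batch as a random effect, so this is far simpler than either evaluated method:

```python
# Location-only batch adjustment: recenter each batch to the grand mean,
# gene by gene (a much-simplified cousin of ComBat).
import numpy as np

rng = np.random.default_rng(5)
genes, samples = 50, 30
batch = np.repeat([0, 1, 2], samples // 3)       # three processing dates
X = rng.normal(size=(genes, samples))
X += np.array([0.0, 1.0, -0.5])[batch]           # additive batch effect

X_adj = X.copy()
grand = X.mean(axis=1, keepdims=True)            # per-gene grand mean
for b in np.unique(batch):
    cols = batch == b
    X_adj[:, cols] -= X[:, cols].mean(axis=1, keepdims=True) - grand

# Per-batch means are now aligned (up to sampling noise).
print(np.round([X_adj[:, batch == b].mean() for b in range(3)], 3))
```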
https://www.wiseguyreports.com/pages/privacy-policy
| REPORT ATTRIBUTE | DETAILS |
| --- | --- |
| BASE YEAR | 2024 |
| HISTORICAL DATA | 2019 - 2023 |
| REGIONS COVERED | North America, Europe, APAC, South America, MEA |
| REPORT COVERAGE | Revenue Forecast, Competitive Landscape, Growth Factors, and Trends |
| MARKET SIZE 2024 | 3.64 (USD Billion) |
| MARKET SIZE 2025 | 3.84 (USD Billion) |
| MARKET SIZE 2035 | 6.5 (USD Billion) |
| SEGMENTS COVERED | Application, Product Type, Quality Control Method, End User, Regional |
| COUNTRIES COVERED | US, Canada, Germany, UK, France, Russia, Italy, Spain, Rest of Europe, China, India, Japan, South Korea, Malaysia, Thailand, Indonesia, Rest of APAC, Brazil, Mexico, Argentina, Rest of South America, GCC, South Africa, Rest of MEA |
| KEY MARKET DYNAMICS | Regulatory compliance requirements, Rising prevalence of diseases, Technological advancements in diagnostics, Increasing demand for accuracy, Growing awareness of quality assurance |
| MARKET FORECAST UNITS | USD Billion |
| KEY COMPANIES PROFILED | Siemens Healthineers, Hologic, Thermo Fisher Scientific, Danaher Corporation, bioMérieux, Becton Dickinson, PerkinElmer, F. Hoffmann-La Roche, Ortho Clinical Diagnostics, Sysmex Corporation, Roche Diagnostics, Quidel Corporation, Abbott Laboratories, Agilent Technologies, DiaSorin |
| MARKET FORECAST PERIOD | 2025 - 2035 |
| KEY MARKET OPPORTUNITIES | Regulatory compliance advancements, Increasing prevalence of chronic diseases, Rising demand for personalized medicine, Innovation in quality control technologies, Growth of point-of-care testing solutions |
| COMPOUND ANNUAL GROWTH RATE (CAGR) | 5.4% (2025 - 2035) |
In statistical process monitoring, control charts typically depend on a set of tuning parameters besides their control limit(s). Proper selection of these tuning parameters is crucial to their performance. In a specific application, a control chart is often designed for detecting a target process distributional shift. In such cases, the tuning parameters should be chosen such that some characteristic of the out-of-control (OC) run length of the chart, such as its average, is minimized for detecting the target shift, while the control limit is set to maintain a desired in-control (IC) performance. However, explicit solutions for such a design are unavailable for most control charts, and thus numerical optimization methods are needed. In such cases, Monte Carlo-based methods are often a viable alternative for finding suitable design constants. The computational cost associated with such scenarios is often substantial, and thus computational efficiency is a key requirement. To address this problem, a two-step design based on stochastic approximations is presented in this paper, which is shown to be much more computationally efficient than some representative existing methods. A detailed discussion of the new algorithm’s implementation, along with some examples, is provided to demonstrate the broad applicability of the proposed methodology for the optimal design of univariate and multivariate control charts. Computer codes in the Julia programming language are also provided in the supplemental material.
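To illustrate the core idea of stochastic-approximation design (not the article's two-step algorithm), the sketch below uses a crude Robbins–Monro recursion to tune the limit of a Shewhart individuals chart toward a target in-control ARL of 200, for which the known answer is roughly 2.81. The gain, cap, and clipping range are ad hoc stabilizers, and convergence is slow and noisy:

```python
# Robbins-Monro tuning of a Shewhart limit h so the simulated in-control
# average run length (ARL) approaches a target value.
import numpy as np

rng = np.random.default_rng(6)
target, h = 200.0, 3.0                        # target IC ARL, starting limit

def run_length(h, cap=10_000):
    """Simulate one in-control run length for a Shewhart chart on N(0,1)."""
    for t in range(1, cap + 1):
        if abs(rng.normal()) > h:
            return t
    return cap

for k in range(1, 5001):
    rl = min(run_length(h), 5 * target)       # cap very long runs (ad hoc)
    h += (2.0 / k) * (target - rl) / target   # short runs push the limit up
    h = min(max(h, 0.5), 5.0)                 # keep the limit in a sane range

print(round(h, 2))                            # drifts toward roughly 2.8
```

Each iteration needs only a single simulated run length rather than a full Monte Carlo ARL estimate, which is the source of the computational savings that stochastic approximation offers over brute-force search.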