U.S. Government Works: https://www.usa.gov/government-works
License information was derived automatically
Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is generally divided into two sequential problems: a joint state-parameter estimation problem, in which, using the model, the health of a system or component is determined based on the observations; and a prediction problem, in which, using the model, the state-parameter distribution is simulated forward in time to compute end of life and remaining useful life. The first problem is typically solved through the use of a state observer, or filter. The choice of filter depends on the assumptions that may be made about the system, and on the desired algorithm performance. In this paper, we review three separate filters for the solution to the first problem: the Daum filter, an exact nonlinear filter; the unscented Kalman filter, which approximates nonlinearities through the use of a deterministic sampling method known as the unscented transform; and the particle filter, which approximates the state distribution using a finite set of discrete, weighted samples, called particles. Using a centrifugal pump as a case study, we conduct a number of simulation-based experiments investigating the performance of the different algorithms as applied to prognostics.
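As a rough illustration of the third option mentioned above, the sketch below applies a bootstrap particle filter to a hypothetical one-parameter degradation model (not the paper's centrifugal pump model): the wear rate is estimated jointly with the health state, and remaining useful life is read off by propagating the particles to an assumed failure threshold.

```python
import numpy as np

# Minimal sketch of a bootstrap particle filter for joint state-parameter
# estimation. The degradation model is hypothetical (linear wear with an
# unknown rate), not the centrifugal pump model used in the paper.
rng = np.random.default_rng(0)

def propagate(particles, dt=1.0, process_std=0.01):
    """Advance each particle: state x grows by its wear-rate parameter theta."""
    x, theta = particles[:, 0], particles[:, 1]
    x_new = x + theta * dt + rng.normal(0.0, process_std, size=x.shape)
    theta_new = theta + rng.normal(0.0, 1e-4, size=theta.shape)  # slow random walk on the parameter
    return np.column_stack([x_new, theta_new])

def update(particles, z, meas_std=0.05):
    """Weight particles by the likelihood of observation z, then resample (multinomial)."""
    w = np.exp(-0.5 * ((z - particles[:, 0]) / meas_std) ** 2)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Run on synthetic data: true wear rate 0.02 per step, failure threshold at x = 1.0.
particles = np.column_stack([np.zeros(500), rng.uniform(0.0, 0.05, 500)])
for k in range(1, 31):
    z = 0.02 * k + rng.normal(0.0, 0.05)         # simulated noisy observation of x
    particles = update(propagate(particles), z)   # predict, then correct

# Remaining useful life: propagate each particle analytically to the failure
# threshold (exact for this linear wear model).
rul = (1.0 - particles[:, 0]) / np.maximum(particles[:, 1], 1e-6)
print(f"mean wear-rate estimate: {particles[:, 1].mean():.4f}, median RUL: {np.median(rul):.1f} steps")
```

The unscented Kalman filter and Daum filter would replace the sampling-and-resampling steps with deterministic sigma points or an exact posterior update, respectively, while the surrounding predict-then-correct loop stays the same.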
In this work we compute a reasonably comprehensive set of tables of filter conversions for current and next-generation survey facilities. Almost all useful transforms are included with the ProSpect software package described in Robotham et al. (2020). Users are free to provide their own filters and compute their own transforms; the included package examples outline the approach. This arXiv document will be updated relatively frequently, so people are encouraged to get in touch with their suggestions for additional utility (i.e. new filter sets).
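For readers who want to see the kind of calculation such conversions encode, the sketch below folds a synthetic template spectrum through two toy transmission curves and compares the resulting AB magnitudes. It only illustrates the underlying integral; the curve shapes, template, and function names are placeholders and not the ProSpect (R) implementation.

```python
import numpy as np

# Toy illustration (not ProSpect's implementation) of the integral behind a
# filter transform: fold a template spectrum through two transmission curves
# and compare the resulting AB magnitudes. Curves and spectrum are synthetic.
wave = np.linspace(3000.0, 11000.0, 2000)     # wavelength grid [Angstrom]
dwave = wave[1] - wave[0]

def toy_filter(centre, width):
    """Gaussian stand-in for a survey transmission curve."""
    return np.exp(-0.5 * ((wave - centre) / width) ** 2)

def ab_mag(flux_nu, response):
    """Response-weighted AB magnitude of a spectrum given in erg/s/cm^2/Hz."""
    num = np.sum(flux_nu * response / wave) * dwave
    den = np.sum(3.631e-20 * response / wave) * dwave   # AB zero point (3631 Jy)
    return -2.5 * np.log10(num / den)

filter_a = toy_filter(4800.0, 600.0)          # one facility's blue-ish band
filter_b = toy_filter(4900.0, 700.0)          # another facility's similar band
flux_nu = 1e-26 * (wave / 5000.0) ** -0.5     # placeholder power-law template

offset = ab_mag(flux_nu, filter_a) - ab_mag(flux_nu, filter_b)
print(f"magnitude offset between the two bands for this template: {offset:+.4f} mag")
```

A conversion table is then built by repeating this offset calculation over a library of template spectra and fitting the offset as a function of an observed colour.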
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
Sampling the aquatic environment for microplastic concentration is inherently difficult because of variations in microplastic concentration, shape, and density and the potential for contamination. We present an assessment of a method for microplastic sampling that uses a peristaltic pump to pump water through a series of in-line stainless-steel mesh filters. Following filtration, the stainless-steel filters were treated using previously published methods to isolate microplastics, adjusted for the stainless-steel mesh filters. Microplastics were identified using micro-Fourier Transform Infrared (µFTIR) spectroscopy in transmission mode. This method was tested in the laboratory using standard polyethylene beads and was applied to two sample sites at the Las Vegas Wash in Nevada. The results showed that 70% of the polyethylene beads were recovered after the peristaltic pump and laboratory steps, with minimal blank contamination. The advantages of the peristaltic pump sampling method are that it (1) supports a range of sample volumes, (2) reduces sample handling, (3) reduces the potential for contamination, (4) provides flexibility in sampling locations, and (5) supports a variety of filter types. Using stainless-steel mesh filters allows for (1) streamlined and direct field-to-laboratory sample processing, (2) µFTIR transmission mode analysis of filter-mounted microplastics, and (3) reduced filter and sample processing costs.
https://www.datainsightsmarket.com/privacy-policy
The global market for Automatic Particulate Filter Efficiency Testers is experiencing robust growth, driven by increasing regulatory mandates for air quality and a rising demand for personal protective equipment (PPE) like respirators and masks. The market, estimated at $500 million in 2025, is projected to witness a Compound Annual Growth Rate (CAGR) of 7% from 2025 to 2033, reaching approximately $850 million by 2033. This growth is fueled by several factors. Stringent safety regulations in industries like healthcare, manufacturing, and construction are mandating the use of high-efficiency filters, leading to a corresponding increase in the demand for testing equipment. Furthermore, the ongoing focus on respiratory health, particularly post-pandemic, has significantly boosted the demand for reliable filter testing solutions. The market is segmented by application (masks, respirators, and others) and type (single-channel and multi-channel testers), with multi-channel systems gaining traction due to their increased efficiency in testing multiple filters simultaneously. Key players in the market are constantly innovating to improve testing accuracy, speed, and automation. The Asia-Pacific region, driven by significant manufacturing activities and a growing focus on environmental protection in China and India, is expected to dominate the market.

Technological advancements are transforming the automatic particulate filter efficiency tester market. Miniaturization, improved sensor technology, and advanced data analysis capabilities are leading to more compact, accurate, and user-friendly testing devices. The increasing adoption of sophisticated software and cloud-based data management systems allows for remote monitoring and streamlined data analysis, enhancing the overall efficiency of filter testing processes. However, the high initial investment cost of these advanced systems remains a significant restraint. The competitive landscape is characterized by a mix of established players and emerging companies, with a focus on developing innovative products and expanding their geographical reach. The ongoing demand for higher testing accuracy, improved automation, and better data analytics will shape future market developments. The integration of AI and machine learning in testing protocols is likely to further accelerate market growth in the coming years.
Gravity data measure small changes in gravity due to changes in the density of rocks beneath the Earth's surface. The data collected are processed via standard methods to ensure the response recorded is that due only to the rocks in the ground. The results produce datasets that can be interpreted to reveal the geological structure of the sub-surface. The processed data are checked for quality by GA geophysicists to ensure that the final data released by GA are fit-for-purpose. This National Gravity Compilation 2019 DGIR tilt grid is produced from the 2019 Australian National Gravity Grids A series. These gravity data were acquired under the project No. 202008. The grid has a cell size of 0.00417 degrees (approximately 435 m). The data are derived from ground observations stored in the Australian National Gravity Database (ANGD) as at September 2019, supplemented by offshore data sourced from v28.1 of the Global Gravity grid developed using data from the Scripps Institution of Oceanography, the National Oceanic and Atmospheric Administration (NOAA), and the National Geospatial-Intelligence Agency (NGA) at Scripps Institution of Oceanography, University of California San Diego. Out of the approximately 1.8 million gravity observations, nearly 1.4 million gravity stations in the ANGD together with marine data were used to generate this grid. The ground gravity data used in the national grid have been acquired by the Commonwealth, State and Territory Governments, the mining and exploration industry, universities and research organisations from the 1940s to the present day. Station spacing for ground observations varies from approximately 11 km down to less than 1 km, with major parts of the continent having station spacing between 2.5 and 7 km. The DGIR was obtained by subtracting three quantities (i.e., the near-field isostatic correction, the far-field isostatic correction, and a first-order trend correction) from the Complete Bouguer Anomaly (CBA) data of the 2019 Australian National Gravity Grids A series. The grid shows a tilt of the de-trended global isostatic residual (DGIR) anomalies (A series) over Australia and its continental margins. The tilt filter was calculated by applying a fast Fourier transform (FFT) process to the DGIR grid of the 2019 Australian National Gravity Grids A series. A tilt filter is a ratio of the vertical derivative to the total horizontal derivative and is used for detection of edges of geological units.
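The sketch below shows, on a synthetic grid rather than the national DGIR grid, how such a tilt calculation can be done: the vertical and horizontal derivatives are computed via FFT, and the tilt angle is taken as the arctangent of the vertical derivative over the total horizontal derivative (the standard convention behind the ratio described above).

```python
import numpy as np

# Minimal sketch of a tilt calculation on a gridded anomaly, with derivatives
# computed via FFT. The input is a synthetic grid with a sharp edge, not the
# national DGIR grid itself.
def tilt_filter(grid, dx, dy):
    ny, nx = grid.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)   # wavenumbers along x
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)   # wavenumbers along y
    KX, KY = np.meshgrid(kx, ky)                   # shapes (ny, nx)
    spectrum = np.fft.fft2(grid)
    d_dx = np.real(np.fft.ifft2(1j * KX * spectrum))
    d_dy = np.real(np.fft.ifft2(1j * KY * spectrum))
    d_dz = np.real(np.fft.ifft2(np.sqrt(KX**2 + KY**2) * spectrum))  # first vertical derivative
    thdr = np.hypot(d_dx, d_dy)                    # total horizontal derivative
    return np.arctan2(d_dz, thdr)                  # tilt angle, bounded to about +/- pi/2

# Synthetic anomaly: a density-contrast edge plus a little noise, ~435 m cells.
y, x = np.mgrid[0:256, 0:256]
grid = np.where(x > 128, 1.0, 0.0) + 0.01 * np.random.default_rng(1).normal(size=(256, 256))
tilt = tilt_filter(grid, dx=435.0, dy=435.0)
print(tilt.shape, float(tilt.min()), float(tilt.max()))
```

Because the tilt is bounded, edges of shallow and deep sources are enhanced equally, which is why it is favoured for edge detection of geological units.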
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This paper introduces a method aiming at enhancing the efficacy of speaker identification systems within challenging acoustic environments characterized by noise and reverberation. The methodology encompasses the utilization of diverse feature extraction techniques, including Mel-Frequency Cepstral Coefficients (MFCCs) and discrete transforms, such as the Discrete Cosine Transform (DCT), Discrete Sine Transform (DST), and Discrete Wavelet Transform (DWT). Additionally, an Artificial Neural Network (ANN) serves as the classifier for this method. Reverberation is modeled using varying-length comb filters, and its impact on pitch frequency estimation is explored via the Auto-Correlation Function (ACF). This paper also contributes to the field of cancelable speaker identification in both open and reverberant environments. The proposed method depends on comb filtering at the feature level, deliberately distorting the MFCCs. This distortion, incorporated within a cancelable framework, serves to obscure speaker identities, rendering the system resilient to potential intruders. Three systems are presented in this work: a reverberation-affected speaker identification system, a system depending on cancelable features through comb filtering, and a novel cancelable speaker identification system within reverberant environments. The findings revealed that, in both scenarios with and without reverberation effects, the DWT-based features exhibited superior performance within the speaker identification system. Conversely, within the cancelable speaker identification system, the DCT-based features represented the top-performing choice.
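The sketch below illustrates, under assumed rather than reported settings, the two signal-processing ideas the abstract relies on: a feedforward comb filter applied at the feature level to distort an MFCC matrix, and an ACF-based pitch estimate. The MFCC matrix is random stand-in data rather than features extracted from speech, and the delay and gain values are illustrative only.

```python
import numpy as np

# Illustrative sketch (not the paper's exact configuration): comb filtering at
# the feature level to distort an MFCC matrix for cancelable identification,
# plus a simple autocorrelation (ACF) pitch estimate.
rng = np.random.default_rng(0)

def comb_filter_features(features, delay=4, gain=0.6):
    """Feedforward comb filter y[n] = x[n] + gain * x[n - delay], along the time axis."""
    distorted = features.copy()
    distorted[:, delay:] += gain * features[:, :-delay]
    return distorted

def acf_pitch(frame, sr, fmin=60.0, fmax=400.0):
    """Estimate pitch from the autocorrelation peak within a plausible lag range."""
    acf = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(acf[lo:hi])
    return sr / lag

mfcc = rng.normal(size=(13, 200))        # 13 coefficients x 200 frames (stand-in data)
cancelable = comb_filter_features(mfcc)   # deliberately distorted, revocable template

sr = 16000
t = np.arange(0, 0.03, 1 / sr)
voiced = np.sin(2 * np.pi * 150.0 * t)    # synthetic 150 Hz "voiced" frame
print(f"ACF pitch estimate: {acf_pitch(voiced, sr):.1f} Hz")
```

Because the comb-filter parameters act like a key, a compromised template can be revoked by re-enrolling with a different delay and gain, which is the essence of the cancelable framework described above.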
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Test dataset.
https://www.imarcgroup.com/privacy-policy
The spectrum analyzer market size reached US$ 1.6 Billion in 2023. Looking forward, IMARC Group expects the market to reach US$ 2.9 Billion by 2032, exhibiting a growth rate (CAGR) of 7.1% during 2024-2032. The market is experiencing steady growth because of the heightened focus on strengthening the military and defense sector, the continuing expansion of the telecommunication industry and availability of 5G connectivity, and the rising demand for high-frequency electronic devices.
Report Attribute | Key Statistics
---|---
Base Year | 2023
Forecast Years | 2024-2032
Historical Years | 2018-2023
Market Size in 2023 | US$ 1.6 Billion
Market Forecast in 2032 | US$ 2.9 Billion
Market Growth Rate (2024-2032) | 7.1%
Rising Demand from Various End-Use Industries
The increasing use of spectrum analyzers in a variety of end-use industries highlights how essential they are for the progress of current technological infrastructure. In the telecommunications industry, robust spectrum analyzers are essential for the efficient installation, optimization, and troubleshooting of complex, high-speed networks such as 5G and the impending 6G, ensuring that these networks fulfill regulatory and performance standards. According to the IMARC Group, the global 5G services market is expected to exhibit a growth rate (CAGR) of 44.1% during 2024-2032.
The aerospace and defense sectors also depend on these instruments for tasks where accuracy and dependability are critical, such as electronic warfare, satellite tracking, and radar system testing. As technological warfare and geopolitical tensions rise, maintaining national security requires sophisticated, trustworthy spectrum analyzers more than ever.
Growing Need for High-Frequency Electronic Devices
The spectrum analyzer market is driven by the increasing demand for high-frequency electronic devices, since spectrum analysis is necessary to guarantee the functionality and conformance of the various technologies that operate at high frequencies. A growing number of industries, such as consumer electronics, automotive, aerospace, and telecommunications, are adopting high-frequency applications, which drives the need for accurate and dependable spectrum analysis.
Spectrum analyzers are becoming necessary as consumer electronics, such as smartphones, tablets, and other connected devices, rely on high-frequency operation for faster data transmission and increased performance. Manufacturers use these instruments to test devices throughout development and production to make sure they fulfill the necessary requirements for radio frequency (RF) and electromagnetic interference (EMI) performance, as well as regulatory compliance.
Boost in Wireless Communication Technologies
Many industries are benefiting greatly from wireless communication technologies, which have increased productivity, connectedness, and the availability of new services. The increased use of wireless technologies is also catalyzing the need for sophisticated testing and monitoring instruments, such as spectrum analyzers, which are necessary to guarantee that wireless systems are efficient and adhere to global norms. The debut of 5G was a huge advancement in wireless communication technology, and the ongoing research into 6G offers an even larger leap forward. These technologies offer reduced latency, quicker communication rates, and the ability to connect multiple devices at once. This is necessary to sustain the expanding Internet of Things (IoT) ecosystem, which depends on the constant and smooth transfer of data between devices via wireless networks. Moreover, the IMARC Group expects the global IoT market to reach US$ 3,174.2 Billion by 2032, which will further increase the use of spectrum analyzers for better data transmission.
Increasing Electronics Manufacturing
The continuous invention and release of high-tech consumer goods such as smartphones, tablets, and wearables requires stringent testing to control radio frequencies and guarantee device interoperability. The need for spectrum analyzers, which are crucial for reducing signal interference and confirming product operation, is strongly correlated with this necessity. The evolution of wireless communication standards, including 5G and beyond, requires increased testing to accommodate higher frequencies and wider bandwidths. The growing demand for spectrum analyzers capable of handling these new standards is encouraging manufacturers to create more advanced devices. Automation and robotics have brought electronic control systems into production settings, and these systems need to be tested on a regular basis to ensure that their functionality is maintained. A key function of spectrum analyzers in these procedures is to ensure that all electronic components function without any hindrance.
Risk of Unauthorized Drone Activities
Spectrum analyzers are essential for identifying the radio frequencies that drones use. Security officers can find and identify unlawful drones in sensitive or restricted regions with the aid of these instruments, which detect the exact frequencies and signals linked to drone activities. This capability is crucial in places like airports, where unapproved drones can seriously jeopardize aircraft safety. Spectrum analyzers are also used to collect signal information, which goes beyond simple detection: understanding the kind of communication that occurs between a drone and its controller is essential for creating countermeasures against possible dangers. Apart from this, spectrum analyzers assist in the development of jamming tactics to efficiently deactivate unlicensed drones in security-sensitive regions.
Technological Advancements
Ongoing advancements in the design of spectrum analyzers are driving the growth of the market. The proliferation of IoT devices across the automotive industry, the healthcare sector, smart cities, and agriculture depends on a robust network system that operates without interference. Spectrum analyzers play a crucial role in monitoring and troubleshooting these systems, ensuring the efficient transmission of data and assisting in the maintenance of numerous IoT systems. Apart from this, there is an increase in the need for portable, handheld spectrum analyzers that offer real-time monitoring capabilities. This is particularly essential in field applications like environmental monitoring or on-site communication system diagnostics. For example, Anritsu offers a wide range of handheld spectrum analyzers that cover frequencies from 9 kHz to 170 GHz.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Output recognition rates of the speaker identification system for different feature extraction techniques at different SNRs without reverberation effect.
https://www.datainsightsmarket.com/privacy-policy
The global high-temperature baghouse dust collector market, valued at $5762 million in 2025, is projected to experience steady growth, driven by stringent environmental regulations across key industries like mining, steel, and cement. The increasing focus on reducing particulate matter emissions and improving air quality is a significant catalyst for market expansion. Technological advancements in filter bag materials, enabling higher temperature tolerance and longer operational lifespans, are further fueling market growth. The mining and steel sectors are major consumers, owing to the high dust generation during their processes. However, the high initial investment costs associated with these systems and the need for regular maintenance can act as restraints. The market is segmented by application (mining, steel & metallurgy, cement, electricity, others) and type (mechanical shaker, pulse jet, reverse air). While the pulse jet type currently dominates due to its efficiency, the mechanical shaker type maintains a significant presence in existing installations. The Asia-Pacific region, particularly China and India, is expected to exhibit robust growth due to industrial expansion and infrastructure development. North America and Europe, while mature markets, will continue to see moderate growth driven by upgrades and replacements of existing systems. The forecast period of 2025-2033 suggests a continuous, albeit gradual, expansion reflecting sustained demand and ongoing technological improvements. This growth will be influenced by regional economic conditions and the implementation of stricter environmental standards globally.

The competitive landscape is characterized by a mix of established players like FLSmidth, Donaldson, and Nederman, along with regional and specialized manufacturers. These companies are focused on product innovation, strategic partnerships, and geographical expansion to maintain their market share. Future growth will depend on the ability of manufacturers to offer cost-effective solutions while ensuring high efficiency and reliability. The market is likely to witness increased consolidation as companies seek economies of scale and broader product portfolios. The rising adoption of smart technologies, such as sensor-based monitoring and predictive maintenance, will transform operational efficiency and further shape the market trajectory in the coming years. Factors such as fluctuating raw material prices and global economic uncertainties could potentially impact market growth but are anticipated to be offset by the long-term trend towards cleaner production processes.
https://www.zionmarketresearch.com/privacy-policy
The global spectrum analyzer market is expected to generate revenue of around USD 3.32 billion by 2032, growing at a CAGR of around 8.1% between 2024 and 2032.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Parameter settings for the S-G filter, FIR filter, AWT, and MODWT.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Output recognition rates of the cancelable speaker identification system in the presence of reverberation for different feature extraction techniques at different SNRs.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Accuracies of the pair-wise registrations.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Comparison of the fitting accuracies.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Correlation matrix from raw wastewater.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Water quality characteristics from 0.5 m column depth effluent.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Physicochemical properties of the zeolite material used.