100+ datasets found
  1. Data from: Normalized data

    • figshare.com
    txt
    Updated Jun 15, 2022
    Cite
    Yalbi Balderas (2022). Normalized data [Dataset]. http://doi.org/10.6084/m9.figshare.20076047.v1
    Explore at:
    Available download formats: txt
    Dataset updated
    Jun 15, 2022
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Yalbi Balderas
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Normalize data

  2. Corporate Registry Data Normalization Market Research Report 2033

    • researchintelo.com
    csv, pdf, pptx
    Updated Oct 1, 2025
    + more versions
    Cite
    Research Intelo (2025). Corporate Registry Data Normalization Market Research Report 2033 [Dataset]. https://researchintelo.com/report/corporate-registry-data-normalization-market
    Explore at:
    Available download formats: pdf, csv, pptx
    Dataset updated
    Oct 1, 2025
    Dataset authored and provided by
    Research Intelo
    License

    https://researchintelo.com/privacy-and-policy

    Time period covered
    2024 - 2033
    Area covered
    Global
    Description

    Corporate Registry Data Normalization Market Outlook



    According to our latest research, the Global Corporate Registry Data Normalization market size was valued at $1.72 billion in 2024 and is projected to reach $5.36 billion by 2033, expanding at a CAGR of 13.2% during 2024–2033. One major factor driving the growth of this market globally is the escalating demand for accurate, real-time corporate data to support compliance, risk management, and operational efficiency across diverse sectors. As organizations increasingly digitize their operations, the need to standardize and normalize disparate registry data from multiple sources has become critical to ensure regulatory adherence, enable robust Know Your Customer (KYC) and Anti-Money Laundering (AML) processes, and foster seamless integration with internal and external systems. This trend is further amplified by the proliferation of cross-border business activities and the mounting complexity of global regulatory frameworks, making data normalization solutions indispensable for businesses seeking agility and resilience in a rapidly evolving digital landscape.
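
    The outlook above ties together a 2024 base value, a 2033 projection, and a CAGR. As a quick, hedged illustration of how those three figures relate (reported CAGRs are rounded and the 2024-2033 compounding window can be counted in different ways, so small discrepancies are expected), a short Python check:

```python
# How a CAGR links the two endpoints quoted above; values taken from the summary.
start_value = 1.72          # USD billion, 2024
end_value = 5.36            # USD billion, 2033 (projected)
periods = 2033 - 2024       # 9 compounding periods

implied_cagr = (end_value / start_value) ** (1 / periods) - 1
print(f"implied CAGR: {implied_cagr:.1%}")   # roughly 13-14%, close to the reported 13.2%

# Forward projection using the report's stated CAGR, for comparison.
stated_cagr = 0.132
print(f"2033 value grown at 13.2%: {start_value * (1 + stated_cagr) ** periods:.2f}B")
```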



    Regional Outlook



    North America currently commands the largest share of the global Corporate Registry Data Normalization market, accounting for over 38% of the total market value in 2024. The region’s dominance is underpinned by its mature digital infrastructure, early adoption of advanced data management technologies, and stringent regulatory requirements that mandate comprehensive corporate transparency and compliance. Major economies such as the United States and Canada have witnessed significant investments in data normalization platforms, driven by the robust presence of multinational corporations, sophisticated financial institutions, and a dynamic legal environment. Additionally, the region benefits from a thriving ecosystem of technology vendors and solution providers, fostering continuous innovation and the rapid deployment of cutting-edge software and services. These factors collectively reinforce North America’s leadership position, making it a bellwether for global market trends and technological advancements in corporate registry data normalization.



    In contrast, the Asia Pacific region is emerging as the fastest-growing market, projected to register a remarkable CAGR of 16.7% during the forecast period. This accelerated expansion is fueled by rapid digital transformation initiatives, burgeoning fintech and legaltech sectors, and a rising emphasis on corporate governance across countries such as China, India, Singapore, and Australia. Governments in the region are actively promoting regulatory modernization and digital identity frameworks, which in turn drive the adoption of data normalization solutions to streamline compliance and mitigate operational risks. Furthermore, the influx of foreign direct investment and the proliferation of cross-border business transactions are compelling enterprises to invest in robust data management tools that can harmonize corporate information from disparate jurisdictions. These dynamics are creating fertile ground for solution providers and service vendors to expand their footprint and address the unique needs of Asia Pacific’s diverse and rapidly evolving corporate landscape.



    Meanwhile, emerging economies in Latin America, the Middle East, and Africa present a mixed outlook, characterized by growing awareness but slower adoption of corporate registry data normalization solutions. Challenges such as legacy IT infrastructure, fragmented regulatory environments, and limited access to advanced technology solutions continue to impede market penetration in these regions. However, a gradual shift is underway as governments and enterprises recognize the value of standardized corporate data for combating financial crime, fostering transparency, and attracting international investment. Localized demand is also being shaped by sector-specific needs, particularly in banking, government, and healthcare, where regulatory compliance and risk management are gaining prominence. Policy reforms and international collaborations are expected to play a pivotal role in accelerating adoption, though progress will likely be uneven across different countries and industry verticals.



    Report Scope



    Attri

  3. DataSheet1_TimeNorm: a novel normalization method for time course microbiome...

    • datasetcatalog.nlm.nih.gov
    • frontiersin.figshare.com
    Updated Sep 24, 2024
    Cite
    An, Lingling; Lu, Meng; Butt, Hamza; Luo, Qianwen; Du, Ruofei; Lytal, Nicholas; Jiang, Hongmei (2024). DataSheet1_TimeNorm: a novel normalization method for time course microbiome data.pdf [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001407445
    Explore at:
    Dataset updated
    Sep 24, 2024
    Authors
    An, Lingling; Lu, Meng; Butt, Hamza; Luo, Qianwen; Du, Ruofei; Lytal, Nicholas; Jiang, Hongmei
    Description

    Metagenomic time-course studies provide valuable insights into the dynamics of microbial systems and have become increasingly popular alongside the reduction in costs of next-generation sequencing technologies. Normalization is a common but critical preprocessing step before proceeding with downstream analysis. To the best of our knowledge, currently there is no reported method to appropriately normalize microbial time-series data. We propose TimeNorm, a novel normalization method that considers the compositional property and time dependency in time-course microbiome data. It is the first method designed for normalizing time-series data within the same time point (intra-time normalization) and across time points (bridge normalization), separately. Intra-time normalization normalizes microbial samples under the same condition based on common dominant features. Bridge normalization detects and utilizes a group of most stable features across two adjacent time points for normalization. Through comprehensive simulation studies and application to a real study, we demonstrate that TimeNorm outperforms existing normalization methods and boosts the power of downstream differential abundance analysis.
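
    The abstract describes intra-time normalization as scaling samples from the same time point using a shared set of dominant features. As a loose, illustrative sketch of that idea only (this is not the published TimeNorm algorithm; the dominant-feature rule and the simulated counts below are invented):

```python
import numpy as np

# Simplified illustration of the idea summarized above (NOT the published
# TimeNorm method): within one time point, rescale each sample by the total
# count of a shared set of dominant features, so that samples with different
# sequencing depths become comparable.

rng = np.random.default_rng(0)
depth = rng.uniform(0.5, 2.0, size=6)                          # uneven sequencing depth
counts = rng.poisson(lam=50 * depth[:, None], size=(6, 20)).astype(float)  # 6 samples x 20 taxa

k = 5                                                          # toy rule: 5 most abundant taxa
dominant = np.argsort(counts.mean(axis=0))[-k:]

dom_totals = counts[:, dominant].sum(axis=1)
scale = dom_totals.mean() / dom_totals                         # per-sample scaling factor
normalized = counts * scale[:, None]

# After scaling, every sample has the same dominant-feature total.
print(normalized[:, dominant].sum(axis=1).round(2))
```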

  4. Identification of Novel Reference Genes Suitable for qRT-PCR Normalization...

    • plos.figshare.com
    tiff
    Updated May 31, 2023
    Cite
    Yu Hu; Shuying Xie; Jihua Yao (2023). Identification of Novel Reference Genes Suitable for qRT-PCR Normalization with Respect to the Zebrafish Developmental Stage [Dataset]. http://doi.org/10.1371/journal.pone.0149277
    Explore at:
    Available download formats: tiff
    Dataset updated
    May 31, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Yu Hu; Shuying Xie; Jihua Yao
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Reference genes used in normalizing qRT-PCR data are critical for the accuracy of gene expression analysis. However, many traditional reference genes used in zebrafish early development are not appropriate because of their variable expression levels during embryogenesis. In the present study, we used our previous RNA-Seq dataset to identify novel reference genes suitable for gene expression analysis during zebrafish early developmental stages. We first selected 197 most stably expressed genes from an RNA-Seq dataset (29,291 genes in total), according to the ratio of their maximum to minimum RPKM values. Among the 197 genes, 4 genes with moderate expression levels and the least variation throughout 9 developmental stages were identified as candidate reference genes. Using four independent statistical algorithms (delta-CT, geNorm, BestKeeper and NormFinder), the stability of qRT-PCR expression of these candidates was then evaluated and compared to that of actb1 and actb2, two commonly used zebrafish reference genes. Stability rankings showed that two genes, namely mobk13 (mob4) and lsm12b, were more stable than actb1 and actb2 in most cases. To further test the suitability of mobk13 and lsm12b as novel reference genes, they were used to normalize three well-studied target genes. The results showed that mobk13 and lsm12b were more suitable than actb1 and actb2 with respect to zebrafish early development. We recommend mobk13 and lsm12b as new optimal reference genes for zebrafish qRT-PCR analysis during embryogenesis and early larval stages.
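
    The screening step described above (ranking genes by the ratio of maximum to minimum RPKM across the nine stages) is easy to sketch. The table below is simulated and the gene and stage names are placeholders, not the authors' data:

```python
import numpy as np
import pandas as pd

# Sketch of the stability screen described above: rank genes by the ratio of
# their maximum to minimum RPKM across developmental stages and keep the most
# stable ones; then inspect expression level to favour moderate expression.

rng = np.random.default_rng(1)
stages = [f"stage_{i}" for i in range(1, 10)]
rpkm = pd.DataFrame(rng.lognormal(mean=3.0, sigma=1.0, size=(5000, 9)),
                    index=[f"gene_{i}" for i in range(5000)], columns=stages)

stability_ratio = rpkm.max(axis=1) / rpkm.min(axis=1)   # lower ratio = more stable
stable_genes = stability_ratio.nsmallest(197).index     # shortlist size as in the study

print(rpkm.loc[stable_genes].mean(axis=1).describe())   # expression levels of the shortlist
```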

  5. Methods for normalizing microbiome data: an ecological perspective

    • data.niaid.nih.gov
    • datadryad.org
    zip
    Updated Oct 30, 2018
    Cite
    Donald T. McKnight; Roger Huerlimann; Deborah S. Bower; Lin Schwarzkopf; Ross A. Alford; Kyall R. Zenger (2018). Methods for normalizing microbiome data: an ecological perspective [Dataset]. http://doi.org/10.5061/dryad.tn8qs35
    Explore at:
    Available download formats: zip
    Dataset updated
    Oct 30, 2018
    Dataset provided by
    James Cook University
    University of New England
    Authors
    Donald T. McKnight; Roger Huerlimann; Deborah S. Bower; Lin Schwarzkopf; Ross A. Alford; Kyall R. Zenger
    License

    https://spdx.org/licenses/CC0-1.0.html

    Description
    1. Microbiome sequencing data often need to be normalized due to differences in read depths, and recommendations for microbiome analyses generally warn against using proportions or rarefying to normalize data and instead advocate alternatives, such as upper quartile, CSS, edgeR-TMM, or DESeq-VS. Those recommendations are, however, based on studies that focused on differential abundance testing and variance standardization, rather than community-level comparisons (i.e., beta diversity). Also, standardizing the within-sample variance across samples may suppress differences in species evenness, potentially distorting community-level patterns. Furthermore, the recommended methods use log transformations, which we expect to exaggerate the importance of differences among rare OTUs, while suppressing the importance of differences among common OTUs. 2. We tested these theoretical predictions via simulations and a real-world data set. 3. Proportions and rarefying produced more accurate comparisons among communities and were the only methods that fully normalized read depths across samples. Additionally, upper quartile, CSS, edgeR-TMM, and DESeq-VS often masked differences among communities when common OTUs differed, and they produced false positives when rare OTUs differed. 4. Based on our simulations, normalizing via proportions may be superior to other commonly used methods for comparing ecological communities.
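
    The two methods the authors favour, normalizing to proportions and rarefying, are easy to state concretely. A small illustrative sketch on a toy OTU table (not the authors' code or data):

```python
import numpy as np

# Illustration of the two approaches the study found most accurate, applied to
# a toy OTU table (4 samples x 5 OTUs). This is a sketch, not the authors' code.

rng = np.random.default_rng(42)
otu = rng.poisson(lam=[5, 20, 80, 200, 500], size=(4, 5)).astype(int)

# 1) Proportions: divide each sample (row) by its total read depth.
proportions = otu / otu.sum(axis=1, keepdims=True)

# 2) Rarefying: subsample every sample without replacement down to the
#    smallest read depth, so all samples end up with equal depth.
depth = otu.sum(axis=1).min()
rarefied = np.zeros_like(otu)
for i, row in enumerate(otu):
    reads = np.repeat(np.arange(row.size), row)          # one entry per read, labelled by OTU
    keep = rng.choice(reads, size=depth, replace=False)
    rarefied[i] = np.bincount(keep, minlength=row.size)

print(proportions.round(3))
print(rarefied.sum(axis=1))   # every row now sums to the same depth
```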

  6. Data from: When Normalizing Monetary Policy, the Order of Operations Matters...

    • kansascityfed.org
    pdf
    Updated Oct 14, 2021
    Cite
    (2021). When Normalizing Monetary Policy, the Order of Operations Matters [Dataset]. https://www.kansascityfed.org/research/economic-bulletin/when-normalizing-monetary-policy-the-order-of-operations-matters/
    Explore at:
    Available download formats: pdf
    Dataset updated
    Oct 14, 2021
    Description

    As economic conditions in the United States continue to improve, the FOMC may consider normalizing monetary policy. Whether the FOMC reduces the balance sheet before raising the federal funds rate (or vice versa) may affect the shape of the yield curve, with consequences for financial institutions. Drawing lessons from the previous normalization in 2015–19, we conclude that normalizing the balance sheet before raising the funds rate might forestall yield curve inversion and, in turn, support economic stability.

  7. Security Data Normalization Platform Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 30, 2025
    Cite
    Dataintelo (2025). Security Data Normalization Platform Market Research Report 2033 [Dataset]. https://dataintelo.com/report/security-data-normalization-platform-market
    Explore at:
    Available download formats: pdf, csv, pptx
    Dataset updated
    Sep 30, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Security Data Normalization Platform Market Outlook



    According to our latest research, the global Security Data Normalization Platform market size reached USD 1.48 billion in 2024, reflecting robust demand across industries for advanced security data management solutions. The market is registering a compound annual growth rate (CAGR) of 18.7% and is projected to achieve a value of USD 7.18 billion by 2033. The ongoing surge in sophisticated cyber threats and the increasing complexity of enterprise IT environments are among the primary growth factors driving the adoption of security data normalization platforms worldwide.




    The growth of the Security Data Normalization Platform market is primarily fuelled by the exponential rise in cyberattacks and the proliferation of digital transformation initiatives across various sectors. As organizations accumulate vast amounts of security data from disparate sources, the need for platforms that can aggregate, normalize, and analyze this data has become critical. Enterprises are increasingly recognizing that traditional security information and event management (SIEM) systems fall short in handling the volume, velocity, and variety of data generated by modern IT infrastructures. Security data normalization platforms address this challenge by transforming heterogeneous data into a standardized format, enabling more effective threat detection, investigation, and response. This capability is particularly vital as organizations move toward zero trust architectures and require real-time insights to secure their digital assets.
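
    The paragraph above describes the core function of such platforms: transforming heterogeneous security records into a standardized format so they can be correlated downstream. As a purely illustrative sketch of that normalization step (the sources, field names, and canonical schema below are hypothetical, not any vendor's API):

```python
# Illustrative sketch only: map security events from two different sources into
# one canonical schema. Field names and event sources are hypothetical.

from datetime import datetime, timezone

def normalize_firewall(event: dict) -> dict:
    return {
        "timestamp": datetime.fromtimestamp(event["epoch"], tz=timezone.utc).isoformat(),
        "src_ip": event["src"],
        "dst_ip": event["dst"],
        "action": event["disposition"].lower(),
        "source": "firewall",
    }

def normalize_ids(event: dict) -> dict:
    return {
        "timestamp": event["ts"],               # already ISO 8601 in this toy source
        "src_ip": event["attacker_ip"],
        "dst_ip": event["target_ip"],
        "action": "alert",
        "source": "ids",
    }

raw_events = [
    ({"epoch": 1719830000, "src": "10.0.0.5", "dst": "8.8.8.8", "disposition": "BLOCK"}, normalize_firewall),
    ({"ts": "2024-07-01T10:05:00+00:00", "attacker_ip": "10.0.0.9", "target_ip": "10.0.0.1"}, normalize_ids),
]

for event, normalize in raw_events:
    print(normalize(event))   # every record now shares the same keys
```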




    Another significant growth driver for the Security Data Normalization Platform market is the evolving regulatory landscape. Governments and regulatory bodies worldwide are introducing stringent data protection and cybersecurity regulations, compelling organizations to enhance their security postures. Compliance requirements such as GDPR, HIPAA, and CCPA demand that organizations not only secure their data but also maintain comprehensive audit trails and reporting mechanisms. Security data normalization platforms facilitate compliance by providing unified, normalized logs and reports that simplify audit processes and ensure regulatory adherence. The market is also witnessing increased adoption in sectors such as BFSI, healthcare, and government, where data integrity and compliance are paramount.




    Technological advancements are further accelerating the adoption of security data normalization platforms. The integration of artificial intelligence (AI) and machine learning (ML) capabilities into these platforms is enabling automated threat detection, anomaly identification, and predictive analytics. Cloud-based deployment models are gaining traction, offering scalability, flexibility, and cost-effectiveness to organizations of all sizes. As the threat landscape becomes more dynamic and sophisticated, organizations are prioritizing investments in advanced security data normalization solutions that can adapt to evolving risks and support proactive security strategies. The growing ecosystem of managed security service providers (MSSPs) is also contributing to market expansion by delivering normalization as a service to organizations with limited in-house expertise.




    From a regional perspective, North America continues to dominate the Security Data Normalization Platform market, accounting for the largest share in 2024 due to the presence of major technology vendors, high cybersecurity awareness, and significant investments in digital infrastructure. Europe follows closely, driven by strict regulatory mandates and increasing cyber threats targeting critical sectors. The Asia Pacific region is emerging as a high-growth market, propelled by rapid digitization, expanding IT ecosystems, and rising cybercrime incidents. Latin America and the Middle East & Africa are also witnessing steady growth, albeit from a smaller base, as organizations in these regions accelerate their cybersecurity modernization efforts. The global outlook for the Security Data Normalization Platform market remains positive, with sustained demand expected across all major regions through 2033.



    Component Analysis



    The Security Data Normalization Platform market is segmented by component into software and services. Software solutions form the core of this market, providing the essential functionalities for data aggregation, normalization, enrichment, and integration with downs

  8. ARCS White Beam Vanadium Normalization Data for SNS Cycle 2022B (May 15 -...

    • osti.gov
    Updated May 30, 2025
    Cite
    Spallation Neutron Source (SNS) (2025). ARCS White Beam Vanadium Normalization Data for SNS Cycle 2022B (May 15 - Jun., 14, 2022) [Dataset]. http://doi.org/10.14461/oncat.data/2568320
    Explore at:
    Dataset updated
    May 30, 2025
    Dataset provided by
    Department of Energy Basic Energy Sciences Program (http://science.energy.gov/user-facilities/basic-energy-sciences/)
    Office of Science (http://www.er.doe.gov/)
    Spallation Neutron Source (SNS)
    Description

    A data set used to normalize the detector response of the ARCS instrument; see ARCS_226797.md in the data set for more details.
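
    As background on how white-beam vanadium data are typically used (a generic sketch of the idea only, not the ARCS or Mantid reduction workflow): vanadium scatters nearly uniformly, so per-detector counts from a vanadium run estimate relative detector efficiencies, which are then divided out of sample measurements.

```python
import numpy as np

# Generic sketch of detector-efficiency normalization with a vanadium run.
# Not facility reduction code; counts and efficiencies are simulated.

rng = np.random.default_rng(7)
n_detectors = 8
true_efficiency = rng.uniform(0.7, 1.3, size=n_detectors)

vanadium_counts = rng.poisson(lam=10000 * true_efficiency)   # white-beam vanadium run
sample_counts = rng.poisson(lam=500 * true_efficiency)       # sample measurement

# Relative efficiency per detector, normalized to the mean detector.
efficiency = vanadium_counts / vanadium_counts.mean()
corrected = sample_counts / efficiency                       # efficiency-corrected sample data

print(efficiency.round(3))
print(corrected.round(1))
```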

  9. Security Data Normalization Platform Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Aug 22, 2025
    Cite
    Growth Market Reports (2025). Security Data Normalization Platform Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/security-data-normalization-platform-market
    Explore at:
    Available download formats: pdf, pptx, csv
    Dataset updated
    Aug 22, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Security Data Normalization Platform Market Outlook



    According to our latest research, the global Security Data Normalization Platform market size reached USD 1.87 billion in 2024, driven by the rapid escalation of cyber threats and the growing complexity of enterprise security infrastructures. The market is expected to grow at a robust CAGR of 12.5% during the forecast period, reaching an estimated USD 5.42 billion by 2033. Growth is primarily fueled by the increasing adoption of advanced threat intelligence solutions, regulatory compliance demands, and the proliferation of connected devices across various industries.




    The primary growth factor for the Security Data Normalization Platform market is the exponential rise in cyberattacks and security breaches across all sectors. Organizations are increasingly realizing the importance of normalizing diverse security data sources to enable efficient threat detection, incident response, and compliance management. As security environments become more complex with the integration of cloud, IoT, and hybrid infrastructures, the need for platforms that can aggregate, standardize, and correlate data from disparate sources has become paramount. This trend is particularly pronounced in sectors such as BFSI, healthcare, and government, where data sensitivity and regulatory requirements are highest. The growing sophistication of cyber threats has compelled organizations to invest in robust security data normalization platforms to ensure comprehensive visibility and proactive risk mitigation.




    Another significant driver is the evolving regulatory landscape, which mandates stringent data protection and reporting standards. Regulations such as the General Data Protection Regulation (GDPR), Health Insurance Portability and Accountability Act (HIPAA), and various national cybersecurity frameworks have compelled organizations to enhance their security postures. Security data normalization platforms play a crucial role in facilitating compliance by providing unified and actionable insights from heterogeneous data sources. These platforms enable organizations to automate compliance reporting, streamline audit processes, and reduce the risk of penalties associated with non-compliance. The increasing focus on regulatory alignment is pushing both large enterprises and SMEs to adopt advanced normalization solutions as part of their broader security strategies.




    The proliferation of digital transformation initiatives and the accelerated adoption of cloud-based solutions are further propelling market growth. As organizations migrate critical workloads to the cloud and embrace remote work models, the volume and variety of security data have surged dramatically. This shift has created new challenges in terms of data integration, normalization, and real-time analysis. Security data normalization platforms equipped with advanced analytics and machine learning capabilities are becoming indispensable for managing the scale and complexity of modern security environments. Vendors are responding to this demand by offering scalable, cloud-native solutions that can seamlessly integrate with existing security information and event management (SIEM) systems, threat intelligence platforms, and incident response tools.




    From a regional perspective, North America continues to dominate the Security Data Normalization Platform market, accounting for the largest revenue share in 2024. The region’s leadership is attributed to the high concentration of technology-driven enterprises, robust cybersecurity regulations, and significant investments in advanced security infrastructure. Europe and Asia Pacific are also witnessing strong growth, driven by increasing digitalization, rising threat landscapes, and the adoption of stringent data protection laws. Emerging markets in Latin America and the Middle East & Africa are gradually catching up, supported by growing awareness of cybersecurity challenges and the need for standardized security data management solutions.





    Component Analysis



  10. Data from: LVMED: Dataset of Latvian text normalisation samples for the...

    • repository.clarin.lv
    Updated May 30, 2023
    Cite
    Viesturs Jūlijs Lasmanis; Normunds Grūzītis (2023). LVMED: Dataset of Latvian text normalisation samples for the medical domain [Dataset]. https://repository.clarin.lv/repository/xmlui/handle/20.500.12574/85
    Explore at:
    Dataset updated
    May 30, 2023
    Authors
    Viesturs Jūlijs Lasmanis; Normunds Grūzītis
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    The CSV dataset contains sentence pairs for a text-to-text transformation task: given a sentence that contains 0..n abbreviations, rewrite (normalize) the sentence in full words (word forms).

    Training dataset: 64,665 sentence pairs. Validation dataset: 7,185 sentence pairs. Testing dataset: 7,984 sentence pairs.

    All sentences are extracted from a public web corpus (https://korpuss.lv/id/Tīmeklis2020) and contain at least one medical term.
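
    A minimal sketch of consuming such sentence pairs for a text-to-text normalization model follows. The file name and the column names ("source", "target") are assumptions made for illustration; the actual CSV header of the LVMED release may differ.

```python
import csv

# Read (abbreviated sentence, normalized sentence) pairs from a CSV file.
# File name and column names are assumed, not documented fields of the release.

pairs = []
with open("lvmed_train.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pairs.append((row["source"], row["target"]))   # abbreviated -> fully written out

print(f"{len(pairs)} training pairs loaded")
print(pairs[0])
```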

  11. WLCI - Important Agricultural Lands Assessment (Input Raster: Normalized...

    • catalog.data.gov
    • data.usgs.gov
    • +2 more
    Updated Oct 30, 2025
    + more versions
    Cite
    U.S. Geological Survey (2025). WLCI - Important Agricultural Lands Assessment (Input Raster: Normalized Antelope Damage Claims) [Dataset]. https://catalog.data.gov/dataset/wlci-important-agricultural-lands-assessment-input-raster-normalized-antelope-damage-claim
    Explore at:
    Dataset updated
    Oct 30, 2025
    Dataset provided by
    U.S. Geological Survey
    Description

    The values in this raster are unit-less scores ranging from 0 to 1 that represent normalized dollars per acre damage claims from antelope on Wyoming lands. This raster is one of 9 inputs used to calculate the "Normalized Importance Index."
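
    The description says the raster holds unit-less 0-1 scores derived from dollars-per-acre damage claims. One standard way to produce such scores is min-max scaling; the sketch below illustrates that transformation on hypothetical values (the USGS metadata does not state its exact procedure, so treat this only as an illustration).

```python
import numpy as np

# Min-max scaling of hypothetical dollars-per-acre values to unit-less 0-1 scores.
damage_per_acre = np.array([0.0, 12.5, 40.0, 87.5, 125.0])  # made-up raw values

lo, hi = damage_per_acre.min(), damage_per_acre.max()
normalized = (damage_per_acre - lo) / (hi - lo)

print(normalized)   # [0.   0.1  0.32 0.7  1.  ]
```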

  12. Flight Data Normalization Platform Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Oct 4, 2025
    Cite
    Growth Market Reports (2025). Flight Data Normalization Platform Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/flight-data-normalization-platform-market
    Explore at:
    Available download formats: pptx, csv, pdf
    Dataset updated
    Oct 4, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Flight Data Normalization Platform Market Outlook



    According to our latest research, the global flight data normalization platform market size reached USD 1.12 billion in 2024, exhibiting robust industry momentum. The market is projected to grow at a CAGR of 10.3% from 2025 to 2033, reaching an estimated value of USD 2.74 billion by 2033. This growth is primarily driven by the increasing adoption of advanced analytics in aviation, the rising need for operational efficiency, and the growing emphasis on regulatory compliance and safety enhancements across the aviation sector.




    A key growth factor for the flight data normalization platform market is the rapid digital transformation within the aviation industry. Airlines, airports, and maintenance organizations are increasingly relying on digital platforms to aggregate, process, and normalize vast volumes of flight data generated by modern aircraft systems. The transition from legacy systems to integrated digital solutions is enabling real-time data analysis, predictive maintenance, and enhanced situational awareness. This shift is not only improving operational efficiency but also reducing downtime and maintenance costs, making it an essential strategy for airlines and operators aiming to remain competitive in a highly regulated environment.




    Another significant driver fueling the expansion of the flight data normalization platform market is the stringent regulatory landscape governing aviation safety and compliance. Aviation authorities worldwide, such as the Federal Aviation Administration (FAA) and the European Union Aviation Safety Agency (EASA), are mandating the adoption of advanced flight data monitoring and normalization solutions to ensure adherence to safety protocols and to facilitate incident investigation. These regulatory requirements are compelling aviation stakeholders to invest in platforms that can seamlessly normalize and analyze data from diverse sources, thereby supporting proactive risk management and compliance reporting.




    Additionally, the growing complexity of aircraft systems and the proliferation of connected devices in aviation have led to an exponential increase in the volume and variety of flight data. The need to harmonize disparate data formats and sources into a unified, actionable format is driving demand for sophisticated flight data normalization platforms. These platforms enable stakeholders to extract actionable insights from raw flight data, optimize flight operations, and support advanced analytics use cases such as fuel efficiency optimization, fleet management, and predictive maintenance. As the aviation industry continues to embrace data-driven decision-making, the demand for robust normalization solutions is expected to intensify.




    Regionally, North America continues to dominate the flight data normalization platform market owing to the presence of major airlines, advanced aviation infrastructure, and early adoption of digital technologies. Europe is also witnessing significant growth, driven by stringent safety regulations and increasing investments in aviation digitization. Meanwhile, the Asia Pacific region is emerging as a lucrative market, fueled by rapid growth in air travel, expanding airline fleets, and government initiatives to modernize aviation infrastructure. Latin America and the Middle East & Africa are gradually embracing these platforms, supported by ongoing efforts to enhance aviation safety and operational efficiency.





    Component Analysis



    The component segment of the flight data normalization platform market is broadly categorized into software, hardware, and services. The software segment accounts for the largest share, driven by the increasing adoption of advanced analytics, machine learning, and artificial intelligence technologies for data processing and normalization. Software solutions are essential for aggregating raw flight data from multiple sources, standardizing formats, and providing actionable insights for decision-makers. With the rise of clou

  13. ARCS White Beam Vanadium Normalization Data for SNS Cycle 2022B

    • osti.gov
    Updated Feb 12, 2025
    Cite
    High Flux Isotope Reactor (HFIR) & Spallation Neutron Source (SNS), Oak Ridge National Laboratory (ORNL) (2025). ARCS White Beam Vanadium Normalization Data for SNS Cycle 2022B [Dataset]. http://doi.org/10.14461/oncat.data/2515590
    Explore at:
    Dataset updated
    Feb 12, 2025
    Dataset provided by
    Department of Energy Basic Energy Sciences Program (http://science.energy.gov/user-facilities/basic-energy-sciences/)
    Office of Science (http://www.er.doe.gov/)
    High Flux Isotope Reactor (HFIR) & Spallation Neutron Source (SNS), Oak Ridge National Laboratory (ORNL)
    Spallation Neutron Source (SNS)
    Description

    Neutron scattering data from a vanadium cylinder, acquired on the ARCS spectrometer in white-beam mode to normalize the detector efficiencies during Cycle 2022B.

  14. Affymetrix Normalization Required Files in GIANT tool suite

    • search.datacite.org
    Updated Jun 25, 2020
    Cite
    Julie Dubois-Chevalier (2020). Affymetrix Normalization Required Files in GIANT tool suite [Dataset]. http://doi.org/10.5281/zenodo.3908285
    Explore at:
    Dataset updated
    Jun 25, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    DataCite (https://www.datacite.org/)
    Authors
    Julie Dubois-Chevalier
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This archive contains the Affymetrix files necessary to normalize microarray data, along with the modified annotation files required by the GIANT APT-Normalize tool to annotate the normalized data.

  15. Data from: Adapting Phrase-based Machine Translation to Normalise Medical...

    • data-staging.niaid.nih.gov
    • data.niaid.nih.gov
    • +1 more
    Updated Jan 24, 2020
    Cite
    Limsopatham, Nut; Collier, Nigel (2020). Adapting Phrase-based Machine Translation to Normalise Medical Terms in Social Media Messages [Dataset]. https://data-staging.niaid.nih.gov/resources?id=zenodo_27354
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    University of Cambridge
    Authors
    Limsopatham, Nut; Collier, Nigel
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Data and supplementary information for the paper entitled "Adapting Phrase-based Machine Translation to Normalise Medical Terms in Social Media Messages" to be published at EMNLP 2015: Conference on Empirical Methods in Natural Language Processing — September 17–21, 2015 — Lisboa, Portugal.

    ABSTRACT: Previous studies have shown that health reports in social media, such as DailyStrength and Twitter, have potential for monitoring health conditions (e.g. adverse drug reactions, infectious diseases) in particular communities. However, in order for a machine to understand and make inferences on these health conditions, the ability to recognise when laymen's terms refer to a particular medical concept (i.e. text normalisation) is required. To achieve this, we propose to adapt an existing phrase-based machine translation (MT) technique and a vector representation of words to map between a social media phrase and a medical concept. We evaluate our proposed approach using a collection of phrases from tweets related to adverse drug reactions. Our experimental results show that the combination of a phrase-based MT technique and the similarity between word vector representations outperforms the baselines that apply only either of them by up to 55%.
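
    As an illustrative sketch of the word-vector similarity component mentioned in the abstract (not the authors' phrase-based MT system; the vectors below are random stand-ins for trained embeddings, and the vocabulary is invented):

```python
import numpy as np

# Toy illustration: map a social media phrase to the closest medical concept by
# cosine similarity of averaged word vectors. Vectors are random placeholders.

rng = np.random.default_rng(3)
vocab = {w: rng.normal(size=50) for w in
         ["head", "hurts", "really", "bad", "headache", "nausea", "fatigue"]}

def phrase_vector(phrase: str) -> np.ndarray:
    vecs = [vocab[w] for w in phrase.lower().split() if w in vocab]
    return np.mean(vecs, axis=0)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

concepts = ["headache", "nausea", "fatigue"]
query = phrase_vector("my head really hurts bad")
best = max(concepts, key=lambda c: cosine(query, vocab[c]))
print(best)   # with trained embeddings, "headache" would be the expected match
```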

  16. Input data and some models (all except multi-model ensembles) for JAMES...

    • zenodo.org
    tar
    Updated Nov 8, 2023
    Cite
    Ryan Lagerquist; Ryan Lagerquist (2023). Input data and some models (all except multi-model ensembles) for JAMES paper "Machine-learned uncertainty quantification is not magic" [Dataset]. http://doi.org/10.5281/zenodo.10081205
    Explore at:
    Available download formats: tar
    Dataset updated
    Nov 8, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Ryan Lagerquist; Ryan Lagerquist
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The tar file contains two directories: data and models. Within "data," there are 4 subdirectories: "training" (the clean training data -- without perturbations), "training_all_perturbed_for_uq" (the lightly perturbed training data), "validation_all_perturbed_for_uq" (the moderately perturbed validation data), and "testing_all_perturbed_for_uq" (the heavily perturbed testing data). The data in these directories are unnormalized. The subdirectories "training" and "training_all_perturbed_for_uq" each contain a normalization file. These normalization files contain parameters used to normalize the data (from physical units to z-scores) for Experiment 1 and Experiment 2, respectively. To do the normalization, you can use the script normalize_examples.py in the code library (ml4rt) with the argument input_normalization_file_name set to one of these two file paths. The other arguments should be as follows (a sketch of the z-score step itself appears after this list):

    --uniformize=1

    --predictor_norm_type_string="z_score"

    --vector_target_norm_type_string=""

    --scalar_target_norm_type_string=""
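
    As referenced above, a minimal sketch of the physical-units-to-z-scores step that the normalization files parameterize. This is not the ml4rt normalize_examples.py implementation; the predictor name and parameters below are invented for illustration, standing in for the stored normalization parameters.

```python
import numpy as np

# z-score normalization: physical units -> standardized values. The stored
# normalization parameters would supply `mean` and `std` for each predictor.

rng = np.random.default_rng(0)
temperature_k = rng.normal(loc=255.0, scale=20.0, size=1000)   # hypothetical predictor

mean, std = temperature_k.mean(), temperature_k.std()          # training-set parameters
z = (temperature_k - mean) / std                               # normalized predictor

print(round(z.mean(), 3), round(z.std(), 3))   # ~0.0 and ~1.0 by construction
```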

    Within the directory "models," there are 6 subdirectories: for the BNN-only models trained with clean and lightly perturbed data, for the CRPS-only models trained with clean and lightly perturbed data, and for the BNN/CRPS models trained with clean and lightly perturbed data. To read the models into Python, you can use the method neural_net.read_model in the ml4rt library.

  17. Cloud EHR Data Normalization Platforms Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 30, 2025
    Cite
    Dataintelo (2025). Cloud EHR Data Normalization Platforms Market Research Report 2033 [Dataset]. https://dataintelo.com/report/cloud-ehr-data-normalization-platforms-market
    Explore at:
    Available download formats: pptx, pdf, csv
    Dataset updated
    Sep 30, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Cloud EHR Data Normalization Platforms Market Outlook



    According to our latest research, the global Cloud EHR Data Normalization Platforms market size in 2024 reached USD 1.2 billion, reflecting robust adoption across healthcare sectors worldwide. The market is experiencing a strong growth trajectory, with a compound annual growth rate (CAGR) of 16.5% projected from 2025 to 2033. By the end of 2033, the market is expected to attain a value of approximately USD 4.3 billion. This expansion is primarily fueled by the rising demand for integrated healthcare data systems, the proliferation of electronic health records (EHRs), and the critical need for seamless interoperability between disparate healthcare IT systems.




    One of the principal growth factors driving the Cloud EHR Data Normalization Platforms market is the global healthcare sector's increasing focus on digitization and interoperability. As healthcare organizations strive to improve patient outcomes and operational efficiencies, the adoption of cloud-based EHR data normalization solutions has become essential. These platforms enable the harmonization of heterogeneous data sources, ensuring that clinical, administrative, and financial data are standardized across multiple systems. This standardization is critical for supporting advanced analytics, clinical decision support, and population health management initiatives. Moreover, the growing adoption of value-based care models is compelling healthcare providers to invest in technologies that facilitate accurate data aggregation and reporting, further propelling market growth.




    Another significant growth catalyst is the rapid advancement in cloud computing technologies and the increasing availability of scalable, secure cloud infrastructure. Cloud EHR data normalization platforms leverage these technological advancements to offer healthcare organizations flexible deployment options, robust data security, and real-time access to normalized datasets. The scalability of cloud platforms allows healthcare providers to efficiently manage large volumes of data generated from diverse sources, including EHRs, laboratory systems, imaging centers, and wearable devices. Additionally, the integration of artificial intelligence and machine learning algorithms into these platforms enhances their ability to map, clean, and standardize data with greater accuracy and speed, resulting in improved clinical and operational insights.




    Regulatory and compliance requirements are also playing a pivotal role in shaping the growth trajectory of the Cloud EHR Data Normalization Platforms market. Governments and regulatory bodies across major regions are mandating the adoption of interoperable health IT systems to improve patient safety, data privacy, and care coordination. Initiatives such as the 21st Century Cures Act in the United States and similar regulations in Europe and Asia Pacific are driving healthcare organizations to implement advanced data normalization solutions. These platforms help ensure compliance with data standards such as HL7, FHIR, and SNOMED CT, thereby reducing the risk of data silos and enhancing the continuity of care. As a result, the market is witnessing increased investments from both public and private stakeholders aiming to modernize healthcare IT infrastructure.




    From a regional perspective, North America holds the largest share of the Cloud EHR Data Normalization Platforms market, driven by the presence of advanced healthcare infrastructure, high EHR adoption rates, and supportive regulatory frameworks. Europe follows closely, with significant investments in health IT modernization and interoperability initiatives. The Asia Pacific region is emerging as a high-growth market due to rising healthcare expenditures, expanding digital health initiatives, and increasing awareness about the benefits of data normalization. Latin America and the Middle East & Africa are also witnessing gradual adoption, supported by ongoing healthcare reforms and investments in digital health technologies. Collectively, these regional dynamics underscore the global momentum toward interoperable, cloud-based healthcare data ecosystems.



    Component Analysis



    The Cloud EHR Data Normalization Platforms market is segmented by component into software and services, each playing a distinct and critical role in driving the market's growth. Software solutions form the technological backbone of the market, enabling healthcare organizations to autom

  18. Data from: White Beam Normalization

    • osti.gov
    Updated Jun 28, 2023
    Cite
    Abernathy, Douglas; Goyette, Rick; Granroth, Garrett (2023). White Beam Normalization [Dataset]. https://www.osti.gov/dataexplorer/biblio/1987352
    Explore at:
    Dataset updated
    Jun 28, 2023
    Dataset provided by
    USDOE Office of Science (SC), Basic Energy Sciences (BES)
    High Flux Isotope Reactor (HFIR) & Spallation Neutron Source (SNS), Oak Ridge National Laboratory (ORNL); Spallation Neutron Source (SNS)
    Authors
    Abernathy, Douglas; Goyette, Rick; Granroth, Garrett
    Description

    Raw data used to normalize detector performance on the ARCS instrument for the run cycle starting in June 2023. This is a vanadium cylinder; the T0 chopper is set to 150 Hz and phased for 300 meV. All Fermi choppers are out of the beam.

  19. Equipment Runtime Normalization Analytics Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Oct 1, 2025
    Cite
    Dataintelo (2025). Equipment Runtime Normalization Analytics Market Research Report 2033 [Dataset]. https://dataintelo.com/report/equipment-runtime-normalization-analytics-market
    Explore at:
    Available download formats: pdf, csv, pptx
    Dataset updated
    Oct 1, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Equipment Runtime Normalization Analytics Market Outlook



    According to our latest research, the global Equipment Runtime Normalization Analytics market size reached USD 2.31 billion in 2024, demonstrating robust momentum across diverse industrial sectors. The market is expected to grow at a CAGR of 12.8% from 2025 to 2033, reaching a forecasted value of USD 6.88 billion by 2033. This remarkable growth is primarily driven by the increasing adoption of industrial automation, the proliferation of IoT-enabled equipment, and the rising need for predictive maintenance and operational efficiency across manufacturing, energy, and other critical industries.




    A key growth factor for the Equipment Runtime Normalization Analytics market is the accelerating pace of digital transformation within asset-intensive industries. As organizations strive to maximize the productivity and lifespan of their machinery, there is a growing emphasis on leveraging advanced analytics to normalize equipment runtime data across heterogeneous fleets and varying operational contexts. The integration of AI and machine learning algorithms enables enterprises to standardize runtime metrics, providing a unified view of equipment performance regardless of manufacturer, model, or deployment environment. This normalization is crucial for benchmarking, identifying inefficiencies, and implementing data-driven maintenance strategies that reduce unplanned downtime and optimize resource allocation.




    Another significant driver is the rise of Industry 4.0 and the increasing connectivity of industrial assets through IoT sensors and cloud-based platforms. These technological advancements have generated an unprecedented volume of equipment performance data, necessitating sophisticated analytics solutions capable of normalizing and interpreting runtime information at scale. Equipment Runtime Normalization Analytics platforms facilitate seamless data aggregation from disparate sources, allowing organizations to derive actionable insights that enhance operational agility and competitiveness. Additionally, the shift towards outcome-based service models in sectors such as manufacturing, energy, and transportation is fueling demand for analytics that can accurately measure and compare equipment utilization, efficiency, and reliability across diverse operational scenarios.




    The growing focus on sustainability and regulatory compliance is also propelling the adoption of Equipment Runtime Normalization Analytics. As governments and industry bodies impose stricter standards on energy consumption, emissions, and equipment maintenance, enterprises are increasingly turning to analytics tools that can provide standardized, auditable reports on equipment runtime and performance. These solutions not only help organizations meet compliance requirements but also support sustainability initiatives by identifying opportunities to reduce energy consumption, minimize waste, and extend equipment lifecycles. The convergence of these market forces is expected to sustain strong demand for Equipment Runtime Normalization Analytics solutions in the years ahead.




    Regionally, North America currently leads the market, accounting for the largest share in 2024, followed closely by Europe and Asia Pacific. The dominance of North America can be attributed to the early adoption of industrial IoT, advanced analytics, and a mature manufacturing base. Europe’s strong emphasis on sustainability and regulatory compliance further drives adoption, while Asia Pacific is emerging as a high-growth region due to rapid industrialization, government initiatives to modernize manufacturing, and increasing investments in smart factory technologies. Latin America and the Middle East & Africa are also witnessing steady growth, supported by expanding industrial infrastructure and the increasing penetration of digital technologies.



    Component Analysis



    The Component segment of the Equipment Runtime Normalization Analytics market is categorized into Software, Hardware, and Services. Software solutions form the backbone of this market, comprising advanced analytics platforms, AI-driven data processing engines, and visualization tools that enable users to normalize and interpret equipment runtime data. These software offerings are designed to aggregate data from multiple sources, apply normalization algorithms, and generate actionable insights for operational decision-making. The demand for robust

  20. Data from: proteiNorm – A User-Friendly Tool for Normalization and Analysis...

    • datasetcatalog.nlm.nih.gov
    Updated Sep 30, 2020
    + more versions
    Cite
    Byrd, Alicia K; Zafar, Maroof K; Graw, Stefan; Tang, Jillian; Byrum, Stephanie D; Peterson, Eric C.; Bolden, Chris (2020). proteiNorm – A User-Friendly Tool for Normalization and Analysis of TMT and Label-Free Protein Quantification [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000568582
    Explore at:
    Dataset updated
    Sep 30, 2020
    Authors
    Byrd, Alicia K; Zafar, Maroof K; Graw, Stefan; Tang, Jillian; Byrum, Stephanie D; Peterson, Eric C.; Bolden, Chris
    Description

    The technological advances in mass spectrometry allow us to collect more comprehensive data with higher quality and increasing speed. With the rapidly increasing amount of data generated, the need for streamlining analyses becomes more apparent. Proteomics data is known to be often affected by systemic bias from unknown sources, and failing to adequately normalize the data can lead to erroneous conclusions. To allow researchers to easily evaluate and compare different normalization methods via a user-friendly interface, we have developed “proteiNorm”. The current implementation of proteiNorm accommodates preliminary filters on peptide and sample levels followed by an evaluation of several popular normalization methods and visualization of missing values. The user then selects an adequate normalization method and one of the several imputation methods used for the subsequent comparison of different differential expression methods and estimation of statistical power. The application of proteiNorm and interpretation of its results are demonstrated on two tandem mass tag multiplex (TMT6plex and TMT10plex) and one label-free spike-in mass spectrometry example data set. The three data sets reveal how the normalization methods perform differently on different experimental designs and the need for evaluation of normalization methods for each mass spectrometry experiment. With proteiNorm, we provide a user-friendly tool to identify an adequate normalization method and to select an appropriate method for differential expression analysis.
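
    As an illustration of the kind of normalization such tools evaluate (proteiNorm itself is an R application, and median-centering is just one common method of this family, not necessarily proteiNorm's own default), a short sketch on simulated protein intensities:

```python
import numpy as np

# Median-centering of log-intensities so every sample shares the same median,
# removing global sample-loading differences before differential expression.
# Simulated data; not proteiNorm code.

rng = np.random.default_rng(5)
intensities = rng.lognormal(mean=15, sigma=1, size=(500, 6))   # 500 proteins x 6 samples

log_int = np.log2(intensities)
sample_medians = np.median(log_int, axis=0)
normalized = log_int - sample_medians + sample_medians.mean()

print(np.median(normalized, axis=0).round(3))   # identical medians across samples
```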
