100+ datasets found
  1. Areas where global marketers saw the greatest benefits of improving data...

    • statista.com
    Updated Jul 2, 2025
    Cite
    Statista (2025). Areas where global marketers saw the greatest benefits of improving data quality 2023 [Dataset]. https://www.statista.com/forecasts/1372895/areas-where-global-marketers-saw-the-greatest-benefits-of-improving-data-quality-2023
    Dataset updated
    Jul 2, 2025
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Jan 16, 2023 - Jan 22, 2023
    Area covered
    Worldwide
    Description

    During a January 2023 global survey among marketing decision-makers, ** percent said customer experience benefitted the most from improving marketing data quality. Around ** percent of respondents mentioned engagement, while ** percent cited lead generation.

  2. Data from: DATA QUALITY ON THE WEB: INTEGRATIVE REVIEW OF PUBLICATION...

    • scielo.figshare.com
    tiff
    Updated May 30, 2023
    Cite
    Morgana Carneiro de Andrade; Maria José Baños Moreno; Juan-Antonio Pastor-Sánchez (2023). DATA QUALITY ON THE WEB: INTEGRATIVE REVIEW OF PUBLICATION GUIDELINES [Dataset]. http://doi.org/10.6084/m9.figshare.22815541.v1
    Explore at:
    tiff (available download formats)
    Dataset updated
    May 30, 2023
    Dataset provided by
    SciELO (http://www.scielo.org/)
    Authors
    Morgana Carneiro de Andrade; Maria José Baños Moreno; Juan-Antonio Pastor-Sánchez
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    ABSTRACT The exponential increase in published data and the diversity of systems require the adoption of good practices to achieve quality levels that enable discovery, access, and reuse. To identify good practices, an integrative review was used, together with procedures from the ProKnow-C methodology. After applying the ProKnow-C procedures to the documents retrieved from the Web of Science, Scopus, and Library, Information Science & Technology Abstracts databases, 31 items were analyzed. This analysis showed that, over the last 20 years, guidelines for publishing open government data had a great impact on the implementation of the Linked Data model in several domains, and that currently the FAIR principles and the Data on the Web Best Practices are the most highlighted in the literature. These guidelines provide guidance on various aspects of data publication in order to optimize quality, independently of the context in which they are applied. The CARE and FACT principles, on the other hand, although not formulated with the same objective as FAIR and the Best Practices, represent great challenges for information and technology scientists regarding ethics, responsibility, confidentiality, impartiality, security, and transparency of data.

  3. Cloud Data Quality Monitoring and Testing Market Report | Global Forecast...

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 5, 2024
    Cite
    Dataintelo (2024). Cloud Data Quality Monitoring and Testing Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/global-cloud-data-quality-monitoring-and-testing-market
    Explore at:
    pdf, pptx, csv (available download formats)
    Dataset updated
    Sep 5, 2024
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Cloud Data Quality Monitoring and Testing Market Outlook



    The global cloud data quality monitoring and testing market size was valued at USD 1.5 billion in 2023 and is expected to reach USD 4.8 billion by 2032, growing at a compound annual growth rate (CAGR) of 13.8% during the forecast period. This robust growth is driven by increasing cloud adoption across various industries, coupled with the rising need for ensuring data quality and compliance.
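    The growth projection above is a straightforward compound-growth calculation; a quick sketch (using only the report's stated figures) confirms the arithmetic:

    ```python
    # Sanity check: USD 1.5 billion in 2023 compounded at a 13.8% CAGR
    # over the 9 years to 2032 should land near the reported USD 4.8 billion.
    def project(value, cagr, years):
        """Compound a starting value at a constant annual growth rate."""
        return value * (1 + cagr) ** years

    projected = project(1.5, 0.138, 2032 - 2023)
    print(round(projected, 2))  # ≈ 4.8 (USD billion), matching the report
    ```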



    One of the primary growth factors of the cloud data quality monitoring and testing market is the exponential increase in data generation and consumption. As organizations continue to integrate cloud solutions, the volume of data being processed and stored on the cloud has surged dramatically. This data influx necessitates stringent quality monitoring to ensure data integrity, accuracy, and consistency, thus driving the demand for advanced data quality solutions. Moreover, as businesses enhance their data-driven decision-making processes, the need for high-quality data becomes ever more critical, further propelling market growth.



    Another significant driver is the growing complexity of data architectures due to diverse data sources and types. The modern data environment is characterized by a mix of structured, semi-structured, and unstructured data originating from various sources like IoT devices, social media platforms, and enterprise applications. Ensuring the quality of such heterogeneous data sets requires sophisticated monitoring and testing tools that can seamlessly operate within cloud ecosystems. Consequently, organizations are increasingly investing in cloud data quality solutions to manage this complexity, thereby fueling market expansion.



    Compliance and regulatory requirements also play a pivotal role in the growth of the cloud data quality monitoring and testing market. Industries such as BFSI, healthcare, and government are subject to stringent data governance and privacy regulations that mandate regular auditing and validation of data quality. Failure to comply with these regulations can result in severe penalties and reputational damage. Hence, companies are compelled to adopt cloud data quality monitoring and testing solutions to ensure compliance and mitigate risks associated with data breaches and inaccuracies.



    From a regional perspective, North America dominates the market due to its advanced IT infrastructure and early adoption of cloud technologies. However, significant growth is also expected in the Asia Pacific region, driven by rapid digital transformation initiatives and increasing investments in cloud infrastructure by emerging economies like China and India. Europe also presents substantial growth opportunities, with industries embracing cloud solutions to enhance operational efficiency and innovation. The regional dynamics indicate a wide-ranging impact of cloud data quality monitoring and testing solutions across the globe.



    Component Analysis



    The cloud data quality monitoring and testing market is broadly segmented into software and services. The software segment encompasses various tools and platforms designed to automate and streamline data quality monitoring processes. These solutions include data profiling, data cleansing, data integration, and master data management software. The demand for such software is on the rise due to its ability to provide real-time insights into data quality issues, thereby enabling organizations to take proactive measures in addressing discrepancies. Advanced software solutions often leverage AI and machine learning algorithms to enhance data accuracy and predictive capabilities.
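    As an illustration of the kind of checks such profiling and cleansing tools automate, here is a minimal, library-free sketch (not any vendor's actual API) that measures completeness and key uniqueness over a batch of records:

    ```python
    # Illustrative data-profiling checks: completeness (null rate) and
    # uniqueness (duplicate keys) over a list of record dicts.
    from collections import Counter

    def profile(records, key_field, required_fields):
        """Return simple data-quality metrics for a list of record dicts."""
        n = len(records)
        missing = {f: sum(1 for r in records if r.get(f) in (None, ""))
                   for f in required_fields}
        keys = [r.get(key_field) for r in records]
        duplicates = [k for k, c in Counter(keys).items() if c > 1]
        return {
            "rows": n,
            "null_rate": {f: missing[f] / n for f in required_fields},
            "duplicate_keys": duplicates,
        }

    records = [
        {"id": 1, "email": "a@example.com"},
        {"id": 2, "email": ""},               # incomplete record
        {"id": 2, "email": "c@example.com"},  # duplicate key
    ]
    report = profile(records, "id", ["email"])
    print(report["duplicate_keys"])  # [2]
    ```

    Production platforms layer validity rules, cross-source reconciliation, and ML-based anomaly detection on top of elementary checks like these.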



    The services segment is equally crucial, offering a gamut of professional and managed services to support the implementation and maintenance of data quality monitoring systems. Professional services include consulting, system integration, and training services, which help organizations in the seamless adoption of data quality tools and best practices. Managed services, on the other hand, provide ongoing support and maintenance, ensuring that data quality standards are consistently met. As organizations seek to optimize their cloud data environments, the demand for comprehensive service offerings is expected to rise, driving market growth.



    One of the key trends within the component segment is the increasing integration of software and services to offer holistic data quality solutions. Vendors are increasingly bundling their software products with complementary services, providing a one-stop solution that covers all aspects of data quality management.

  4. Percentage of papers that check aspects of data quality for online...

    • datasetcatalog.nlm.nih.gov
    Updated Sep 11, 2019
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Bieler, Rüdiger; LaFrance, Raphael; Brenskelle, Laura; Sierwald, Petra; Ball-Damerow, Joan E.; Ariño, Arturo H.; Soltis, Pamela S.; Barve, Narayani; Guralnick, Robert P. (2019). Percentage of papers that check aspects of data quality for online occurrence data—Separated by the six top uses of these databases. [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000111236
    Dataset updated
    Sep 11, 2019
    Authors
    Bieler, Rüdiger; LaFrance, Raphael; Brenskelle, Laura; Sierwald, Petra; Ball-Damerow, Joan E.; Ariño, Arturo H.; Soltis, Pamela S.; Barve, Narayani; Guralnick, Robert P.
    Description

    Nine data types with the lowest percentages were removed from table. The top data type for each research use is bolded, and percentage values above 10% are highlighted yellow (10–29%), orange (30–49%), and red (>50%).

  5. Commonly Used Data Elements

    • catalog.data.gov
    Updated Jul 6, 2025
    Cite
    FEMA/Off of Policy & Pgm Analysis/ENTERPRISE ANALYTICS DIV (2025). Commonly Used Data Elements [Dataset]. https://catalog.data.gov/dataset/commonly-used-data-elements
    Dataset updated
    Jul 6, 2025
    Dataset provided by
    Federal Emergency Management Agency (http://www.fema.gov/)
    Description

    The Commonly Used Data Element Standard aims to update and consolidate FEMA's existing data element standards, ensuring a consistent and high-quality approach to managing commonly used data elements. Incorrect data ingestion can lead to decision-making errors and leave analysts with limited resources to correct them. Ensuring data quality begins with identifying and structuring key data elements that are vital to business operations. By standardizing the management of these elements, we can eliminate inconsistencies, reduce errors, and foster a culture of data excellence. This document provides comprehensive guidelines for adopting these data elements across systems and analytical reports to enhance data accuracy and support our strategic objectives. While compliance with these standards is not mandatory, it is highly recommended in order to achieve the full benefits of standardized, high-quality data practices within FEMA. By offering these standards as flexible recommendations rather than strict requirements, we give programs and systems the flexibility to adapt while guiding FEMA towards exemplary data practices. Implementing these standards is crucial for improving our data management strategy, resulting in higher data quality and better alignment with our strategic goals.
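    To make the idea concrete, here is a hypothetical sketch of how a shared data-element standard can be enforced at ingestion time; the element names and patterns below are illustrative assumptions, not FEMA's actual standard:

    ```python
    # Hypothetical data-element standard enforced at ingestion time.
    # Element names and patterns are illustrative, not FEMA's actual standard.
    import re

    STANDARD = {
        "state_code": {"pattern": r"[A-Z]{2}"},              # e.g., "VA"
        "zip_code": {"pattern": r"\d{5}(-\d{4})?"},
        "declaration_date": {"pattern": r"\d{4}-\d{2}-\d{2}"},  # ISO 8601
    }

    def validate(record):
        """Return the list of fields that violate the element standard."""
        errors = []
        for field, rule in STANDARD.items():
            value = record.get(field, "")
            if not re.fullmatch(rule["pattern"], value):
                errors.append(field)
        return errors

    print(validate({"state_code": "VA", "zip_code": "23451",
                    "declaration_date": "2025-07-06"}))  # []
    ```

    Centralizing definitions like these in one place is what lets many systems reject malformed records consistently instead of each inventing its own checks.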

  6. Conceptualization of public data ecosystems

    • data.niaid.nih.gov
    Updated Sep 26, 2024
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Martin, Lnenicka (2024). Conceptualization of public data ecosystems [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_13842001
    Dataset updated
    Sep 26, 2024
    Dataset provided by
    Martin, Lnenicka
    Anastasija, Nikiforova
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains data collected during a study "Understanding the development of public data ecosystems: from a conceptual model to a six-generation model of the evolution of public data ecosystems" conducted by Martin Lnenicka (University of Hradec Králové, Czech Republic), Anastasija Nikiforova (University of Tartu, Estonia), Mariusz Luterek (University of Warsaw, Warsaw, Poland), Petar Milic (University of Pristina - Kosovska Mitrovica, Serbia), Daniel Rudmark (Swedish National Road and Transport Research Institute, Sweden), Sebastian Neumaier (St. Pölten University of Applied Sciences, Austria), Karlo Kević (University of Zagreb, Croatia), Anneke Zuiderwijk (Delft University of Technology, Delft, the Netherlands), Manuel Pedro Rodríguez Bolívar (University of Granada, Granada, Spain).

    As there is a lack of understanding of the elements that constitute different types of value-adding public data ecosystems, and of how these elements form and shape the development of these ecosystems over time, which can lead to misguided efforts to develop future public data ecosystems, the aim of the study is: (1) to explore how public data ecosystems have developed over time and (2) to identify the value-adding elements and formative characteristics of public data ecosystems. Using an exploratory retrospective analysis and a deductive approach, we systematically review 148 studies published between 1994 and 2023. Based on the results, the study presents a typology of public data ecosystems, develops a conceptual model of the elements and formative characteristics that contribute most to value-adding public data ecosystems, and proposes a model of the evolutionary generations of public data ecosystems, called the Evolutionary Model of Public Data Ecosystems (EMPDE). Finally, three avenues for a future research agenda are proposed.

    This dataset is being made public to act as supplementary data for "Understanding the development of public data ecosystems: from a conceptual model to a six-generation model of the evolution of public data ecosystems", Telematics and Informatics, and for the Systematic Literature Review component that informs the study.

    Description of the data in this data set

    PublicDataEcosystem_SLR provides the structure of the protocol.

    Spreadsheet #1 provides the list of results after the search over three indexing databases and filtering out irrelevant studies.

    Spreadsheet #2 provides the protocol structure.

    Spreadsheet #3 provides the filled protocol for relevant studies.

    The information on each selected study was collected in four categories: (1) descriptive information, (2) approach- and research design-related information, (3) quality-related information, (4) HVD determination-related information.

    Descriptive Information

    Article number

    A study number, corresponding to the study number assigned in an Excel worksheet

    Complete reference

    The complete source information to refer to the study (in APA style), including the author(s) of the study, the year in which it was published, the study's title and other source information.

    Year of publication

    The year in which the study was published.

    Journal article / conference paper / book chapter

    The type of the paper, i.e., journal article, conference paper, or book chapter.

    Journal / conference / book

    Journal article, conference, where the paper is published.

    DOI / Website

    A link to the website where the study can be found.

    Number of words

    A number of words of the study.

    Number of citations in Scopus and WoS

    The number of citations of the paper in Scopus and WoS digital libraries.

    Availability in Open Access

    Availability of a study in the Open Access or Free / Full Access.

    Keywords

    Keywords of the paper as indicated by the authors (in the paper).

    Relevance for our study (high / medium / low)

    What is the relevance level of the paper for our study

    Approach- and research design-related information


    Objective / Aim / Goal / Purpose & Research Questions

    The research objective and established RQs.

    Research method (including unit of analysis)

    The methods used to collect data in the study, including the unit of analysis that refers to the country, organisation, or other specific unit that has been analysed such as the number of use-cases or policy documents, number and scope of the SLR etc.

    Study’s contributions

    The study’s contribution as defined by the authors

    Qualitative / quantitative / mixed method

    Whether the study uses a qualitative, quantitative, or mixed-methods approach.

    Availability of the underlying research data

    Whether the paper has a reference to the public availability of the underlying research data (e.g., transcriptions of interviews, collected data), or explains why these data are not openly shared.

    Period under investigation

    Period (or moment) in which the study was conducted (e.g., January 2021-March 2022)

    Use of theory / theoretical concepts / approaches? If yes, specify them

    Does the study mention any theory / theoretical concepts / approaches? If yes, what theory / concepts / approaches? If any theory is mentioned, how is theory used in the study? (e.g., mentioned to explain a certain phenomenon, used as a framework for analysis, tested theory, theory mentioned in the future research section).

    Quality-related information

    Quality concerns

    Whether there are any quality concerns (e.g., limited information about the research methods used).

    Public Data Ecosystem-related information

    Public data ecosystem definition

    How the public data ecosystem is defined in the paper, including any equivalent term used (most often "infrastructure"). If an alternative term is used, what is the public data ecosystem called in the paper?

    Public data ecosystem evolution / development

    Does the paper define the evolution of the public data ecosystem? If yes, how is it defined and what factors affect it?

    What constitutes a public data ecosystem?

    What constitutes a public data ecosystem (components & relationships) - their "FORM / OUTPUT" presented in the paper (general description with more detailed answers to further additional questions).

    Components and relationships

    What components does the public data ecosystem consist of and what are the relationships between these components? Alternative names for components - element, construct, concept, item, helix, dimension etc. (detailed description).

    Stakeholders

    What stakeholders (e.g., governments, citizens, businesses, Non-Governmental Organisations (NGOs) etc.) does the public data ecosystem involve?

    Actors and their roles

    What actors does the public data ecosystem involve? What are their roles?

    Data (data types, data dynamism, data categories etc.)

    What data does the public data ecosystem cover (or is intended / designed for)? Refer to all data-related aspects, including but not limited to data types, data dynamism (static, dynamic, real-time, stream), prevailing data categories / domains / topics, etc.

    Processes / activities / dimensions, data lifecycle phases

    What processes, activities, dimensions and data lifecycle phases (e.g., locate, acquire, download, reuse, transform, etc.) does the public data ecosystem involve or refer to?

    Level (if relevant)

    What is the level of the public data ecosystem covered in the paper? (e.g., city, municipal, regional, national (=country), supranational, international).

    Other elements or relationships (if any)

    What other elements or relationships does the public data ecosystem consist of?

    Additional comments

    Additional comments (e.g., what other topics affected the public data ecosystems and their elements, what is expected to affect the public data ecosystems in the future, what were important topics by which the period was characterised etc.).

    New papers

    Does the study refer to any other potentially relevant papers?

    Additional references to potentially relevant papers that were found in the analysed paper (snowballing).

    Format of the file: .xls, .csv (for the first spreadsheet only), .docx

    Licenses or restrictions: CC-BY

    For more info, see README.txt

  7. Short-form data quality indicators: Canada, provinces and territories,...

    • www150.statcan.gc.ca
    • www12.statcan.gc.ca
    • +2 more
    Updated Aug 17, 2022
    Cite
    Government of Canada, Statistics Canada (2022). Short-form data quality indicators: Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions [Dataset]. http://doi.org/10.25318/9810016501-eng
    Dataset updated
    Aug 17, 2022
    Dataset provided by
    Statistics Canada (https://statcan.gc.ca/en)
    Area covered
    Canada
    Description

    Data on short-form data quality indicators for 2021 Census, Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions.

  8. Dataset and R code from RDA WG Discipline-Specific Guidance on DMP - Online...

    • zenodo.org
    zip
    Updated Sep 21, 2023
    Cite
    Briana Wham; Daniela Hausen; Ivonne Andres; Shannon Sheridan; Santosh Ilamparuthi; Yasemin Turkyilmaz-van der Velden (2023). Dataset and R code from RDA WG Discipline-Specific Guidance on DMP - Online Survey [Dataset]. http://doi.org/10.5281/zenodo.8367217
    Explore at:
    zip (available download formats)
    Dataset updated
    Sep 21, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Briana Wham; Daniela Hausen; Ivonne Andres; Shannon Sheridan; Santosh Ilamparuthi; Yasemin Turkyilmaz-van der Velden
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Dataset from the Online Survey of the Research Data Alliance's Discipline-Specific Guidance for Data Management Plans Working Group.


    The data was collected from November 8, 2021 to January 14, 2022.

    The survey was divided into the following areas after a brief introduction on "Purpose of this survey" and "Use of the information you provide."

    • Demographics
    • Data Description and Collection
    • Data Documentation & Quality
    • Data Archiving, Publishing & Sharing After the Project
    • Guidelines, Principles & Best Practices
    • Follow-up interviews and group discussions

    The analysis of the online survey was focused on the four areas: Natural Sciences, Life Sciences, Humanities & Social Sciences, and Engineering. The results of the evaluation will be presented in a separate publication.

    In addition to the data, the variables and values are also published here.

    The online survey questions can be accessed here: https://doi.org/10.5281/zenodo.7443373

    A more detailed analysis and description can be found in the paper "Discipline-specific Aspects in Data Management Planning" submitted to Data Science Journal (2022-12-15).

  9. Data from: Ethical Data Management

    • data.virginiabeach.gov
    • data.virginia.gov
    Updated Nov 23, 2022
    Cite
    City of Virginia Beach - Online Mapping (2022). Ethical Data Management [Dataset]. https://data.virginiabeach.gov/documents/2949ba73014d49fba67bb7717280a8aa
    Dataset updated
    Nov 23, 2022
    Dataset authored and provided by
    City of Virginia Beach - Online Mapping
    Description

    Ethical Data Management

    Executive Summary

    In the age of data and information, it is imperative that the City of Virginia Beach strategically utilize its data assets. Through expanding data access, improving quality, maintaining pace with advanced technologies, and strengthening capabilities, IT will ensure that the city remains at the forefront of digital transformation and innovation. The Data and Information Management team works under the purpose: “To promote a data-driven culture at all levels of the decision-making process by supporting and enabling business capabilities with relevant and accurate information that can be accessed securely anytime, anywhere, and from any platform.” To fulfill this mission, IT will implement and utilize new and advanced technologies, enhance data management and infrastructure, and expand internal capabilities and regional collaboration.

    Introduction and Justification

    The Information Technology (IT) department’s resources are integral to the social, political and economic welfare of the City of Virginia Beach residents. In regard to local administration, the IT department makes it possible for the Data and Information Management Team to provide the general public with high-quality services, generate and disseminate knowledge, and facilitate growth through improved productivity.

    For the Data and Information Management Team, it is important to maximize the quality and security of the City’s data; to develop and apply coherent information resource management policies that aim to keep the general public constantly informed, protect their rights as subjects, improve the productivity, efficiency, effectiveness and public return of its projects, and to promote responsible innovation. Furthermore, as technology evolves, it is important for public institutions to manage their information systems in such a way as to identify and minimize the security and privacy risks associated with the new capacities of those systems.

    The responsible and ethical use of data strategy is part of the City’s Master Technology Plan 2.0 (MTP), which establishes the roadmap designed to improve data and information accessibility, quality, and capabilities throughout the entire City. The strategy is being put into practice as a plan that involves various programs. Although these programs were specifically conceived as a conceptual framework for achieving a cultural change in terms of the public perception of data, they cover all the aspects of the MTP that concern data, in particular the open-data and data-commons strategies and data-driven projects, with the aim of providing better urban services and interoperability based on metadata schemes and open-data formats, permanent access, and data use and reuse, with the minimum possible legal, economic and technological barriers within current legislation.

    Fundamental values

    The City of Virginia Beach’s data is a strategic asset and a valuable resource that enables our local government to carry out its mission and its programs effectively. Appropriate access to municipal data significantly improves the value of the information and the return on the investment involved in generating it. In accordance with the Master Technology Plan 2.0 and its emphasis on public innovation, the digital economy and empowering city residents, this data-management strategy is based on the following considerations. Within this context, this new management and use of data has to respect and comply with the essential values applicable to data. For the Data and Information Team, these values are:

    • Shared municipal knowledge. Municipal data, in its broadest sense, has a significant social dimension and provides the general public with past, present and future knowledge concerning the government, the city, society, the economy and the environment.
    • The strategic value of data. The team must manage data as a strategic asset, with an innovative vision, in order to turn it into an intellectual asset for the organization.
    • Geared towards results. Municipal data is also a means of ensuring the administration’s accountability and transparency, managing services and investments, and maintaining and improving the performance of the economy, wealth and the general public’s well-being.
    • Data as a common asset. City residents and the common good have to be the central focus of the City of Virginia Beach’s plans and technological platforms. Data is a source of wealth that empowers people who have access to it. Making it possible for city residents to control the data, minimizing the digital gap and preventing discriminatory or unethical practices is the essence of municipal technological sovereignty.
    • Transparency and interoperability. Public institutions must be open, transparent and responsible towards the general public. Promoting openness and interoperability, subject to technical and legal requirements, increases the efficiency of operations, reduces costs, improves services, supports needs and increases public access to valuable municipal information. In this way, it also promotes public participation in government.
    • Reuse and open-source licenses. Making municipal information accessible, usable by everyone by default without prior permission, and analyzable by anyone who wishes to do so can foster entrepreneurship, social and digital innovation, jobs and excellence in scientific research, as well as improving the lives of Virginia Beach residents and making a significant contribution to the city’s stability and prosperity.
    • Quality and security. The city government must take firm steps to ensure and maximize the quality, objectivity, usefulness, integrity and security of municipal information before disclosing it, and maintain processes to handle requests for amendments to the publicly available information.
    • Responsible organization. Adding value to the data and turning it into an asset, with the aim of promoting accountability and citizens’ rights, requires new actions and new integrated procedures, so that the new platforms can grow in an organic, transparent and cross-departmental way. A comprehensive governance strategy makes it possible to promote this revision and avoid redundancies, increased costs, inefficiency and bad practices.
    • Care throughout the data’s life cycle. Paying attention to the management of municipal registers, from when they are created to when they are destroyed or preserved, is an essential part of data management and of promoting public responsibility. Care for data throughout its life cycle, combined with activities that ensure continued access to digital materials for as long as necessary, helps with the analytic exploitation of the data, the responsible protection of historic municipal government registers, and safeguarding the economic and legal rights of the municipal government and the city’s residents.
    • Privacy “by design”. Protecting privacy is of maximum importance. The Data and Information Management Team has to consider and protect individual and collective privacy during the data life cycle, systematically and verifiably, as specified in the general regulation for data protection.
    • Security. Municipal information is a strategic asset subject to risks, and it has to be managed in such a way as to minimize those risks. This includes privacy, data protection, algorithmic discrimination and cybersecurity risks, which must be specifically addressed by promoting ethical and responsible data architecture, techniques for improving privacy, and evaluation of social effects. Although security and privacy are two separate, independent fields, they are closely related, and it is essential for the units to take a coordinated approach in order to identify and manage cybersecurity and privacy risks under applicable requirements and standards.
    • Open Source. It is obligatory for the Data and Information Management Team to maintain its Open Data / Open Source platform. The platform allows citizens to access open data from multiple cities in a central location, supports regional universities and colleges in fostering continuing education, and aids in the development of data analytics skills for citizens. Continuing to uphold the Open Source platform will allow the City to continually offer citizens the ability to provide valuable input on the structure and availability of its data.

    Strategic areas

    In order to deploy the strategy for the responsible and ethical use of data, the following areas of action have been established, which we detail below together with the actions and emblematic projects associated with them. In general, the strategy pivots on the following general principles, which form the basis for the strategic areas described in this section:

    • Data sovereignty
    • Open data and transparency
    • The exchange and reuse of data
    • Political decision-making informed by data
    • The life cycle of data and continual or permanent access

    Data Governance

    Data quality and accessibility are crucial for meaningful data analysis, and must be ensured through the implementation of data governance. IT will establish a Data Governance Board, a collaborative organizational capability made up of the city’s data and analytics champions, who will work together to develop policies and practices to treat and use data as a strategic asset. Data governance is the overall management of the availability, usability, integrity and security of data used in the city. Increased data quality will positively impact overall trust in data, resulting in increased use and adoption. The ownership, accessibility, security, and quality of the data are defined and maintained by the Data Governance Board. To improve operational efficiency, an enterprise-wide data catalog will be created to inventory data and track metadata from various data sources to allow for rapid data asset discovery. Through the data catalog, the city will

  10. Compilation of wells sampled, physical characteristics of wells, links to...

    • data.usgs.gov
    • datasets.ai
    • +3more
    Updated Sep 19, 2017
    Cite
    Daniel Galeone (2017). Compilation of wells sampled, physical characteristics of wells, links to water-quality data, and quality assurance and quality control data for domestic wells sampled by the U.S. Geological Survey in Potter County, Pennsylvania, April-September 2017 [Dataset]. http://doi.org/10.5066/P9EBORD5
    Explore at:
    Dataset updated
    Sep 19, 2017
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Authors
    Daniel Galeone
    License

    U.S. Government Works (https://www.usa.gov/government-works)
    License information was derived automatically

    Time period covered
    Apr 24, 2017 - Sep 19, 2017
    Area covered
    Potter County, Pennsylvania
    Description

    This dataset contains the lithologic class and topographic position index information and quality-assurance and quality-control data not available in the online National Water Information System for 47 domestic wells sampled by the U.S. Geological Survey in Potter County, Pennsylvania, April-September 2017. The topographic position index (TPI) for each well location was computed on the basis of a 25-meter digital elevation model (U.S. Geological Survey, 2009) using criteria reported by Llewellyn (2014) to indicate potential classes for topographic setting. The bedrock geologic unit and primary lithology were determined for each well location on the basis of the digital bedrock geologic map of Pennsylvania (Miles and Whitfield, 2001). The quality-assurance and quality-control data (such as blanks or replicates) were collected at a subset of sites to ensure that the data met specific data-quality objectives outlined for the study.
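    The topographic position index used above is conventionally computed as a cell's elevation minus the mean elevation of its surrounding neighborhood: positive values suggest ridges or hilltops, negative values valleys, and near-zero values flat ground or mid-slopes. A minimal sketch of that computation, assuming a small in-memory elevation grid and a simple square moving window (illustrative only, not the USGS 25-meter DEM workflow or Llewellyn's classification criteria):

```python
import numpy as np

def topographic_position_index(dem: np.ndarray, radius: int = 1) -> np.ndarray:
    """TPI per cell: elevation minus the mean elevation of the
    (2*radius+1)^2 neighborhood, excluding the center cell."""
    h, w = dem.shape
    padded = np.pad(dem, radius, mode="edge")  # replicate edge values at the border
    tpi = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            neighbor_mean = (window.sum() - dem[i, j]) / (window.size - 1)
            tpi[i, j] = dem[i, j] - neighbor_mean
    return tpi

# Tiny synthetic DEM: a single peak in the center of a flat plain.
dem = np.array([
    [10.0, 10.0, 10.0],
    [10.0, 20.0, 10.0],
    [10.0, 10.0, 10.0],
])
tpi = topographic_position_index(dem)
print(tpi[1, 1])  # 10.0 -> the peak sits well above its neighborhood mean
```

    In practice the window radius is chosen to match the landscape scale of interest, and the resulting TPI values are binned into topographic classes (ridge, slope, valley) using study-specific thresholds.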

  11. Data Quality Assessment Areas (USACE IENC) | gimi9.com

    • gimi9.com
    Updated Oct 24, 2022
    Cite
    (2022). Data Quality Assessment Areas (USACE IENC) | gimi9.com [Dataset]. https://gimi9.com/dataset/data-gov_data-quality-assessment-areas-usace-ienc/
    Explore at:
    Dataset updated
    Oct 24, 2022
    Description

    🇺🇸 United States

  12. Data Quality Assessment Areas (USACE IENC)

    • hifld-geoplatform.opendata.arcgis.com
    Updated Jun 16, 2017
    Cite
    GeoPlatform ArcGIS Online (2017). Data Quality Assessment Areas (USACE IENC) [Dataset]. https://hifld-geoplatform.opendata.arcgis.com/datasets/data-quality-assessment-areas-usace-ienc
    Explore at:
    Dataset updated
    Jun 16, 2017
    Dataset authored and provided by
    GeoPlatform ArcGIS Online
    Area covered
    Description

    Quality of data. Definition: an area within which a uniform assessment of the quality of the data exists. Distinction: accuracy of data; survey reliability.

  13. Test Data Management Market Analysis, Size, and Forecast 2025-2029: North...

    • technavio.com
    pdf
    Updated May 1, 2025
    Cite
    Technavio, Test Data Management Market Analysis, Size, and Forecast 2025-2029: North America (US and Canada), Europe (France, Germany, Italy, and UK), APAC (Australia, China, India, and Japan), and Rest of World (ROW) [Dataset]. https://www.technavio.com/report/test-data-management-market-industry-analysis
    Explore at:
    Available download formats: pdf
    Dataset updated
    May 1, 2025
    Dataset provided by
    TechNavio
    Authors
    Technavio
    Time period covered
    2025 - 2029
    Area covered
    Canada, United States
    Description

    Snapshot img

    Test Data Management Market Size 2025-2029

    The test data management market size is forecast to increase by USD 727.3 million, at a CAGR of 10.5% between 2024 and 2029.

    The market is experiencing significant growth, driven by the increasing adoption of automation by enterprises to streamline their testing processes. The automation trend is fueled by the growing consumer spending on technological solutions, as businesses seek to improve efficiency and reduce costs. However, the market faces challenges, including the lack of awareness and standardization in test data management practices. This obstacle hinders the effective implementation of test data management solutions, requiring companies to invest in education and training to ensure successful integration. To capitalize on market opportunities and navigate challenges effectively, businesses must stay informed about emerging trends and best practices in test data management. By doing so, they can optimize their testing processes, reduce risks, and enhance overall quality.

    What will be the Size of the Test Data Management Market during the forecast period?

    Explore in-depth regional segment analysis with market size data - historical 2019-2023 and forecasts 2025-2029 - in the full report.
    The market continues to evolve, driven by the ever-increasing volume and complexity of data. Data exploration and analysis are at the forefront of this dynamic landscape, with data ethics and governance frameworks ensuring data transparency and integrity. Data masking, cleansing, and validation are crucial components of data management, enabling data warehousing, orchestration, and pipeline development.

    Data security and privacy remain paramount, with encryption, access control, and anonymization key strategies. Data governance, lineage, and cataloging facilitate data management software automation and reporting. Hybrid data management solutions, including artificial intelligence and machine learning, are transforming data insights and analytics. Data regulations and compliance are shaping the market, driving the need for data accountability and stewardship.

    Data visualization, mining, and reporting provide valuable insights, while data quality management, archiving, and backup ensure data availability and recovery. Data modeling, data integrity, and data transformation are essential for data warehousing and data lake implementations. Data management platforms are seamlessly integrated into these evolving patterns, enabling organizations to effectively manage their data assets and gain valuable insights. Data management services, cloud and on-premise, are essential for organizations to adapt to the continuous changes in the market and effectively leverage their data resources.
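    Of the preparation techniques named above, data masking is the most mechanical to illustrate. A minimal sketch of deterministic pseudonymization, one common masking approach (the salt, field choice, and helper name are illustrative assumptions, not any vendor's API):

```python
import hashlib

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Replace the local part of an email with a salted hash token.
    Deterministic: the same input always yields the same token, so
    masked records remain joinable across tables without exposing
    the original value."""
    local, _, domain = email.partition("@")
    token = hashlib.sha256((salt + local).encode()).hexdigest()[:12]
    return f"user_{token}@{domain}"

masked = mask_email("alice@example.com")
print(masked.endswith("@example.com"))  # True: domain kept for aggregate analysis
print(masked == mask_email("alice@example.com"))  # True: deterministic
```

    For test-data use, a per-environment salt keeps tokens stable within one environment while preventing linkage across environments.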

    How is this Test Data Management Industry segmented?

    The test data management industry research report provides comprehensive data (region-wise segment analysis), with forecasts and estimates in 'USD million' for the period 2025-2029, as well as historical data from 2019-2023, for the following segments.

    • Application: On-premises, Cloud-based
    • Component: Solutions, Services
    • End-user: Information technology, Telecom, BFSI, Healthcare and life sciences, Others
    • Sector: Large enterprise, SMEs
    • Geography: North America (US, Canada), Europe (France, Germany, Italy, UK), APAC (Australia, China, India, Japan), Rest of World (ROW)

    By Application Insights

    The on-premises segment is estimated to witness significant growth during the forecast period. In the realm of data management, on-premises testing represents a popular approach for businesses seeking control over their infrastructure and testing process. This approach involves establishing testing facilities within an office or data center, necessitating a dedicated team with the necessary skills. The benefits of on-premises testing extend beyond control, as it enables organizations to upgrade and configure hardware and software at their discretion, providing opportunities for exploratory testing.

    Furthermore, data security is a significant concern for many businesses, and on-premises testing reduces the risk of exposing sensitive information to third-party companies. Data exploration, a crucial aspect of data analysis, can be carried out more effectively with on-premises testing, ensuring data integrity and security. Data masking, cleansing, and validation are essential data preparation techniques that can be executed efficiently in an on-premises environment. Data warehousing, data pipelines, and data orchestration are integral components of data management, and on-premises testing allows for seamless integration and management of these elements.

    Data governance frameworks, lineage, catalogs, and metadata are essential for maintaining data transparency and compliance. Data security, encryption, and access control are paramount, and on-premises testing offers greater control over these aspects. Data reporting, visualization, and insigh

  14. Water-quality Data

    • catalog.data.gov
    • gimi9.com
    • +1more
    Updated Jul 6, 2024
    + more versions
    Cite
    U.S. Geological Survey (2024). Water-quality Data [Dataset]. https://catalog.data.gov/dataset/water-quality-data
    Explore at:
    Dataset updated
    Jul 6, 2024
    Dataset provided by
    United States Geological Survey (http://www.usgs.gov/)
    Description

    In 2015-2016, physicochemical properties and chemical characteristics of stream water, bed sediment, groundwater, and soil were determined in watersheds located outside of, but in proximity to, the Peason Ridge Training Area and Main Post at the Joint Readiness Training Center and Fort Polk boundaries to document background trace element concentrations. Water samples were analyzed for physicochemical properties, major inorganic ions, selected trace elements, and dissolved organic carbon. Selected trace elements included antimony, arsenic, cadmium, copper, iron, lead, manganese, mercury, and zinc. Stream bed-sediment and soil samples were analyzed for major inorganic ions, selected trace elements, and grain size distribution. Surface-water samples were collected near the downstream transect of each stream reach. Monitoring wells were located adjacent to the stream reach and in close proximity to the surface-water sampling sites. Bulk bed-sediment samples were collected during normal low-flow conditions. Each sample consisted of a composite sample from five locations (right edge, left edge, and center of a middle transect, then upstream and downstream of the middle transect) within each stream reach. Three soil samples, one from hilltops, one from side slopes, and one from riparian zones, were collected from areas adjacent to each stream reach. Each soil sample consisted of 5 to 10 grab samples collected by a 21-inch-long, 5/8-inch internal diameter stainless-steel hand auger and composited in Teflon lined pans. All samples were collected following USGS sampling protocols. This data release provides database and mapping information for assessment of trace element concentrations in stream water, bed sediment, groundwater, and soil found in relatively pristine and undisturbed watersheds in proximity to watersheds used for military training.

  15. Data Quality Assessment Areas (USACE IENC)

    • arc-gis-hub-home-arcgishub.hub.arcgis.com
    • hub.arcgis.com
    • +6more
    Updated Feb 21, 2018
    + more versions
    Cite
    GeoPlatform ArcGIS Online (2018). Data Quality Assessment Areas (USACE IENC) [Dataset]. https://arc-gis-hub-home-arcgishub.hub.arcgis.com/datasets/geoplatform::data-quality-assessment-areas-usace-ienc/explore
    Explore at:
    Dataset updated
    Feb 21, 2018
    Dataset authored and provided by
    GeoPlatform ArcGIS Online
    Area covered
    Description

    The USACE IENCs coverage area consists of 7,260 miles across 21 rivers primarily located in the Central United States. IENCs apply to inland waterways that are maintained for navigation by USACE for shallow-draft vessels (e.g., maintained at a depth of 9-14 feet, dependent upon the waterway project authorization). Generally, IENCs are produced for those commercially navigable waterways for which the National Oceanic and Atmospheric Administration (NOAA) does not produce Electronic Navigational Charts (ENCs). However, Special Purpose IENCs may be produced in agreement with NOAA. IENC POC: IENC_POC@usace.army.mil

  16. Data quality for PM10 concentrations in OECD urban areas

    • data.mfe.govt.nz
    Updated Oct 9, 2015
    + more versions
    Cite
    Ministry for the Environment (2015). Data quality for PM10 concentrations in OECD urban areas [Dataset]. https://data.mfe.govt.nz/document/11578-data-quality-for-pm10-concentrations-in-oecd-urban-areas/
    Explore at:
    Dataset updated
    Oct 9, 2015
    Dataset authored and provided by
    Ministry for the Environment
    License

    https://data.mfe.govt.nz/license/attribution-3-0-new-zealand/

    Description

    Geospatial data about Data quality for PM10 concentrations in OECD urban areas. Export to CAD, GIS, PDF, CSV and access via API.

  17. Long-form data quality indicators for expenditures: Canada, provinces and...

    • www150.statcan.gc.ca
    • datasets.ai
    • +2more
    Updated Feb 8, 2023
    Cite
    Government of Canada, Statistics Canada (2023). Long-form data quality indicators for expenditures: Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions [Dataset]. http://doi.org/10.25318/9810057301-eng
    Explore at:
    Dataset updated
    Feb 8, 2023
    Dataset provided by
    Statistics Canada (https://statcan.gc.ca/en)
    Area covered
    Canada
    Description

    Data on long-form data quality indicators for 2021 Census expenditures content, Canada, provinces and territories, census metropolitan areas, census agglomerations and census subdivisions.

  18. Minimum Data Elements

    • fisheries.noaa.gov
    Updated Mar 14, 2018
    Cite
    Gulf States Marine Fisheries Commission (2018). Minimum Data Elements [Dataset]. https://www.fisheries.noaa.gov/inport/item/2948
    Explore at:
    Dataset updated
    Mar 14, 2018
    Dataset provided by
    Gulf States Marine Fisheries Commission
    Description

    The description for this record is not currently available.

  19. TagX Web Browsing clickstream Data - 300K Users North America, EU - GDPR -...

    • datarade.ai
    .json, .csv, .xls
    Updated Sep 16, 2024
    Cite
    TagX (2024). TagX Web Browsing clickstream Data - 300K Users North America, EU - GDPR - CCPA Compliant [Dataset]. https://datarade.ai/data-products/tagx-web-browsing-clickstream-data-300k-users-north-america-tagx
    Explore at:
    Available download formats: .json, .csv, .xls
    Dataset updated
    Sep 16, 2024
    Dataset authored and provided by
    TagX
    Area covered
    Finland, United States of America, Luxembourg, Macedonia (the former Yugoslav Republic of), Japan, Ireland, Holy See, China, Andorra, Switzerland
    Description

    TagX Web Browsing Clickstream Data: Unveiling Digital Behavior Across North America and EU

    Unique Insights into Online User Behavior

    TagX Web Browsing clickstream Data offers an unparalleled window into the digital lives of 1 million users across North America and the European Union. This comprehensive dataset stands out in the market due to its breadth, depth, and stringent compliance with data protection regulations.

    What Makes Our Data Unique?

    • Extensive Geographic Coverage: Spanning two major markets, our data provides a holistic view of web browsing patterns in developed economies.
    • Large User Base: With 300K active users, our dataset offers statistically significant insights across various demographics and user segments.
    • GDPR and CCPA Compliance: We prioritize user privacy and data protection, ensuring that our data collection and processing methods adhere to the strictest regulatory standards.
    • Real-time Updates: Our clickstream data is continuously refreshed, providing up-to-the-minute insights into evolving online trends and user behaviors.
    • Granular Data Points: We capture a wide array of metrics, including time spent on websites, click patterns, search queries, and user journey flows.

    Data Sourcing: Ethical and Transparent

    Our web browsing clickstream data is sourced through a network of partnered websites and applications. Users explicitly opt in to data collection, ensuring transparency and consent. We employ advanced anonymization techniques to protect individual privacy while maintaining the integrity and value of the aggregated data. Key aspects of our data sourcing process include:

    • Voluntary user participation through clear opt-in mechanisms
    • Regular audits of data collection methods to ensure ongoing compliance
    • Collaboration with privacy experts to implement best practices in data anonymization
    • Continuous monitoring of regulatory landscapes to adapt our processes as needed

    Primary Use Cases and Verticals

    TagX Web Browsing clickstream Data serves a multitude of industries and use cases, including but not limited to:

    Digital Marketing and Advertising:

    • Audience segmentation and targeting
    • Campaign performance optimization
    • Competitor analysis and benchmarking

    E-commerce and Retail:

    • Customer journey mapping
    • Product recommendation enhancements
    • Cart abandonment analysis

    Media and Entertainment:

    • Content consumption trends
    • Audience engagement metrics
    • Cross-platform user behavior analysis

    Financial Services:

    • Risk assessment based on online behavior
    • Fraud detection through anomaly identification
    • Investment trend analysis

    Technology and Software:

    • User experience optimization
    • Feature adoption tracking
    • Competitive intelligence

    Market Research and Consulting:

    • Consumer behavior studies
    • Industry trend analysis
    • Digital transformation strategies

    Integration with Broader Data Offering

    TagX Web Browsing clickstream Data is a cornerstone of our comprehensive digital intelligence suite. It seamlessly integrates with our other data products to provide a 360-degree view of online user behavior:

    • Social Media Engagement Data: Combine clickstream insights with social media interactions for a holistic understanding of digital footprints.
    • Mobile App Usage Data: Cross-reference web browsing patterns with mobile app usage to map the complete digital journey.
    • Purchase Intent Signals: Enrich clickstream data with purchase intent indicators to power predictive analytics and targeted marketing efforts.
    • Demographic Overlays: Enhance web browsing data with demographic information for more precise audience segmentation and targeting.

    By leveraging these complementary datasets, businesses can unlock deeper insights and drive more impactful strategies across their digital initiatives.

    Data Quality and Scale

    We pride ourselves on delivering high-quality, reliable data at scale:

    • Rigorous Data Cleaning: Advanced algorithms filter out bot traffic, VPNs, and other non-human interactions.
    • Regular Quality Checks: Our data science team conducts ongoing audits to ensure data accuracy and consistency.
    • Scalable Infrastructure: Our robust data processing pipeline can handle billions of daily events, ensuring comprehensive coverage.
    • Historical Data Availability: Access up to 24 months of historical data for trend analysis and longitudinal studies.
    • Customizable Data Feeds: Tailor the data delivery to your specific needs, from raw clickstream events to aggregated insights.
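    A cleaning pass of the kind described above, filtering obviously non-human traffic, can be sketched as follows; the field names, bot markers, and rate threshold are illustrative assumptions, not the actual TagX pipeline:

```python
# Hypothetical clickstream cleaning: drop events from self-identified
# bots and from sessions clicking implausibly fast for a human.
BOT_UA_MARKERS = ("bot", "crawler", "spider", "headless")
MAX_EVENTS_PER_MINUTE = 120  # assumed ceiling for human browsing

def is_human(event: dict) -> bool:
    ua = event.get("user_agent", "").lower()
    if any(marker in ua for marker in BOT_UA_MARKERS):
        return False
    if event.get("events_per_minute", 0) > MAX_EVENTS_PER_MINUTE:
        return False
    return True

events = [
    {"user_agent": "Mozilla/5.0", "events_per_minute": 14},
    {"user_agent": "Googlebot/2.1", "events_per_minute": 300},
    {"user_agent": "Mozilla/5.0 HeadlessChrome", "events_per_minute": 20},
]
clean = [e for e in events if is_human(e)]
print(len(clean))  # 1 -> only the ordinary browser session survives
```

    Real pipelines layer many such signals (IP reputation, VPN lists, behavioral models); this sketch shows only the shape of a rule-based first pass.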

    Empowering Data-Driven Decision Making

    In today's digital-first world, understanding online user behavior is crucial for businesses across all sectors. TagX Web Browsing clickstream Data empowers organizations to make informed decisions, optimize their digital strategies, and stay ahead of the competition. Whether you're a marketer looking to refine your targeting, a product manager seeking to enhance user experience, or a researcher exploring digital trends, our cli...

  20. The association between facility characteristics and the frequency of...

    • plos.figshare.com
    xls
    Updated May 30, 2023
    Cite
    Veronica Muthee; Aaron F. Bochner; Allison Osterman; Nzisa Liku; Willis Akhwale; James Kwach; Mehta Prachi; Joyce Wamicwe; Jacob Odhiambo; Fredrick Onyango; Nancy Puttkammer (2023). The association between facility characteristics and the frequency of concordant data elements in paper records and KenyaEMR during baseline RDQAs. [Dataset]. http://doi.org/10.1371/journal.pone.0195362.t003
    Explore at:
    Available download formats: xls
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Veronica Muthee; Aaron F. Bochner; Allison Osterman; Nzisa Liku; Willis Akhwale; James Kwach; Mehta Prachi; Joyce Wamicwe; Jacob Odhiambo; Fredrick Onyango; Nancy Puttkammer
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The association between facility characteristics and the frequency of concordant data elements in paper records and KenyaEMR during baseline RDQAs.
