A survey conducted in April and May 2023 revealed that around ** percent of the companies that do business in the European Union (EU) and the United Kingdom (UK) found it challenging to adapt to new or changing requirements of the General Data Protection Regulation (GDPR) or Data Protection Act 2018 (DPA). A further ** percent of the survey respondents said it was challenging to increase the budget because of the changes in the data privacy laws.
As of February 2025, the largest fine issued for violation of the General Data Protection Regulation (GDPR) in the United Kingdom (UK) was more than 22 million euros, received by British Airways in October 2020. Another fine received by Marriott International Inc. in the same month was the second-highest in the UK and amounted to over 20 million euros.
The Government has surveyed UK businesses and charities to find out how they approach cyber security and to help them learn more about the cyber security issues faced by industry. The research informs Government policy on cyber security and how Government works with industry to make Britain one of the most secure places to do business online.
This release is being published in advance of the full report of the 2018 Cyber Security Breaches Survey, to provide insight into how aware and prepared businesses and charities are for the General Data Protection Regulation (GDPR), the foundation of the new Data Protection Act, which is due to be introduced in May 2018.
24 January 2018
The findings are taken from survey telephone interviews, which took place between October and December 2017.
UK
The survey is part of the Government’s National Cyber Security Programme.
Cyber security guidance and information for businesses, including details of free training and support, can be found on the National Cyber Security Centre website and GOV.UK at: www.ncsc.gov.uk/guidance and www.gov.uk.
The survey was carried out by Ipsos MORI and its partner, the Institute of Criminal Justice Studies (ICJS) at the University of Portsmouth.
This release is published in accordance with the Code of Practice for Official Statistics (2009), as produced by the UK Statistics Authority. The UKSA has the overall objective of promoting and safeguarding the production and publication of official statistics that serve the public good. It monitors and reports on all official statistics, and promotes good practice in this area.
The document above contains a list of ministers and officials who have received privileged early access to this release. In line with best practice, the list has been kept to a minimum and those given access for briefing purposes had a maximum of 24 hours.
The responsible statistician for this release is Rishi Vaidya. For any queries please contact 020 7211 2320 or evidence@culture.gov.uk.
This statistic shows the results of a survey on the share of respondents that agreed with selected statements on the EU General Data Protection Regulation (GDPR) legislation in the United Kingdom (UK) in 2017. During the survey, ** percent of IT decision makers either strongly agreed or agreed that they faced some serious challenges in being compliant with the EU GDPR by ************.
A survey conducted in April and May 2023 among companies that do business in the European Union and the United Kingdom (UK) found that over half of the respondents, ** percent, felt very prepared for the General Data Protection Regulation (GDPR). A further ** percent of the companies believed they were moderately prepared, while ** percent said they were only slightly prepared to comply with the EU and UK privacy legislation.
This dataset is a central catalogue of Data Protection Impact Assessments (DPIAs) of smart city projects that collect personal information in public spaces. Publishing these in one place for the first time enables public transparency and supports good practice among operators. A DPIA helps to identify and minimise the risks of a project that uses personal data. Further information: DPIA registration form: https://www.london.gov.uk/dpia-register-form Information Commissioner DPIA guidance: https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-impact-assessments/
At the Medicines and Healthcare products Regulatory Agency (the Agency) we are committed to protecting and respecting your privacy.
This privacy notice describes how we collect and use your personal information, in accordance with the Data Protection Act 2018 and the UK General Data Protection Regulation (GDPR) 2016/679.
This Privacy Notice applies to anyone (except staff) whose personal data we might process, for example, members of the public, manufacturers, wholesalers, and other authorities.
If you work for the Agency, please refer to our intranet for details of how we process your personal data – ex-members of staff should contact: dataprotection@mhra.gov.uk.
If you have queries about how the Agency protects and uses your personal data, please contact dataprotection@mhra.gov.uk in the first instance. You may also contact the DHSC Data Protection Officer at data_protection@dhsc.gov.uk.
Alternatively, you can contact us in writing:
Data Protection Officer
MHRA
10 South Colonnade
London
E14 4PU
Or
Data Protection Officer
DHSC
1st Floor North
39 Victoria Street
London
SW1H 0EU
This statistic shows the results of a survey on how aware consumers were of their rights regarding data protection under the upcoming GDPR legislation in the United Kingdom (UK) as of December 2017. The survey, which looked into consumer attitudes towards sharing their personal data with businesses, found that ** percent of respondents stated that they had never heard of the new data protection regulations that would apply from ********** onwards.
A Data Protection Impact Assessment (DPIA) is one of the ways to find out what privacy risks people face when information about them is collected, used, stored, or shared. This helps the London Borough of Barnet identify issues so that risks can be removed or lowered to an acceptable level. It also cuts down on privacy breaches and complaints that could hurt the Council's reputation or lead to action by the Information Commissioner (the government watchdog). The London Borough of Barnet makes DPIAs public in line with its Data Charter, the Data Protection Act 2018 and the UK GDPR.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Advances in digital technology have led to large amounts of personal data being recorded and retained by industry, constituting an invaluable asset to private organizations. The implementation of the General Data Protection Regulation in the EU, including the UK, fundamentally reshaped how data is handled across every sector. It enables the general public to access data collected about them by organisations, opening up the possibility of this data being used for research that benefits the public themselves; for example, to uncover lifestyle causes of poor health outcomes. A significant barrier for using this commercial data for academic research, however, is the lack of publicly acceptable research frameworks. Data donation—the act of an individual actively consenting to donate their personal data for research—could enable the use of commercial data for the benefit of society. However, it is not clear which motives, if any, would drive people to donate their personal data for this purpose. In this paper we present the results of a large-scale survey (N = 1,300) that studied intentions and reasons to donate personal data. We found that over half of individuals are willing to donate their personal data for research that could benefit the wider general public. We identified three distinct reasons to donate personal data: an opportunity to achieve self-benefit, social duty, and the need to understand the purpose of data donation. We developed a questionnaire to measure those three reasons and provided further evidence on the validity of the scales. Our results demonstrate that these reasons predict people’s intentions to donate personal data over and above generic altruistic motives. We show that a social duty is the strongest predictor of the intention to donate personal data, while understanding the purpose of data donation also positively predicts the intentions to donate personal data. 
In contrast, self-serving motives show a negative association with intentions to donate personal data. The findings presented here examine people’s reasons for data donation to help inform the ethical use of commercially collected personal data for academic research for public good.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Background: The COVID-19 pandemic brought global disruption to health, society and economy, including to the conduct of clinical research. In the European Union (EU), the legal and ethical framework for research is complex and divergent. Many challenges exist in relation to the interplay of the various applicable rules, particularly with respect to compliance with the General Data Protection Regulation (GDPR). This study aimed to gain insights into the experience of key clinical research stakeholders [investigators, ethics committees (ECs), and data protection officers (DPOs)/legal experts working with clinical research sponsors] across the EU and the UK on the main challenges related to data protection in clinical research before and during the pandemic.
Materials and methods: The study consisted of an online survey and follow-up semi-structured interviews. Data collection occurred between April and December 2021. Survey data was analyzed descriptively, and the interviews underwent a framework analysis.
Results and conclusion: In total, 191 respondents filled in the survey, of whom fourteen participated in the follow-up interviews. Out of the targeted 28 countries (EU and UK), 25 were represented in the survey. The majority of stakeholders were based in Western Europe. This study empirically elucidated numerous key legal and ethical issues related to GDPR compliance in the context of (cross-border) clinical research. It showed that the lack of legal harmonization remains the biggest challenge in the field, and that it is present not only at the level of the interplay of key EU legislative acts and national implementation of the GDPR, but also when it comes to interpretation at local, regional and institutional levels. Moreover, the role of ECs in data protection was further explored and possible ways forward for its normative delineation were discussed. According to the participants, the pandemic did not bring additional legal challenges.
Although practical challenges (for instance, mainly related to the provision of information to patients) were high due to the globally enacted crisis measures, the key problematic issues on (cross-border) health research, interpretations of the legal texts and compliance strategies remained largely the same.
This policy explains your rights as an individual when using services provided by His Majesty’s Passport Office (HMPO). It reflects your rights under data protection legislation including the General Data Protection Regulation and lets you know how HMPO looks after and uses your personal information and how you can request a copy of your information.
Since the enforcement of the General Data Protection Regulation (GDPR) in May 2018, fines have been issued for several types of violations. As of February 2025, the most significant share of penalties was due to companies' non-compliance with general data processing principles. This violation has led to over 2.4 billion euros worth of fines.
The Secretary of State for Health and Social Care, acting through the executive agency of the Department of Health and Social Care, Public Health England, has commissioned the provision of various services to support members of the public during the coronavirus (COVID-19) pandemic.
These services are part of the Pandemic and Health Emergency Response Services (PHERS) which supplements the response provided by primary care during pandemics and other health-related emergencies.
These documents explain how personal data is used, in line with the UK General Data Protection Regulation (GDPR) and the Data Protection Act 2018. They include information on the purpose and categories of data processed, and your rights if information about you is included.
Between 2018 and 2022, there was a significant increase in the level of awareness of the General Data Protection Regulation (GDPR) among European users. In 2018, when the GDPR was first applied, the United Kingdom had the highest level of awareness, with 32 percent of respondents agreeing or strongly agreeing with the statement: "I am aware of the new General Data Protection Regulation (GDPR) that will be introduced in May 2018". In 2022, the share of UK respondents agreeing with the statement increased to 73 percent. France had the lowest level of awareness in 2018, at 20 percent; by 2022 it had reached 47 percent but remained the lowest among the European markets surveyed.
List of properties licensed under the Council’s HMO Licensing Scheme. This list is updated on a monthly basis. Under section 232 of the Housing Act 2004 the London Borough of Barnet is required to maintain and make available a public register of licensed Houses in Multiple Occupation. An extract of the register is published on the Council’s website here. The dataset is not intended for marketing purposes and none of the individuals or organisations mentioned within this register have given their consent for such use. Companies wishing to use this data for commercial purposes, and marketing in particular, are advised to consider whether their use of this data complies with the UK General Data Protection Regulation (GDPR), the Data Protection Act 2018, and the Privacy and Electronic Communications Regulations 2003. Information rights are upheld by the Information Commissioner's Office; for further information, see www.ico.org.uk. More information on Houses in Multiple Occupation can be found on our website as well as on the council's Planning portal. You will need to select "Houses in Multiple Occupation" from the drop-down menu and click "Search".
https://dataintelo.com/privacy-and-policy
As per our latest research, the global Age-Appropriate Design Compliance AI market size reached USD 1.42 billion in 2024, reflecting a robust momentum driven by regulatory mandates and the increasing digital engagement of children. The market is projected to expand at a CAGR of 23.7% from 2025 to 2033, resulting in a forecasted market value of USD 11.65 billion by 2033. The primary growth factor is the surge in global legislation such as the UK’s Age Appropriate Design Code and California’s Age-Appropriate Design Code Act, which are compelling digital service providers to adopt AI-driven compliance solutions to protect children’s data and online experiences.
A significant growth driver for the Age-Appropriate Design Compliance AI market is the rapid proliferation of digital platforms frequented by children and adolescents. As social media, gaming, educational technology, and content streaming services increasingly cater to younger audiences, the need for AI-powered compliance tools has become essential. These tools enable organizations to automatically detect, categorize, and restrict access to age-inappropriate content, ensuring adherence to evolving privacy and safety standards. The growing sophistication of AI algorithms has made it possible to analyze user behavior, verify ages, and personalize content delivery in a compliant manner, which further propels the adoption of such solutions across industries.
Another key factor accelerating market growth is the heightened awareness among parents, educators, and policymakers regarding online child safety. The rising incidents of cyberbullying, exposure to inappropriate content, and data breaches involving minors have led to increased scrutiny of digital platforms. This has prompted organizations to invest in advanced compliance AI solutions that not only fulfill legal obligations but also build trust with users and stakeholders. The integration of AI with privacy-enhancing technologies, such as differential privacy and federated learning, has enabled platforms to offer robust protection without compromising user experience, further boosting market expansion.
In addition to regulatory and societal pressures, technological advancements in natural language processing, computer vision, and behavioral analytics are transforming the landscape of age-appropriate design compliance. AI-powered systems are now capable of contextually understanding multimedia content, detecting subtle cues that indicate age-inappropriate material, and dynamically adapting interfaces to suit different age groups. This technological progress is reducing the manual burden on compliance teams and enabling scalable implementation across global operations. As a result, organizations are increasingly viewing compliance AI not just as a regulatory necessity, but as a strategic differentiator in a competitive digital market.
From a regional perspective, North America currently leads the Age-Appropriate Design Compliance AI market, accounting for nearly 39% of the global revenue in 2024. The region’s dominance is attributed to early adoption of privacy regulations, high digital penetration, and a strong presence of technology companies. Europe follows closely, buoyed by the implementation of the General Data Protection Regulation (GDPR) and the UK’s pioneering Age Appropriate Design Code. Meanwhile, the Asia Pacific region is witnessing the fastest growth, with a projected CAGR of 27.1% through 2033, as countries such as China, Japan, and India enhance their regulatory frameworks and digital infrastructure. The regional outlook underscores the global imperative for age-appropriate design compliance and the pivotal role of AI in achieving it.
The Component segment of the Age-Appropriate Design Compliance AI market is bifurcated into Software and Services, both playing distinct yet complementary roles in the ecosystem. Software solutions form the backbone of compliance operations, offering advanced functionalities such as age verification, content moderation, behavioral analytics, and policy enforcement. These AI-driven platforms leverage cutting-edge technologies, including machine learning, natural language processing, and computer vision, to automate the identification and management of age-sensitive content. The demand for highly
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Reasons for data donation subscale correlations, and means, standard deviations and Cronbach’s Alpha for subscales of Reasons for Data Donation.
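Cronbach's alpha, the reliability statistic reported for each subscale, is computed from the item-level responses. A minimal sketch in Python follows; the function name and the toy response matrix are illustrative, not drawn from the dataset:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Three perfectly consistent items yield alpha = 1.0
responses = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [5, 5, 5]])
print(round(cronbach_alpha(responses), 2))  # → 1.0
```

In practice alpha values below 1.0 are expected; the perfectly consistent toy matrix just makes the upper bound easy to verify by hand.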
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Correlation of each of the reasons for data donation scales, partialling out the other two scales, with the Prosocial Tendencies Measure, the Self-Report Altruism Scale and the Interpersonal Reactivity Index.
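A partial correlation of this kind (correlating one scale with an external measure while holding the other scales constant) can be computed by correlating regression residuals. A hedged sketch, with variable names and toy data invented for illustration:

```python
import numpy as np

def partial_corr(x, y, controls):
    """Pearson r between x and y, partialling out the columns of `controls`."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    Z = np.column_stack([np.ones(len(x)), controls])      # add an intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]     # residuals of x given Z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]     # residuals of y given Z
    return np.corrcoef(rx, ry)[0, 1]

# Toy data: x and y both track a confound z, but their deviations from z oppose.
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x = z + np.array([0.5, -0.5, 0.5, -0.5, 0.5])
y = z - np.array([0.5, -0.5, 0.5, -0.5, 0.5])
raw_r = np.corrcoef(x, y)[0, 1]                    # ≈ 0.79, driven by z
partial_r = partial_corr(x, y, z.reshape(-1, 1))   # -1.0 once z is partialled out
```

The sign flip in the toy example shows why the paper partials out the other two scales: a raw correlation can be carried entirely by a shared component.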
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
For step comparison, missing observations were deleted listwise, resulting in N = 1024.
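Listwise deletion means dropping every respondent with a missing value on any variable in the model. A minimal pandas sketch; the column names (loosely echoing the three reasons scales) and values are made up:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "self_benefit": [4.0, np.nan, 3.0, 5.0],
    "social_duty":  [5.0, 4.0, np.nan, 4.0],
    "purpose":      [3.0, 2.0, 4.0, 5.0],
})
complete = df.dropna()   # listwise deletion: keep only fully observed rows
print(len(complete))     # → 2 (rows 1 and 2 each have one missing value)
```

Applied to the full survey, the same operation reduces the sample from 1,300 to the N = 1024 reported above.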