Methods Analytics’ response to the DCMS consultation on the future of the UK’s data protection regime
Methods Analytics’ response to “Data – A New Direction”
Methods Analytics’ response to the Department for Digital, Culture, Media and Sport’s consultation on the future of the UK’s data protection regime:
https://www.gov.uk/government/consultations/data-a-new-direction
November 2021
Contact: Martine Clark, Head of Government Sector
Email: [email protected]
Copyright 2021 Methods Analytics Limited
This document is the property of Methods Analytics Limited. It shall not be reproduced in whole or in part, nor disclosed to a third party, without prior written permission. No responsibility or liability is accepted for any errors or omissions in this document or related documents.
Freedom of Information Act
Methods Analytics Ltd wishes to comply with the requirements of the Freedom of Information Act 2000 (FOIA). We therefore provide this document on the basis that it is not released without informing Methods Analytics or Methods Advisory Ltd of its release and to whom. If the need arises to release this document, we would wish to identify the areas within it which are covered by Section 43 of the FOIA, which we consider to be trade secrets and which therefore may not be divulged to any third party under the Act.
Three years have passed since the UK Data Protection Act 2018. The sweeping changes brought in by the widespread introduction of the GDPR have had global implications for how organisations of all shapes and sizes handle the personal data entrusted to them. As well as building on previous policies, the explosion in the volume and processing of personal data has driven the need for this comprehensive legislation to protect privacy rights.
However, the world has changed since 2018. We’ve seen innovative and exponential technological growth and, sometimes in consequence, unpredictable levels of social change. We also understand better how the public expects organisations to use their data. This is not just the increasing expectation of complete transparency, but also the belief that organisations are making the most of available technology to share and process data in a way that benefits them. Following the UK’s exit from the EU, it is also time for us to take this opportunity to look at whether our regime remains in all cases fit for purpose. An effective data regime can support and promote innovation in data, while also making sure the rights of individuals are protected.
Significant changes to the UK’s data protection regime may unlock the greatest benefits in the long term. However, the cost of disruption to private, public and third sector organisations who have invested considerable time and money to ensure compliance with the current regime should not be underestimated. Reducing unnecessary regulatory burdens will be welcomed; a confusing regime which makes it difficult for UK organisations to work across global boundaries will not.
It is positive to see this consultation being framed around ambition, innovation and growth. While the focus on maximising the benefits from more innovative use of data is positive, we must always be mindful that a core responsibility of data policy is to protect the public. Public and private sector organisations alike must continue to ensure they handle personal data transparently and fairly (with more clarity as to how we determine “fairness” very welcome). The public increasingly trust organisations to handle enormous volumes of very sensitive data; an effective regime must clearly set out how organisations must behave to earn that trust, and provide the effective checks and balances to monitor their performance.
We also know that the EU will be watching this work closely. While the strength of the UK’s regime meant the positive adequacy decisions this year were likely, they were by no means a given. The impact on both the economy and security of the UK if we lose adequacy cannot be overstated. We look forward with interest to seeing how DCMS and UK ministers choose to balance the drive to maximise the benefits of an independent regime with the need to share data smoothly with our closest trade partners.
In this consultation response we have concentrated on Chapter 1 – Reducing barriers to responsible innovation and Chapter 2 - Reducing burdens on businesses and delivering better outcomes for people. As a specialist data consultancy, Methods Analytics is dedicated to helping our clients across central government, defence and the health sector manage and use their data to solve some of the most complex problems in the country. We passionately believe that good data governance unlocks innovation, and strongly support the government’s ambition to create a data regime which is adaptable and dynamic enough to grow and develop as data capability and technology continues to expand.
Q1.2.1 To what extent do you agree that consolidating and bringing together research-specific provisions will allow researchers to navigate the relevant law more easily?
We fully agree that reducing the uncertainty around what constitutes research will improve transparency for individuals, reduce perceived risk, and increase the value research institutes can drive from data.
Q1.2.2. To what extent do you agree that creating a statutory definition of 'scientific research' would result in greater certainty for researchers?
Q1.2.3. Is the definition of scientific research currently provided by Recital 159 of the UK GDPR (‘technological development and demonstration, fundamental research, applied research and privately funded research’) a suitable basis for a statutory definition?
We would welcome further clarification on this topic, including the interactions with EU institutions.
Q1.2.4. To what extent do you agree that identifying a lawful ground for personal data processing for research processes creates barriers for researchers?
The current six lawful bases for processing data, as outlined in Article 6 of the UK GDPR, cover a broad scope of personal data research. Defining boundaries will inevitably create barriers for some undertaking research. If there were examples of scenarios demonstrating this to be detrimental to research deemed of importance, then we believe due consideration could be given to lifting some of those perceived barriers.
Q1.2.5. To what extent do you agree that clarifying that university research projects can rely on tasks in the public interest (Article 6(1)(e) of the UK GDPR) as a lawful ground would support researchers to select the best lawful ground for processing personal data?
Q1.2.6. To what extent do you agree that creating a new, separate lawful ground for research (subject to suitable safeguards) would support researchers to select the best lawful ground for processing personal data?
As long as this new lawful ground for research was clear and comprehensible to all, this could work. Guidance that was not clear and distinct could further muddy the waters if roles and responsibilities are misunderstood by the parties involved in processing personal data.
Q1.2.7. What safeguards should be built into a legal ground for research?
N/A
Q1.2.8. To what extent do you agree that it would benefit researchers to clarify that data subjects should be allowed to give their consent to broader areas of scientific research when it is not possible to fully identify the purpose of personal data processing at the time of data collection?
While this would be welcome in principle, it is unclear whether this is compatible with reducing the broader over-reliance on consent.
Q1.2.9. To what extent do you agree that researchers would benefit from clarity that further processing for research purposes is both (i) compatible with the original purpose and (ii) lawful under Article 6(1) of the UK GDPR?
Q1.2.10. To what extent do you agree with the proposals to disapply the current requirement for controllers who collected personal data directly from the data subject to provide further information to the data subject prior to any further processing, but only where that further processing is for a research purpose and where it would require a disproportionate effort to do so?
We agree with the principle, but also the concern that this could be very open to misuse.
Q1.3.1. To what extent do you agree that the provisions in Article 6(4) of the UK GDPR on further processing can cause confusion when determining what is lawful, including on the application of the elements in the compatibility test?
Q1.3.2. To what extent do you agree that the government should seek to clarify in the legislative text itself that further processing may be lawful when it is a) compatible or b) incompatible but based on a law that safeguards an important public interest?
Q1.3.3. To what extent do you agree that the government should seek to clarify when further processing can be undertaken by a controller different from the original controller?
Clarity here is key, so that organisations can confidently interpret the often complex hand-offs between processors and controllers. Processor/controller guidance to date has not provided organisations with practical advice on interpretation: we welcome improvements which reduce the need for layers of legal expertise to apply the rules safely.
Q1.3.4 To what extent do you agree that the government should seek to clarify when further processing may occur, when the original lawful ground was consent?
Q1.4.1. To what extent do you agree with the proposal to create a limited, exhaustive list of legitimate interests for which organisations can use personal data without applying the balancing test?
We’re pleased to see recognition of how unclear the legitimate interest test can be, and welcome any clarity that can be provided on defining what legitimate interest is. Without this, there remains a risk that organisations try to work around the regulation and process the data without a strong enough consideration of risk. Even with additional clarity, it must be noted that it will still be necessary to apply the balancing test to some categories in order to gain public trust.
We also welcome the emphasis on reducing “consent fatigue” and the recognition of the known issue that, faced with a large volume of consent requests, people start to accept them despite not having the time or resources to assess them properly. However, if regulations need to withstand the test of time, will they need to be reviewed and updated constantly? How would that meet the aim of simplifying the regime while being clear on ambiguities?
Q1.4.2. To what extent do you agree with the suggested list of activities where the legitimate interests balancing test would not be required?
While we strongly agree with clarifying the list of activities, we have concerns about some of those suggested:
c. Monitoring, detecting or correcting bias in relation to developing AI systems. This could easily be misused for marketing or other purposes.
d. Using audience measurement cookies or similar technologies to improve web pages that are frequently visited by service users. How would this actually work in practice? What happens to less visited web pages? Would a slight change of address count as a new web page?
h. Using personal data for internal research and development purposes, or business innovation purposes aimed at improving services for customers. Again, this could easily be stretched to cover marketing and spam.
Q1.4.3. What, if any, additional safeguards do you think would need to be put in place?
A method to ensure that data regulation cannot be worked around to process data for marketing or other unnecessary purposes. Individuals should be informed if any internal research is performed on their data, and ideally given examples of what could be done with the data collected (e.g. recommending products for a certain problem/treatment).
Q1.4.4. To what extent do you agree that the legitimate interests balancing test should be maintained for children’s data, irrespective of whether the data is being processed for one of the listed activities?
We strongly support additional safeguards to ensure that risk assessments remain paramount when processing children’s data.
AI and Machine Learning – Fairness in an AI context
Q1.5.1. To what extent do you agree that the current legal obligations with regards to fairness are clear when developing or deploying an AI system?
Decisions about “fairness” are often made by the data scientists processing the data. They are responsible for applying AI solutions, ensuring they are fair and that they do not discriminate against groups or individuals. In many cases neither the data scientists nor the data owners will understand the full extent of the fairness obligations: we have seen the implications of this where bias has been inherent in the AI deployed by even the largest organisations (which should have the legal resources to interpret and apply the legislation effectively).
Q1.5.2. To what extent do you agree that the application of the concept of fairness within the data protection regime in relation to AI systems is currently unclear?
We strongly support the narrative that fairness is not the same as transparency. AI should not be democratised: while it is important to consider public expectations about the way their data is handled (as per the ICO guidance), these expectations alone are not enough to steer the application of algorithms.
Q1.5.3. What legislative regimes and associated regulators should play a role in substantive assessments of fairness, especially of outcomes, in the AI context?
Before deploying an AI system there should be evidence that tests are in place to make sure the data is not biased and the outcomes are fair. Statistical tests and analysis will be crucial both before and after applying AI.
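As an illustration of the kind of statistical check we have in mind, the following is a minimal sketch of a demographic-parity test, comparing positive-outcome rates across groups. The group labels, decisions and tolerance are hypothetical, not drawn from any real system:

```python
from collections import defaultdict

def demographic_parity_gap(outcomes):
    """Return the largest gap in positive-outcome rates between groups,
    plus the per-group rates.

    outcomes: list of (group_label, decision) pairs, where decision is
    True for a positive outcome (e.g. a loan approved).
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, decision in outcomes:
        totals[group] += 1
        if decision:
            positives[group] += 1
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical model decisions, labelled by a protected characteristic
decisions = [("A", True), ("A", True), ("A", False), ("A", True),
             ("B", True), ("B", False), ("B", False), ("B", False)]
gap, rates = demographic_parity_gap(decisions)
# A gap above a chosen tolerance (say 0.2) would flag the system for review
print(f"rates={rates}, gap={gap:.2f}")
```

This is only one of several competing fairness metrics; which test is appropriate depends on the context and the outcome being assessed, which is precisely why clear guidance is needed.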
Q1.5.4. To what extent do you agree that the development of a substantive concept of outcome fairness in the data protection regime - that is independent of or supplementary to the operation of other legislation regulating areas within the ambit of fairness - poses risks?
While we broadly support an outcome-based approach to data protection, organisations need sufficient clarity to be able to measure outcomes and understand whether they are acceptable. In an area as ambiguous as fairness, where outcomes will interact with multiple other legislative areas, it is difficult to see how outcome fairness could be applied in practice.
Building trustworthy AI systems
Q1.5.5. To what extent do you agree that the government should permit organisations to use personal data more freely, subject to appropriate safeguards, for the purpose of training and testing AI responsibly?
There are many ways that the data could be wrangled to limit the risk, such as pseudonymisation or even simply using synthetic data (with characteristics based on actual data). More extensive training and testing of AI is needed to understand, and limit, unintended consequences from this emerging field. More freedom to use data sets to demonstrate whether an AI system is performing fairly before it is deployed could significantly reduce harm and improve service outcomes. However, we would welcome further clarification on the “appropriate safeguards” referred to here.
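For example, one common wrangling step is keyed pseudonymisation: replacing direct identifiers with stable tokens before data reaches a training pipeline. A minimal sketch follows; the field names, record and secret key are illustrative only:

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-and-store-securely"  # illustrative; keep out of source control

def pseudonymise(record, id_fields=("name", "nhs_number")):
    """Replace direct identifiers with stable HMAC-based tokens.

    The same input value always maps to the same token, so records can
    still be linked across datasets, but the original identifier cannot
    be recovered without the key.
    """
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

# Hypothetical record: identifiers become tokens, other fields pass through
patient = {"name": "Jane Doe", "nhs_number": "4857773456", "age_band": "40-49"}
token_record = pseudonymise(patient)
```

Note that under the UK GDPR pseudonymised data is still personal data; techniques like this reduce risk but do not remove the need for the safeguards discussed above.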
Q1.5.6. When developing and deploying AI, do you experience issues with identifying an initial lawful ground?
Not significantly. We work predominantly with public sector clients, who tend to be well versed in how to identify an appropriate legal basis.
Q1.5.7 When developing and deploying AI, do you experience issues with navigating re-use limitations in the current framework?
No. We aim to design pipelines that can be applied to different problems, rather than ad hoc solutions for individual projects. While some aspects specific to the data structure and content will inevitably not be re-usable, most of the functions should be written independently of the input data.
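The design principle described above, keeping pipeline stages generic and independent of any one dataset’s structure, can be sketched as a simple composition of reusable steps. The step names and sample records are illustrative, not from a real project:

```python
from functools import reduce

def make_pipeline(*steps):
    """Compose dataset-agnostic steps into a single callable pipeline,
    applied left to right."""
    return lambda rows: reduce(lambda data, step: step(data), steps, rows)

# Generic, reusable steps: neither assumes a particular schema
def drop_missing(rows):
    return [r for r in rows if all(v is not None for v in r.values())]

def lowercase_keys(rows):
    return [{k.lower(): v for k, v in r.items()} for r in rows]

# Only this step is specific to one project's data
def keep_recent(rows, year=2020):
    return [r for r in rows if r.get("year", 0) >= year]

pipeline = make_pipeline(drop_missing, lowercase_keys, keep_recent)
cleaned = pipeline([{"Year": 2021, "Value": 10}, {"Year": 2019, "Value": None}])
# Only the complete, recent record survives
```

Because each step takes and returns plain rows, the generic steps can be reassembled for a new dataset with only the project-specific stages rewritten.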
Bias
Q1.5.10. To what extent do you agree with the proposal to make it explicit that the processing of personal data for the purpose of bias monitoring, detection and correction in relation to AI systems should be part of a limited, exhaustive list of legitimate interests that organisations can use personal data for without applying the balancing test?
We believe this should remain part of the balancing test, which is needed to make sure systems are fair. Some people will be more willing to provide data they would not normally share, but it will take time to gain public trust. There should be examples of how this has mitigated bias in the past, and of how it could contribute to the creation of synthetic data sets to speed up lengthy processes.
Q1.5.11. To what extent do you agree that further legal clarity is needed on how sensitive personal data can be lawfully processed for the purpose of ensuring bias monitoring, detection and correction in relation to AI systems?
The use of sensitive personal data is vital to both monitor, and prevent, algorithmic bias. Providing additional clarity to assist with, for example, the balancing test will help increase and improve its use.
Q1.5.12. To what extent do you agree with the proposal to create a new condition within Schedule 1 to the Data Protection Act 2018 to support the processing of sensitive personal data for the purpose of bias monitoring, detection and correction in relation to AI systems?
This option would provide greater clarity and ease of use than the legitimate interest changes. The challenge will be in collecting the data and making sure that the (perceived?) intrusive nature of some of the questions does not become a blocker and a continuous source of bias. If this option is pursued, greater consideration will need to be given to preventing data breaches. This risk appears to be treated lightly because the benefits are greater, but a breach would have a massive impact on public trust.
Automated decision-making and data rights
Q1.5.14 To what extent do you agree with what the government is considering in relation to clarifying the limits and scope of what constitutes ‘a decision based solely on automated processing’ and ‘producing legal effects concerning [a person] or similarly significant effects’?
Currently both the legislation and the associated guidance are particularly unclear. Safely increasing the range of decisions which can be made solely on the basis of automated processing could significantly improve service provision and drive a step change in the efficiency of many public services. However, safeguards here are vital: for example, clear guidance around the necessity for a consistently applied data ethics framework in addition to a DPIA. Public transparency about where, and how, decisions are made will also be essential.
Q1.5.15. Are there any alternatives you would consider to address the problem?
Q1.5.16. To what extent do you agree with the following statement: 'In the expectation of more widespread adoption of automated decision-making, Article 22 is (i) sufficiently future-proofed, so as to be practical and proportionate, whilst (ii) retaining meaningful safeguards'?
Article 22 doesn’t provide either the clarity or safeguards required to enable the safer adoption of automated decision-making. It is both too restrictive, and too vague.
Q1.5.17. To what extent do you agree with the Taskforce on Innovation, Growth and Regulatory Reform’s recommendation that Article 22 of UK GDPR should be removed and solely automated decision making permitted where it meets a lawful ground in Article 6(1) (and Article 9-10 (as supplemented by Schedule 1 to the Data Protection Act 2018) where relevant) and subject to compliance with the rest of the data protection legislation?
This move would be extremely risky, and lead to significant public mistrust. Clear guidance which can be practically applied to automated decision-making is necessary: simply relying on legislation which is not designed for this purpose would reduce public trust, and make applying sensible safeguards more difficult. Automated decision-making should be handled by both legislation, and associated guidance and case studies.
We were surprised to see how much the emphasis in this section was on reducing burdens on businesses, without the corresponding safeguards or drive to actually deliver better public outcomes. While reducing unnecessary bureaucracy must obviously be welcomed, the reforms in this section add up to a real dilution of business accountability for data risks. The removal of some of the best practice aspects of data management, including accountability for data protection and the need for evidenced risk assessments, could lead to unchecked, irresponsible practices.
2.2 Reform the accountability framework
Privacy management programmes
Q2.2.1. To what extent do you agree with the following statement: ‘The accountability framework as set out in current legislation should i) feature fewer prescriptive requirements, ii) be more flexible, and iii) be more risk-based’?
The current legislation provides clear, consistent standards for data across organisations, and principles against which businesses can measure their compliance. The lack of clarity over what ‘flexible’ and ‘risk-based’ would mean in practice is concerning given the sensitivity of the data involved. That said, a lack of understanding of the current legislation is often a barrier to innovative insights.
Q2.2.2. To what extent do you agree with the following statement: ‘Organisations will benefit from being required to develop and implement a risk-based privacy management programme’?
Whilst hypothetically this is a good idea, the lack of clarity over what constitutes a risk-based framework and how it would be enforced in legislation, coupled with the often limited data literacy across organisations, makes the practicalities of implementation doubtful. Additionally, if the cutting of red tape implied by this reform comes at the cost of an increased risk of exploitative or invasive practices with regard to client or citizen data, then it is not something to be endorsed.
Q2.2.3. To what extent do you agree with the following statement: ‘Individuals (i.e. data subjects) will benefit from organisations being required to implement a risk-based privacy management programme’?
As per Q2.2.2., the benefits of additional insight need to be weighed against privacy concerns.
Data protection officer requirements
Q2.2.4. To what extent do you agree with the following statement: ‘Under the current legislation, organisations are able to appoint a suitably independent data protection officer’?
The appointment of a DPO is often arbitrary in small companies, and the role often lacks empowerment in larger ones. This reflects a failure to appreciate the importance of the position, rather than a failing of the legislation itself.
Q2.2.5. To what extent do you agree with the proposal to remove the existing requirement to designate a data protection officer?
Doing away with the need for a DPO would expose companies to numerous data risks, and remove an appropriate escalation point for employees to raise privacy concerns.
Q.2.2.6. Please share your views on whether organisations are likely to maintain a similar data protection officer role, if not mandated.
It is likely that larger organisations will maintain a similar data protection officer role for risk mitigation, and small organisations will, rightfully, take this as an opportunity to remove the role. The real risk lies in the middle: SMEs, which often process large amounts of personal data, will be unlikely to bear the expense of properly training and resourcing a skilled data protection officer without a legislative requirement.
Data protection impact assessments
Q2.2.7. To what extent do you agree with the following statement: ‘Under the current legislation, data protection impact assessment requirements are helpful in the identification and minimisation of data protection risks to a project’?
DPIAs reflect best practice in data management and represent necessary due diligence.
Q.2.2.8. To what extent do you agree with the proposal to remove the requirement for organisations to undertake data protection impact assessments?
Removing this requirement without proposing a suitable alternative would permit companies to carry out unnecessary and invasive data practices without the necessary due diligence. As an irresponsible reform which could have a negative impact on citizens, this cannot be endorsed.
Prior consultation requirements
Q. 2.2.9 Please share your views on why few organisations approach the ICO for ‘prior consultation’ under Article 36 (1)-(3). As a reminder Article 36 (1)-(3) requires that, where an organisation has identified a high risk that cannot be mitigated, it must consult the ICO before starting the processing.
Unknown
Please explain your answer, and provide supporting evidence where possible.
Q.2.2.10. To what extent do you agree with the following statement: ‘Organisations are likely to approach the ICO before commencing high risk processing activities on a voluntary basis if this is taken into account as a mitigating factor during any future investigation or enforcement action’?
Unknown
Please explain your answer, and provide supporting evidence where possible, and in particular: what else could incentivise organisations to approach the ICO for advice regarding high risk processing?
Unknown
Record keeping
Q.2.2.11. To what extent do you agree with the proposal to reduce the burden on organisations by removing the record keeping requirements under Article 30?
Unknown
Breach reporting requirements
Q.2.2.12. To what extent do you agree with the proposal to reduce burdens on organisations by adjusting the threshold for notifying personal data breaches to the ICO under Article 33?
Unknown
Voluntary undertakings process
Q.2.2.13. To what extent do you agree with the proposal to introduce a voluntary undertakings process? As a reminder, in the event of an infringement, the proposed voluntary undertakings process would allow accountable organisations to provide the ICO with a remedial action plan and, provided that the plan meets certain criteria, the ICO could authorise the plan without taking any further action.
The further empowerment of the ICO, and the collaboration opportunities this would afford, are welcome. Clarity is required around the standards for remediation plans and the safeguards against fraudulent self-auditing.
Further questions
Q.2.2.14. Please share your views on whether any other areas of the existing regime should be amended or repealed in order to support organisations implementing privacy management requirements.
Unknown
Q.2.2.15. What, if any, safeguards should be put in place to mitigate any possible risks to data protection standards as a result of implementing a more flexible and risk-based approach to accountability through a privacy management programme?
The concepts of standards and flexibility are somewhat at odds with each other; the government’s position on this requires clarification.
Record-keeping - 2
Q2.2.16. To what extent do you agree that some elements of Article 30 are duplicative (for example, with Articles 13 and 14) or are disproportionately burdensome for organisations without clear benefits?
Unknown
Breach reporting requirements - 2
Q.2.2.17. To what extent do you agree that the proposal to amend the breach reporting requirement could be implemented without the implementation of the privacy management programme?
Unknown
Data protection officers
Q.2.2.18. To what extent do you agree with the proposal to remove the requirement for all public authorities to appoint a data protection officer?
As per Q2.2.5, we strongly disagree with this.
Q.2.2.19. If you agree, please provide your view which of the two options presented at paragraph 184d(V) would best tackle the problem.
Please provide supporting evidence where possible, and in particular:
• What risks and benefits you envisage
• What should be the criteria for determining which authorities should be required to appoint a data protection officer
N/A
Further questions
Q2.2.20 If the privacy management programme requirement is not introduced, what other aspects of the current legislation would benefit from amendments, alongside the proposed reforms to record keeping, breach reporting requirements and data protection officers?
Unknown
2.3 Subject Access Requests
Q2.3.1. Please share your views on the extent to which organisations find subject access requests time-consuming or costly to process.
Please provide supporting evidence where possible.
While subject access requests are an important part of ensuring individuals can hold organisations to account and understand the data being held about them, they impose a significant burden, particularly in the public sector. Reform to clarify, and potentially reduce, the requirement on organisations would be welcome.
Q2.3.2. To what extent do you agree with the following statement: ‘The ‘manifestly unfounded’ threshold to refuse a subject access request is too high’?
Q2.3.3. To what extent do you agree that introducing a cost limit and amending the threshold for response, akin to the Freedom of Information regime (detailed in the section on subject access requests), would help to alleviate potential costs (time and resource) in responding to these requests?
Q2.3.4. To what extent do you agree with the following statement: ‘There is a case for re-introducing a small nominal fee for processing subject access requests (akin to the approach in the Data Protection Act 1998)’?
This would represent a rolling back of citizens’ rights, with dangerous implications for citizens’ autonomy over their data, and would discriminate against low-income citizens.
Q2.3.5. Are there any alternative options you would consider to reduce the costs and time taken to respond to subject access requests?
2.4 Privacy and electronic communications
Q2.4.1. What types of data collection or other processing activities by cookies and other similar technologies should fall under the definition of 'analytics'?
Unknown
Q2.4.2 To what extent do you agree with the proposal to remove the consent requirement for analytics cookies and other similar technologies covered by Regulation 6 of PECR?
Any process of collecting personal data should require consent from citizens. Any effort to infringe upon the autonomy or privacy of citizens cannot be endorsed; rather, we would hope the government holds companies with invasive data practices to account over their infringements.
Q2.4.3. To what extent do you agree with what the government is considering in relation to removing consent requirements in a wider range of circumstances? Such circumstances might include, for example, those in which the controller can demonstrate a legitimate interest for processing the data, such as for the purposes of detecting technical faults or enabling use of video or other enhanced functionality on websites.
As per Q2.4.2.
Q2.4.4. To what extent do you agree that the requirement for prior consent should be removed for all types of cookies?
As per Q2.4.2.
Q2.4.5. Could sectoral codes (see Article 40 of the UK GDPR) or regulatory guidance be helpful in setting out the circumstances in which information can be accessed on, or saved to a user’s terminal equipment?
Unknown
Q2.4.6. What are the benefits and risks of requiring websites or services to respect preferences with respect to consent set by individuals through their browser, software applications, or device settings?
The key benefit is protecting the autonomy and privacy rights of citizens. Methods Analytics believes this requires robust enforcement.
Q2.4.7. How could technological solutions, such as browser technology, help to reduce the volume of cookie banners in the future?
Unknown
Q2.4.8. What, if any, other measures would help solve the issues outlined in this section?
Unknown
Q2.4.9. To what extent do you agree that the soft opt-in should be extended to non-commercial organisations? See paragraph 208 for description of the soft opt-in.
Unknown
Q2.4.10. What are the benefits and risks of updating the ICO’s enforcement powers so that they can take action against organisations for the number of unsolicited direct marketing calls ‘sent’?
Unknown
Q2.4.11. What are the benefits and risks of introducing a ‘duty to report’ on communication service providers?
This duty would require communication service providers to inform the ICO when they have identified suspicious traffic transiting their networks. Currently, the ICO has to rely on receiving complaints from users before it can request relevant information from communication service providers.
Please provide information on potential cost implications for the telecoms sector of any new reporting requirements.
Unknown
Q2.4.12. What, if any, other measures would help to reduce the number of unsolicited direct marketing calls and text messages and fraudulent calls and text messages?
Unknown
Q2.4.13. Do you see a case for legislative measures to combat nuisance calls and text messages?
Unknown
Q2.4.14. What are the benefits and risks of mandating communications providers to do more to block calls and text messages at source?
Unknown
Q2.4.15 What are the benefits and risks of providing free of charge services that block, where technically feasible, incoming calls from numbers not on an ‘allow list’? An ‘allow list’ is a list of approved numbers that a phone will only accept incoming calls from.
Unknown
Q2.4.16. To what extent do you agree with increasing fines that can be imposed under PECR so they are the same level as fines imposed under the UK GDPR (i.e. increasing the monetary penalty maximum from £500,000 to up to £17.5 million or 4% global turnover, whichever is higher)?
The current fines are insufficient to dissuade larger businesses from invasive and unethical data practices, and are often treated simply as a cost of doing business.
Q2.4.17. To what extent do you agree with allowing the ICO to impose assessment notices on organisations suspected of infringements of PECR to allow them to carry out audits of the organisation’s processing activities?
The ICO requires further empowerment to audit companies; however, further clarity is required on the nature of these powers.
Q2.4.18. Are there any other measures that would help to ensure that PECR's enforcement regime is effective, proportionate and dissuasive?
Fines over and above those enforced by the EU.
2.5 Use of personal data for the purposes of democratic engagement
Q2.5.1. To what extent do you think that communications sent for political campaigning purposes by registered parties should be covered by PECR’s rules on direct marketing, given the importance of democratic engagement to a healthy democracy?
Given the increase in lobbying and the broad dissemination of misinformation, we cannot endorse giving an additional avenue of citizen outreach to parties whose motivations are often unethical. Any perception that the UK allows political campaigning to be disguised as 'democratic engagement' is potentially damaging.
Q2.5.2. If you think political campaigning purposes should be covered by direct marketing rules, to what extent do you agree with the proposal to extend the soft opt-in to communications from political parties?
As per previous answers, citizens should be empowered and autonomous, with consent required for all aspects of data management.
Q2.5.3. To what extent do you agree that the soft opt-in should be extended to other political entities, such as candidates and third-party campaign groups registered with the Electoral Commission? See paragraph 208 for description of the soft opt-in
As per Q2.5.2.
Q2.5.4. To what extent do you think the lawful grounds under Article 6 of the UK GDPR impede the use of personal data for the purposes of democratic engagement?
Unknown
Q2.5.5 To what extent do you think the provisions in paragraphs 22 and 23 of Schedule 1 to the DPA 2018 impede the use of sensitive data by political parties or elected representatives where necessary for the purposes of democratic engagement?
Unknown
2.6 Further Questions
Q2.6.1. In your view, which, if any, of the proposals in ‘Reducing burdens on business and delivering better outcomes for people’, would impact on people who identify with the protected characteristics under the Equality Act 2010 (i.e. age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex and sexual orientation)?
Unknown
Q2.6.2. In addition to any of the reforms already proposed in ‘Reducing burdens on business and delivering better outcomes for people’, (or elsewhere in the consultation), what reforms do you think would be helpful to reduce burdens on businesses and deliver better outcomes for people?
Unknown