This post expands on the points made in the ICOP briefing prepared for the Committee Stage of the Data (Use and Access) Bill in the House of Lords, starting on 3rd December 2024. The briefing highlights three areas where data protection standards are (or may become) lower in the UK than in the EU, thus posing a risk to the free flow of data from the EU to the UK. The free flow of data depends on a finding of data adequacy by the European Commission. Data adequacy is conferred where the standard of protection of personal data is “essentially equivalent” to that in the EU. The three areas of risk are:

  1. The overall standard of the right to the protection of personal data, following the deletion of EU fundamental rights from the UK statute book;
  2. Protections from automated decision-making, taking into account the changes the Bill makes in this area;
  3. Data subject rights.

These issues are explored below.

1. Overall standard of protection of personal data in the UK post-Brexit

Immediately post-Brexit

When the UK left the EU, the rights and freedoms conferred under EU law were preserved through the European Union (Withdrawal) Act 2018 (“EUWA”).[1] Although EUWA did not preserve the EU’s Charter of Fundamental Rights in domestic law, the government’s position was that this made no difference to the standard of protection of personal data in the UK because the rights which the Charter lists were preserved.[2] The general principles of EU law (which encompass fundamental rights, including the fundamental right to the protection of personal data[3]) were preserved as an aid to interpretation of the UK’s data protection frameworks.[4] Further, the CJEU’s case law on data protection remained binding on most UK courts.[5] These aspects of the UK’s post-Brexit legislation are referenced in the European Commission’s adequacy decision for the UK.[6]

Effect of the Retained EU Law (Revocation and Reform) Act 2023

The Retained EU Law (Revocation and Reform) Act 2023 (“REULA”) deleted the retained EU general principles (including fundamental rights) from the UK statute book.[7] This was particularly problematic in the area of data protection because fundamental rights (and in particular the fundamental right to the protection of personal data) are the underpinning foundations of the UK GDPR. Indeed, the UK GDPR contains 61 references to the data subject’s rights and freedoms which would previously have been interpreted as references to the fundamental right to the protection of personal data in the EU legal order, as saved into UK domestic law.[8] In order to preserve the coherence of the UK GDPR, the previous government introduced the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023[9] (“the Regulations”). The Regulations ensured that references to fundamental rights and freedoms in the UK GDPR would be read as referring to “Convention Rights within the meaning of the Human Rights Act 1998”.[10]

It is not clear that the ECHR (in particular Article 8, which encompasses the right to the protection of personal data) is as protective as the EU fundamental right to the protection of personal data. In R (Davis & Watson) v Secretary of State for the Home Department [2015] EWHC 2092 (Admin) at [80], the High Court held that Article 8 of the EU’s Charter of Fundamental Rights “goes further” and “is more specific” than Article 8 of the ECHR. The coming into force of the relevant provisions of REULA may therefore have weakened the protection of personal data in the UK.

The government’s interpretation of the application of ECHR rights in data protection legislation

The government’s ECHR memorandum for the Data (Use and Access) Bill suggests that, despite the amendment made via the Regulations, the application of ECHR rights in the context of data protection law is limited. The ECHR memorandum states that “where processing is conducted by a private body, that processing will not usually engage convention (ie ECHR) rights”.[11]

The government’s position on the application of the ECHR suggests that the 61 references to the data subject’s rights and freedoms in the UK GDPR have no effect in most cases where the actions of private bodies engage the rights and freedoms of individuals. In an age where the processing of personal data by powerful technology companies, as well as by smaller organisations, can have profound effects on data subjects’ lives, this statement is concerning. Further, it suggests that the UK’s data protection regime is less protective than the EU regime. CJEU case law has confirmed that the consideration of the data subject’s rights and freedoms requires the (private sector) controller to take “account of the data subject’s rights arising from Articles 7 and 8 of the Charter [of Fundamental Rights]”.[12] There is no suggestion in the EU case law that the Charter would not be relevant where the processing was conducted by a private body. On the contrary: the CJEU has consistently emphasised that a high level of protection for data subjects is required.[13]

The government’s stance on the meaning of the data subject’s “rights and freedoms” suggests that the changes made to data protection rights in the UK bring the standard of protection below that in the EU, thus imperilling the free flow of personal data from the EU to the UK.

2. Protections from automated decision-making

The UK GDPR provides crucial protections where automated decision-making is being undertaken in relation to individuals.[14] Currently, Article 22 imposes two “layers” of safeguards which, broadly speaking, have the following effects (an illustrative sketch of the two layers follows the list):

  1. A controller can only use machines as the sole method of making important decisions about human beings (such as deciding how much they should be paid based on the machine’s evaluation of their work) where:
    • Individuals have given their explicit consent to the processing. Consent cannot validly be given under data protection law where there is an imbalance of power between the controller and the data subject, such as in an employment relationship;
    • The use of automated decision-making is necessary for entering into or performing a contract with the data subject.  Regulatory guidance suggests that an example of where automated decision-making is necessary for contract purposes is where a credit scoring system is used to decide whether to give an individual a loan; or
    • The use of automated decision-making is authorised by a national law (which must itself be targeted and proportionate).[15]
  2. Even where automated decision-making is permitted, the controller needs to implement suitable safeguards which enable the individual to obtain human intervention on the part of the controller, to express his or her views, and to contest the decision.
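To make the two-layer structure concrete, the sketch below models it as a simple compliance check. This is a minimal illustration only: the names, attributes and categories (LawfulBasis, Decision, Safeguards and so on) are assumptions invented for exposition, not terms taken from the UK GDPR or the Bill.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Illustrative model only: these names and categories are assumptions made
# for exposition, not terms defined in the UK GDPR or the Bill.

class LawfulBasis(Enum):
    EXPLICIT_CONSENT = auto()        # invalid where there is an imbalance of power
    NECESSARY_FOR_CONTRACT = auto()  # e.g. credit scoring used to decide on a loan
    AUTHORISED_BY_LAW = auto()       # the authorising law must be targeted and proportionate
    NONE = auto()

@dataclass
class Decision:
    solely_automated: bool         # no meaningful human involvement in the decision
    significant_effect: bool       # important effects on the individual (e.g. pay, credit)
    basis: LawfulBasis
    power_imbalance: bool = False  # e.g. an employment relationship

@dataclass
class Safeguards:
    human_intervention: bool  # the individual can obtain human intervention by the controller
    express_views: bool       # the individual can express his or her views
    contest_decision: bool    # the individual can contest the decision

def article_22_permits(decision: Decision, safeguards: Safeguards) -> bool:
    """Schematic two-layer test loosely modelled on Article 22 UK GDPR."""
    # Article 22 only bites on solely automated decisions with significant effects.
    if not (decision.solely_automated and decision.significant_effect):
        return True

    # Layer 1: the decision must rest on one of the three permitted gateways.
    if decision.basis is LawfulBasis.NONE:
        return False
    if decision.basis is LawfulBasis.EXPLICIT_CONSENT and decision.power_imbalance:
        return False  # consent cannot validly be given, e.g. by an employee

    # Layer 2: even where permitted, all three safeguards must be available.
    return (safeguards.human_intervention
            and safeguards.express_views
            and safeguards.contest_decision)
```

On this schematic view, the change made by the Bill (described below) broadly amounts to removing the “Layer 1” gate for decisions not involving special category data, leaving only the “Layer 2” safeguards.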

The changes made by the Bill sweep away the first “layer” of protection as described above, unless the decision is made using special category personal data (eg personal data relating to health, sexual orientation or racial or ethnic origin). It is not clear why the government considers it advantageous to allow automated decision-making, subject only to safeguards, in the context of important decisions made about individuals. Research suggests that human beings tend to be overly deferential towards automated decision-making, so simply relying on safeguards enabling the individual to contest the automated decision may provide insufficient protection (see, for example, “Human–AI Interactions in Public Sector Decision Making: ‘Automation Bias’ and ‘Selective Adherence’ to Algorithmic Advice”, Journal of Public Administration Research and Theory). Experience such as the A-level algorithms fiasco or the Post Office/Horizon miscarriages of justice shows just how dangerous it can be to rely on analysis conducted by machines. Human beings should not be subject to decisions made by biased, faulty or unreliable computers. For some real-life examples of how automated decision-making is currently being used in the UK, see Annex A.

Leaving in place the current protections from automated decision-making will not stop businesses from developing technology in the UK, because the protections only apply where computers are used to make significant decisions about people. Article 22 of the UK GDPR as currently drafted therefore does not hinder research and development, but it does mean that robust protections exist where machines are used to make important decisions about people.

3. Data subject rights

Data subject rights and data adequacy

The UK’s current adequacy decision under the GDPR does not apply in the area of immigration.  The decision states as follows:

(5) The Commission has carefully analysed the law and practice of the United Kingdom. Based on the findings developed in recitals (8) to (270), the Commission concludes that the United Kingdom ensures an adequate level of protection for personal data transferred within the scope of Regulation (EU) 2016/679 from the European Union to the United Kingdom.

(6) This conclusion does not concern personal data transferred for United Kingdom immigration control purposes or which otherwise falls within the scope of the exemption from certain data subject rights for purposes of the maintenance of effective immigration control (the “immigration exemption”) pursuant to paragraph 4(1) of Schedule 2 to the UK Data Protection Act. The validity and interpretation of the immigration exemption under UK law is not settled following a decision of the England and Wales Court of Appeal of 26 May 2021. While recognising that data subject rights can, in principle, be restricted for immigration control purposes as “an important aspect of the public interest”, the Court of Appeal has found that the immigration exemption is, in its current form, incompatible with UK law, as the legislative measure lacks specific provisions setting out the safeguards listed in Article 23(2) of the United Kingdom General Data Protection Regulation (UK GDPR). In these conditions, transfers of personal data from the Union to the United Kingdom to which the immigration exemption can be applied should be excluded from the scope of this Decision. Once the incompatibility with UK law is remedied, the immigration exemption should be reassessed, as well as the need to maintain the limitation of the scope of this Decision.

The judgment of the Court of Appeal referred to in the adequacy decision is the case of  R (Open Rights Group and the3million) v Secretary of State for the Home Department and Others [2021] EWCA Civ 800 (“the Open Rights Case”).

Open Rights Case

The Open Rights case was brought after the UK left the EU, but before REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high standards of data protection in the UK (before this principle was deleted from the statute book by REULA[16]). In broad terms, the Court of Appeal found that the immigration exemption in Schedule 2 to the DPA 2018[17] conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly, and failed to incorporate the safeguards prescribed for exemptions under Article 23(2) of the UK GDPR. The immigration exemption was therefore held to be unlawful and was disapplied. The Home Office redrafted the exemption to make it more protective, but took several attempts to bring forward legislation which provided sufficient safeguards for data subjects (see R (Open Rights Group and another) v Secretary of State for the Home Department and another [2023] EWCA Civ 1474). The extent of the safeguards now set out in the immigration exemption underscores both what is required for compatibility with Article 23(2) of the UK GDPR, and the deficiencies in the rest of the Schedule 2 exemptions.

What is the relevance of the Open Rights Case to UK data protection standards more generally?

It is clear from reading the judgment in the Open Rights case that none of the exemptions from data subject rights under Schedules 2 to 4 to the Data Protection Act 2018 (bar the current immigration exemption under paragraph 4 of Schedule 2) meets the standards set out in Article 23(2) of the UK GDPR. When the European Commission comes to reassess the UK in terms of data adequacy, this is bound to be an area of vulnerability.

The effect of REULA

The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedules 2 to 4 to the Data Protection Act 2018. This is because the effect of REULA is that, in a conflict, what was EU law is now subject to domestic law (whenever made or enacted). The exceptions from this rule for data protection rights do not alter this position. Although the Data (Use and Access) Bill remedies the effect of the removal of the principle of the supremacy of EU law in certain contexts (see Clause 105), it does not alter the position under REULA as regards data subject rights: in a conflict, Article 23 of the UK GDPR is subject to the exemptions in Schedules 2 to 4. For further detail see Annex B.

Eleonor Duhs

2nd December 2024

Annex A

These examples have been provided by the Communication Workers Union in South Wales.

In a hi-tech logistics company, a newly appointed manager, straight from university and without human training, made an expense claim incorrectly. Rather than there being human oversight, a computer authorised the expenses; the first the employee knew there was an error was when he was called in for a dismissal hearing. The company thought there was nothing wrong with relying solely on computer-based training and computer oversight; their attitude was that this error showed the problem was with their human employees.

The introduction of tracking technology in a communication company caused a great deal of trouble for us. It was brought in alongside a new productivity system and aggregated data from many sources. Although the agreement on its introduction stated it would not be used for disciplinary purposes, the implementation was very different.

Some early examples were almost humorous, such as workers being brought in to explain why they stopped at traffic lights. But soon the computer-generated reports highlighted time spent in the depot, and simple matters of dignity came under attack: workers were pressured not to use toilets in work time and not to make use of welfare facilities, in order to game the report so that a manager’s red box turned to a green box. Later systems directed technicians to specific work points; often the machine was incorrect and no match for years of experience and local knowledge. The company knew this and allowed deviation while the system worked, but if for any reason your productivity was low, your failure to follow the machine’s commands was noted and then monitored as you started a plan to improve or be exited from the business.

One system recently under consideration by a broadband provider was an inward-facing vehicle camera, with the ability to allow managers to view a video stream, listen to conversations and use AI to text managers if it detected you holding a mobile phone, or to report you for yawning. Offers by the union to accept forward-facing and vehicle-reversing cameras were declined unless we also accepted the AI employee surveillance camera, despite assurances that the scheme was going to be introduced for employee and public safety. The company were not able to provide any examples or statistics to show the safety issue they were addressing or how the technology could help. The scheme was quickly abandoned when the union notified staff and announced a campaign against the intrusive technology.

AI tools can also provide a real-time assessment of your conversations. Call centre workers working under these conditions in Wales reported being measured on speaking too slowly or too fast, and being told to pause more or not to pause so long, amongst a myriad of other metrics. The AI also ranked them emotionally, quantifying their empathy and positivity. Some members have reported losing around £500 per month for not using key words or not trying to upsell products on calls where a human would know the customer is not interested in sales.


Annex B – Amendments made to deal with the removal of the principle of the supremacy of EU law

In broad terms, the Bill provides that:

  • New domestic law enabling data sharing will be subject to data protection law.[18] So, to take a fictional example, if in two years’ time the government brings forward new legislation to mandate data sharing in order to examine the effects of climate change, these data sharing provisions will be subject to the requirements of lawfulness, fairness and transparency in the UK GDPR.
  • Existing domestic law enabling data sharing or containing requirements relating to the processing of personal data will be subject to data protection law.[19] So, for example, if the fictional legislation regarding data sharing to examine the effects of climate change had been introduced three years ago, those data sharing provisions would be subject to the requirements of transparency etc in the UK GDPR.
  • Existing domestic legislation which prohibits or restricts access to information does not trump data subject rights as set out in data protection law. However, those rights are subject to exceptions set out in data protection law (for example Schedule 2 to the Data Protection Act 2018).[20] So, for example, if a candidate for judicial office makes a subject access request relating to his or her application for a judicial appointment, the statutory bar on disclosure under section 139 of the Constitutional Reform Act 2005 will not apply. The right of subject access will apply instead, but will be subject to the exemptions for applications for judicial office under paragraph 14(1) of Schedule 2 to the Data Protection Act 2018. However, these provisions do not allow an Open Rights-style challenge to be brought because they do not apply to the relationship between the UK GDPR and Schedule 2 to the Data Protection Act 2018, which remains in its “reversed”, post-REULA state.[21]

[1] See sections 2, 3 and 4 of EUWA, as well as the definition of retained EU law in section 6(7) of EUWA, as originally enacted. For a discussion of how the general principles were preserved in domestic law post-Brexit see Duhs, E. and Rao, I. (2021). Retained EU Law: A Practical Guide. London: The Law Society, chapter 13.5.4.

[2] See for example the May government’s Charter of Fundamental Rights of the EU: Right by Right analysis of 5th December 2017 at p. 4.

[3] Ibid., p. 25.

[4] See section 6(3)(a) of EUWA, as originally enacted.

[5] Ibid.  Note that section 5(5) of EUWA, as originally enacted also ensured that references to the Charter of Fundamental Rights in CJEU case law should be read as references to the underlying rights and freedoms, thereby ensuring that pre-Brexit CJEU case law referencing fundamental rights was still relevant in interpreting the law saved through the Act.

[6] See paragraph 13 of the EU’s adequacy decision under the GDPR relating to the UK.

[7] See sections 2(1) and 4 of REULA. It is arguable that the general principles were (in part) saved through sections 2, 3 and 4 of EUWA (as enacted). See for example the case of Adferiad Recovery Ltd v Aneurin Bevan University Health Board [2021] EWHC 3049 (TCC) and the discussion in Duhs, E. and Rao, I. (2021). Retained EU Law: A Practical Guide. London: The Law Society, chapter 13.5.4.

[8] This result was obtained by conducting a search for “fundamental rights and freedoms” in the UK GDPR and includes references both in the recitals and the articles.

[9] SI 2023/1417.

[10] See Regulation 2.  See also the explanatory memorandum to the regulations, in particular paragraph 7.

[11] See paragraph 47 of the ECHR memorandum. 

[12] See for example Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (Case C‑131/12) EU:C:2014:317 at [74].

[13] Ibid. See in particular paragraph 66. This case dealt with the 1995 Directive which the GDPR replaced, but there are numerous other cases emphasising the importance of a high level of protection of personal data (see for example Data Protection Commissioner v Facebook Ireland and Maximilian Schrems (Case C-311/18) EU:C:2020:559 at [101] and IAB Europe v Gegevensbeschermingsautoriteit (Case C‑604/22) EU:C:2024:214 at [53]).

[14] The protections under Article 22 in the EU regime and the current UK GDPR are considerable. Rosemary Jay describes Article 22 as “[o]ne of the strongest restrictive provisions in the Regulation” (see Jay, R. (2020). Data Protection: Law and Practice. London: Sweet & Maxwell/Thomson Reuters at [14-082]). The European Data Protection Board describes this as a “general prohibition for decision-making based solely on automated processing”, noting that “interpreting this as a prohibition rather than as a right to be invoked means that individuals are automatically protected from the potential effects this type of processing may have”. The recent CJEU case of SCHUFA (Case C‑634/21) interprets the wording of this provision broadly to ensure that the protections in Article 22 are effective (see [42] – [50]).

[15] EU Member States will be acting within the scope of EU law when they provide exceptions from EU law in national legislation. Those exceptions must themselves be drafted so as to comply with fundamental rights (see for example R (Zagorski) v Secretary of State for Business, Innovation and Skills [2010] EWHC (Admin) (29 November 2010) at [70]). In the same way, UK secondary legislation must comply with human rights principles (see section 3 of the Human Rights Act 1998).

[16] See section 3 of REULA.

[17] The purpose of the immigration exemption in paragraph 4 of Schedule 2 to the DPA 2018 is to enable a controller of personal data to disapply certain key rights in the data protection regime, such as the right to be informed about the processing of personal data, the right of access and the right to object to the processing, to the extent that complying with those rights would be likely to prejudice matters relating to immigration, including the maintenance of effective immigration control. Schedule 2 contains exemptions which apply in the same way in other contexts. For an overview of the exemptions see the ICO’s guidance “What other exemptions are there?”.

[18] See new section 183A of the Data Protection Act 2018, as inserted by Clause 105 of the Bill.

[19] See new section 183B of the Data Protection Act 2018, as inserted by Clause 105 of the Bill.

[20] See amendments to section 186 and new section 186A, as inserted by Clause 105 of the Bill.

[21] This is because new section 186A(2), which disapplies the effect of REULA’s deletion of the principle of the supremacy of EU law, only applies as between “a pre-commencement enactment” and (inter alia) the UK GDPR. The definition of “pre-commencement enactment” excludes “an enactment contained in, or made under, a provision listed in section 186(3)”. Section 186(3) (as amended by the Bill) includes Schedule 2 to the DPA 2018. This means that the rule under section 3 of REULA (removing the principle of the supremacy of EU law) applies to the relationship between Article 23 of the UK GDPR and Schedule 2 to the DPA 2018. In a conflict, Article 23 of the UK GDPR is subject to Schedule 2 to the DPA 2018. This means that the broad exemptions in Schedule 2 take precedence over the safeguards in Article 23, thus lowering UK data protection standards.