This Data Protection news blog looks at the latest developments on the UK government’s plans for post-Brexit data protection reforms, as the second reading of the latest legislative proposals was abruptly postponed following the election of Liz Truss as the UK’s new prime minister. Natasha King, Data Protection Consultant at Xcina Consulting, also examines the draft guidance from the ICO on the use of privacy-enhancing technologies to help ‘unlock lawful data sharing’, as well as a €405m fine issued by the Irish regulator against Instagram for breaching children’s privacy rights.
Read our full analysis below for a look at what happened and why it matters.
UK data protection reforms delayed amid leadership change
- Reforms to the UK’s data protection framework have been delayed after a scheduled second reading of the legislative proposals on Monday 5 September was abruptly postponed, following the appointment of Liz Truss as the UK’s new prime minister.
- The Data Protection and Digital Information Bill, the government’s proposed post-Brexit data protection reform, had originally been introduced into parliament on 19 July 2022, as previously reported by Xcina Consulting.
- The second reading of the Bill was set to be the first opportunity for the proposals to be debated in parliament and for the overall principles of the Bill to be scrutinised.
- Ahead of the proposed second reading, the Local Government Association (LGA) had released a statement raising concerns about the Bill, saying they were “disappointed the Bill removes the existing requirement to designate a data protection officer. Although the proposal is now to replace this with a Senior Responsible Individual, this is a person at Senior Management Team level who would not have the time or experience to undertake much of what the data protection officer did.”
- In announcing the postponement of the parliamentary debate of the Bill, Leader of the House Mark Spencer said: “Following the appointment of the new leader of the Conservative Party, the business managers have agreed that the government will not move the second reading and other motions relating to the Data Protection & Digital Information Bill today, to allow Ministers to consider the legislation further.”
- At present, no revised date has been set for a second reading of the Bill.
Why it matters
- The contentious Bill was scheduled to be presented for its second reading by the Culture Secretary at the time, Nadine Dorries, who had hailed the reforms as “one of Brexit’s biggest rewards” that will “drop unnecessary box-ticking and measures stifling British businesses.”
- The Culture Secretary at the time had not addressed the growing concerns that the UK government’s proposed reforms could increase business costs and workload, while also creating barriers to data sharing between the EU and UK, in the event of the reforms diverging too far from the EU GDPR.
- However, following the cabinet reshuffle, in which Michelle Donelan was appointed as the new Culture Secretary, the future of the Bill is now uncertain.
- Under a much-changed cabinet, the Bill is likely to face a long road through parliament, particularly as the new Prime Minister has previously vowed to review all EU laws retained after Brexit, and to remove or replace those that are deemed to hinder UK growth.
- This raises the prospect that the Bill could undergo more radical amendments than those that have been proposed to date.
ICO publishes guidance on the use of privacy-enhancing technologies
- The Information Commissioner’s Office (ICO) has published draft guidance on privacy-enhancing technologies (PETs) to help organisations unlock the value of data sharing and analysis, whilst protecting privacy and confidentiality.
- PETs are a collection of technologies and methods that may be used to share and utilise personal data in a responsible, legal, and secure manner. These methods include reducing the amount of data used, as well as encryption, pseudonymisation, or anonymisation.
- The ICO guidelines say PETs can help organisations “demonstrate a ‘data protection by design and by default’ approach” to data processing, an approach that considers data protection and privacy issues upfront in everything an organisation does with data.
- This approach is a key element of the UK GDPR’s risk-based approach and helps organisations to ensure that they comply with the GDPR’s fundamental principles and accountability requirements.
- In summary, the guidelines set out that:
- PETs can be used to give access to datasets which would otherwise be too sensitive to share, while ensuring individuals’ data is protected.
- PETs should not be regarded as a “silver bullet” for data protection compliance. All processing activities must still adhere to the GDPR’s principles, including the requirement for the processing to be lawful, fair, and transparent.
- A case-by-case assessment through a Data Protection Impact Assessment (DPIA) should be performed, to determine if PETs are appropriate for an organisation’s aims.
- The ICO’s guidance goes on to set out the benefits and different categories of PETs currently available, as well as how their use can help organisations comply with data protection law.
Why it matters
- As customer data continues to be an increasingly important source of competitive advantage, it is crucial to use and share personal data in a secure manner, to prevent cybercrime and associated reputational damage, to gain customers’ trust, and to ultimately expand the business.
- To achieve this, all processing activities should be designed with data privacy and security in mind from the outset.
- The increasing need to ensure that data is shared securely has been recognised by leading technology providers such as Amazon, Google and Snowflake, who have all created data “clean rooms”, to allow businesses to pool information without exposing customer details or commercially sensitive data.
- PETs open up previously unimaginable options to harness the power of data through cutting-edge and trustworthy applications, by enabling organisations to exchange and jointly analyse sensitive data in a manner that protects privacy.
- The ICO’s draft guidance on PETs was released ahead of the G7 data protection and privacy authorities’ 2022 roundtable, which was held on 7th and 8th September in Germany. At the roundtable, the ICO presented its work on PETs to its G7 counterparts and urged global agreement for the support of responsible and innovative use of PETs.
- The guidelines are open for consultation until 16 September, and the ICO called for the development of industry-led governance, such as codes of conduct and certification schemes, to help organisations use PETs responsibly and to help PETs developers and providers to build the technology with data protection and privacy at the forefront.
- You can provide your feedback to the ICO on the guidance by emailing email@example.com.
Instagram owner Meta fined €405m over failing to protect children’s data
- Following a two-year investigation by the Irish Data Protection Commission (DPC) into possible violations of the EU GDPR, Instagram owner Meta has been fined €405 million (equating to around £350 million) for failing to protect children’s privacy online.
- The Irish DPC effectively serves as the EU’s “big tech regulator”, as the majority of large US technology firms have European headquarters in Ireland, including Google, Apple, Microsoft, and TikTok, in addition to Meta.
- The DPC’s investigation found that Instagram had allowed users aged between 13 and 17 to operate business accounts on the platform, which publicly displayed their contact information, including phone numbers and email addresses.
- The DPC also found the platform had operated a user registration system whereby the accounts of 13-to-17-year-old users were set to “public” by default.
- A Meta spokesperson said: “This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private”.
- Meta went on to confirm that it disagrees with the DPC’s calculation of the financial penalty and intends to appeal against it.
Why it matters
- This penalty is the highest imposed on Meta by the DPC, following a €225m fine it imposed in 2021 after ruling that WhatsApp had not informed EU data subjects transparently about how it collected and used their personal data. A further €17m fine was issued by the DPC against Meta this year, over record-keeping issues in response to a series of data breaches it had reported.
- The fine also represents the second largest ever issued under GDPR, behind the €746m fine issued by the Luxembourg National Commission for Data Protection against Amazon in July 2021.
- The Irish DPC has previously come under fire from both privacy campaigners and fellow EU data protection regulators over allegedly failing to hold Big Tech companies to account under the EU GDPR – but the DPC’s recent spate of fines appears to signal a much tougher stance on enforcement going forward.
- In last week’s post, Xcina Consulting emphasised the importance of designing services with children’s privacy in mind and using a data protection by design and by default approach, given that the GDPR explicitly states that children’s personal data requires special protection, as they may be less aware of the risks, consequences, and safeguards concerned, as well as of their rights in relation to the processing of their data.