When we consider the value of personal data, what price are organisations placing on it, and how do individuals protect and retain its worth? If you’ve been involved in implementing a new system or technology, you may have completed a data protection impact assessment (DPIA). DPIAs were introduced in 2018, but how useful are they today, and are they still necessary? There is also a new ISO standard on anonymisation and de-identification; we review the details below.
Read the latest Data Protection Blog by Jackie Barlow at Xcina Consulting.
What is an individual’s personal data worth?
A class action lawsuit brought against Facebook’s parent firm Meta* in the Competition Appeal Tribunal alleged that Facebook (FB) had abused its market dominance by setting an unfair ‘price’ for free use of the platform: its UK users’ personal data.
The claim argues that FB required users to give up their valuable personal data in exchange for access to the social network, and used it to serve personalised, targeted advertising. The data was ‘harvested’ between 2015 and 2019 and provided information on users’ internet use, helping FB make large profits.
Another significant class action involving personal data was Lloyd v Google**. It also highlighted the value of personal data as an asset, but the FB case is more complicated because it considers both sides: the value of the data to the individual using FB and the commercial value of that data to the firm.
The value an individual attributes to their personal data is subjective. A balance will need to be struck between users who are happy for their data to be monetised but want a return, and those who see the value of the service they receive as that return. The FB case has a key role to play here.
*Meta faces billion-pound class-action case – BBC News
**Lloyd v Google: the funder perspective | Feature | Law Gazette
**Top UK court blocks legal action against Google over internet tracking | UK supreme court | The Guardian
Why it matters
The sheer volume of data being captured by social media companies demonstrates that data has a value, whether commercial, economic or both. From a privacy perspective, the important issue for organisations is protecting the rights of individuals rather than facilitating large profits, and this is where the GDPR can support the FB case.
Mergers and acquisitions in the retail sector have demonstrated that the value of a business lies not in its stock or physical assets but in its customer database and intellectual property. The World Economic Forum’s definition of data gives weight to this: it describes personal data as a ‘new asset class that is generating a new wave of opportunity for economic and societal value creation’. Personal Data: The Emergence of a New Asset Class (weforum.org)
If personal data is in the public domain, is it fair for companies to make a profit from that data? If so, are individuals happy to give up their data for free or should they be incentivised?
The outcome of the FB case will hopefully answer this question. The case has brought to light the tension between personal data as a right and personal data as a saleable asset. It is now important for the law to strike a balance: what is the value of data, and to whom is it valuable?
The future regulation of the internet, and of online services provided on digital platforms, will be important. The outcome of the FB case will provide a benchmark for assessing the value that can be attributed to personal data, and it may well change how businesses and individuals view this going forward. In time this could lead to changes in regulation that provide more guidance on the monetisation of, and protection given to, personal data.
The FB and Google cases have shown that personal data is valuable, as demonstrated by the sheer volume of it collected and used by many organisations.
What do you believe your own data is worth to an organisation? Is it more valuable to you than to organisations who collect it and would you expect something in return if they want to use it for marketing or other purposes?
Have you been involved in implementing a new technology or system? If so, did you complete a data protection impact assessment?
In September 2021 North Ayrshire Council decided to implement a system using facial recognition technology (FRT) in nine school dining rooms. Photographs of pupils were matched against their identities held in school records, enabling the schools to automatically deduct the correct charge for the lunches consumed.
The ‘cashless’ system was intended to improve efficiency and prevent the queues that usually occurred when pupils forgot passes or did not have the correct money to pay.
In this case, collecting photographs, matching them to school records and charging each child’s account is processing of personal data, and it needs to comply with data protection legislation. Furthermore, this type of biometric data is classed as ‘special category data’, so as well as one of the ‘usual’ lawful bases for processing, an additional condition is needed because this type of data is more sensitive.
When implementing any new technology that is likely to present a high risk to individuals, a formal data protection impact assessment (DPIA) needs to be undertaken.
- This will help to identify all the data protection risks involved so that these can be addressed.
- The DPIA asks whether the processing is justified and proportionate and whether it is the best way to achieve the aim. This is especially important where children’s data is concerned.
- The DPIA also covers transparency, i.e. whether individuals have been told about the processing, and whether there is a lawful basis for it.
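The screening step behind the points above can be sketched in code. This is purely illustrative: the trigger names below are hypothetical labels, loosely based on the kinds of high-risk processing (biometric data, children’s data, large-scale monitoring) that UK GDPR Article 35 and ICO guidance flag as requiring a DPIA; a real screening exercise would follow the ICO’s published criteria, not a hard-coded list.

```python
# Illustrative sketch of a DPIA screening step: does any high-risk
# trigger apply to the proposed processing? Trigger names are
# hypothetical labels for the kinds of factors the ICO lists.

HIGH_RISK_TRIGGERS = {
    "biometric_data",          # e.g. facial recognition templates
    "childrens_data",          # processing children's personal data
    "large_scale_monitoring",  # systematic monitoring of individuals
    "special_category_data",   # health, ethnicity, etc.
}

def dpia_required(processing_features: set) -> bool:
    """Return True if any high-risk trigger applies, so a full
    data protection impact assessment should be completed."""
    return bool(processing_features & HIGH_RISK_TRIGGERS)

# The North Ayrshire FRT system involved both biometric and
# children's data, so it would clearly trip this screen.
print(dpia_required({"biometric_data", "childrens_data"}))  # True
print(dpia_required({"staff_payroll"}))                     # False
```

The point of modelling it this way is that screening is only the first step: a positive result means the full assessment (necessity, proportionality, transparency, consultation and sign-off) must follow.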
Unfortunately, once the new FRT system was implemented at the schools, press reports and complaints resulted in an ICO investigation within a few weeks. The system was soon withdrawn, and all images deleted.
Why it matters
This case has highlighted that when introducing a new system or technology (like FRT) it’s important to address all privacy risks before implementation.
A DPIA must be completed at the outset, particularly when processing special category data. In this case a DPIA had been completed, but it had not adequately covered all the data protection risks.
There was a lack of transparency and information was not clearly provided to pupils and parents. Additionally, the retention of pupils’ images was not data protection compliant.
There had been no consultation with pupils and parents, and no sign-off of the DPIA by a senior staff member.
Details of the ICO’s letter to North Ayrshire Council and the BBC article on this case can be found at the links below.
Using FRT in schools – letter to North Ayrshire Council | ICO
ICO watchdog ‘deeply concerned’ over live facial recognition – BBC News
Have you been involved in the implementation of any new systems or technologies that involve processing individuals’ personal data?
If so, did you complete a data protection impact assessment at the outset to assess the risks?
Completing a DPIA helps ensure that an organisation adheres to the practice of ‘Data Protection by Design and Default’. This refers to Article 25 of the UK GDPR, which makes it a legal obligation for data controllers to implement organisational controls that ensure data protection issues are addressed at the design stage of any project.
The ICO has provided information on this at Data protection by design and default | ICO
ISO publishes a new anonymisation standard
In November 2022 the International Organization for Standardization (ISO) published a new data protection standard, ISO/IEC 27559, titled ‘Privacy enhancing data de-identification framework’.
The result of a five-year effort, the standard will be important in establishing best practice for the re-use and sharing of personal data.
The aim of the new ISO framework is to identify data protection risks and mitigate them by anonymising personal data where possible. Anonymisation is achieved when the risk of identifying individuals has been reduced to a sufficiently remote level. It aligns well with data protection principles such as ‘data minimisation’ and ‘data security’.
Anonymising data wherever possible is encouraged, particularly where sensitive personal data is involved. Once anonymised, data can be re-used for many purposes, for example to improve services, to look for new opportunities and insights that can shape an organisation, and to create data products that serve people’s needs.
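To make the idea concrete, here is a minimal sketch of two common de-identification techniques the framework deals with: suppressing direct identifiers (dropping names) and generalising quasi-identifiers (banding ages, coarsening postcodes). This is not the ISO/IEC 27559 framework itself, and the record fields and granularity choices are hypothetical; real anonymisation also requires assessing the residual re-identification risk in context.

```python
# Illustrative de-identification sketch: suppression and generalisation.
# Field names and banding choices are hypothetical examples.

def generalise_age(age: int) -> str:
    """Replace an exact age with a 10-year band."""
    lower = (age // 10) * 10
    return f"{lower}-{lower + 9}"

def de_identify(record: dict) -> dict:
    """Drop direct identifiers and generalise quasi-identifiers."""
    return {
        "age_band": generalise_age(record["age"]),  # 34 -> "30-39"
        "region": record["postcode"][:2],           # coarsen postcode to area
        "visits": record["visits"],                 # non-identifying value kept
        # "name" is suppressed entirely: a direct identifier.
    }

record = {"name": "A. Smith", "age": 34, "postcode": "KA12 8EH", "visits": 3}
print(de_identify(record))
# {'age_band': '30-39', 'region': 'KA', 'visits': 3}
```

Note that transformations like these reduce, rather than automatically eliminate, identifiability; whether the output counts as anonymous depends on what other data could be combined with it, which is exactly the risk assessment the framework is designed to structure.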
Privacy experts will recall that the ICO issued a consultation calling for views on anonymisation, pseudonymisation and privacy enhancing technologies guidance at the end of 2022. The ICO’s draft guidance can be found here.
Why it matters
If personal data does not need to be identifiable for the purposes you process it for, it is always good practice to anonymise it.
The new ISO standard should promote more anonymisation and will provide a way forward for the safe and responsible use of personal data, especially ‘special category data’ (more sensitive data).
Privacy experts will be able to rely on this new framework when putting forward best practices for the re-use and sharing of individuals’ personal data.
ISO standards are widely respected in providing strong compliance practices, so adopting this standard may enable organisations to have a competitive edge. The standard will give stakeholders confidence that an implemented process involving anonymisation is reliable and compliant.
A preview of the new framework can be found at ISO – ISO/IEC 27559:2022 – Information security, cybersecurity and privacy protection – Privacy enhancing data de-identification framework
Do you think the new standard will lead to more anonymisation of personal data?
Do you think that many organisations are likely to adopt it?