In our latest review of recent news and developments, Samad Miah, Data Protection Consultant at Xcina Consulting, looks at the UK's new AI strategy, a recent case in Denmark concerning the inappropriate sharing of confidential patient information, and a data breach affecting Afghan nationals seeking relocation to the UK.
The British government has published its National AI Strategy, setting out its vision and approach for transitioning the country to an AI-enabled economy over the next ten years.
The strategy sets out a number of commitments, including greater investment in machine learning technologies, maximising the benefits of AI across all regions and sectors, and governing the use of AI effectively.
The government’s plans also reference its ongoing consultation on data protection reforms in the UK and the need to assess the challenges the current data protection framework poses for developing and deploying AI responsibly.
Why it matters
Artificial intelligence is the fastest growing deep technology in the world, with huge potential to change industries, drive economic growth and transform the way people live.
However, this growth also puts renewed pressure on governance, to ensure AI is not misused and does not cause detriment to individuals.
Existing data protection law contains provisions related to ‘automated decision-making’ including the need for consent (in specific cases) and human intervention if requested.
However, proposed changes to data protection law set out by the UK government may reduce these requirements.
Medicals Nordic was fined approximately €80,500 for using WhatsApp groups to share confidential information, including data related to Covid-19 testing, with all of its employees.
The data protection regulator in Denmark found that all employees of Medicals Nordic had unrestricted access to confidential patient information on their private phones as data was being shared via WhatsApp group chats.
This included employees who had no relationship with a particular patient, as well as employees who had left the business but had not been removed from the group chat.
Why it matters
Data protection law governs the overarching use of health data and the principles to consider.
However, in most cases, health data is also confidential in nature, and in many jurisdictions a duty of confidence will also apply.
In the UK, common law and associated guidance states that only health and care professionals within a patient’s direct care team should have access to relevant information relating to their health.
Access management is therefore an important factor to consider, as is the need to complete a Data Protection Impact Assessment to identify and mitigate the risks of sharing health information on a large scale.
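The access-management principle above — that only a patient's direct care team should see their records — can be sketched as a simple authorisation check. This is a hypothetical, minimal illustration; the names (`care_teams`, `can_access`) are assumptions for the example, not a real system's API.

```python
# Minimal sketch of a "direct care team" access check, assuming a simple
# in-memory mapping of patient IDs to the staff authorised to view them.

care_teams = {
    "patient-001": {"dr_jones", "nurse_patel"},
    "patient-002": {"dr_smith"},
}

def can_access(staff_id: str, patient_id: str) -> bool:
    """Allow access only if the staff member is on the patient's care team."""
    return staff_id in care_teams.get(patient_id, set())
```

By contrast, the WhatsApp groups in the Medicals Nordic case behaved like a mapping in which every employee appeared on every patient's list — precisely the unrestricted access the regulator objected to.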
More than 250 people in Afghanistan seeking relocation to the UK after the Taliban seized control of the country had their personal data, including names and profile pictures, mistakenly copied into an email.
A second data breach of a similar nature was also uncovered by the BBC, involving the names and email addresses of 55 people, which could be seen by everyone who received the message.
The Defence Secretary has stated that a formal investigation has been launched and that one official has been suspended as a result of the incident.
Why it matters
A personal data breach means a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data. This includes breaches that are the result of both accidental and deliberate causes.
Organisations should ensure they have a mix of organisational and technical controls to prevent emails containing personal data from being accidentally sent to the wrong person.
This can include delaying outgoing messages using the send-delay feature that many email providers now offer, giving senders a window in which to catch a mistake.
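Another technical control of the kind described above is a pre-send recipient check that flags addresses outside an approved domain. The sketch below is a hypothetical illustration under assumed names (`APPROVED_DOMAINS`, `flag_external_recipients`); real implementations would typically hook into the mail client or gateway.

```python
# Hypothetical pre-send check: flag any recipient address whose domain
# is not on an approved internal list, before the email leaves the outbox.

APPROVED_DOMAINS = {"example.org"}

def flag_external_recipients(recipients: list[str]) -> list[str]:
    """Return the recipients whose domain is not on the approved list."""
    return [
        addr for addr in recipients
        if addr.rsplit("@", 1)[-1].lower() not in APPROVED_DOMAINS
    ]
```

In the Afghan relocation breach, a check like this would not have caught the core error (recipients placed in "To" rather than "Bcc"), so it would need to be combined with controls on how bulk recipients are addressed.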