
Another EU regulator rules against the use of Google Analytics

In this week’s issue of In Perspective, Natasha King, Data Protection Consultant at Xcina Consulting, looks at the Italian Data Protection Authority’s decision against the use of Google Analytics, an Opinion from the ICO on the DVLA’s incorrect use of lawful basis to share information with parking companies, and more. Find out the details of these and other key emerging themes as events unfold. Our analysis looks at what happened and why it matters; read our complete review below.

Italy’s data protection regulator joins EU counterparts in ruling against use of Google Analytics

What happened

  • The Italian Data Protection Authority (DPA) has joined its Austrian and French counterparts in taking a stance against the use of Google Analytics, on the grounds that it unlawfully transfers personal data to the United States, a country deemed not to provide a level of data protection equivalent to that guaranteed under the GDPR.
  • The decisions against Google Analytics can be traced back to 101 complaints filed by the advocacy group ‘My Privacy is None of Your Business’ (‘NOYB’) against businesses in a number of EU and EEA member states that continued to transfer personal data to the US via Google Analytics and/or Facebook.
  • The DPA found that website owners using Google Analytics gather data, via cookies, on users’ website activity, including pages visited, services used, and the IP address of the user’s device.
  • It was found that the existing protections applied by Google were not of a sufficient standard to address the risk, resulting in information being transmitted to the US without an adequate level of protection to the necessary EU legal standard.
  • A 90-day deadline has been set to bring transfers into compliance with the GDPR, and Italy’s DPA plans ad-hoc inspections to verify compliance.

Why it matters

  • The verdicts could have significant, widespread consequences for US service providers and their EU-based clients. Through a task force formed by the EDPB, DPAs have been coordinating their responses to the 101 complaints that NOYB submitted, so the DPAs yet to rule on the issue are expected to reach similar decisions.
  • US tech giants will be keeping a close eye on the developments as they unfold. In fact, Microsoft has already promised to process all EU data within EU borders as part of its EU Data Boundary initiative.
  • This month, the French DPA (CNIL) released a Q&A providing its position regarding use of Google Analytics, including possible alternative solutions. Future decisions from other EU DPAs should also offer guidance as to next steps to be taken by exporters within their territorial scopes.
  • We recommend that both EU- and UK-based organisations keep up to date with the developments, rules, and guidelines in their own jurisdictions to maintain compliance.

 

DVLA used incorrect lawful basis to disclose information to car park management companies, ICO finds

What happened

  • Following numerous complaints from individuals over the DVLA’s sharing of vehicle keeper data, the ICO published an Opinion on the legal basis for the processing of vehicle keeper data by the DVLA this month.
  • The DVLA had been relying on ‘legal obligation’ as its lawful basis under the UK GDPR for sharing the personal data of vehicle keepers with car park management companies to enable them to recover parking fines.
  • After considering the evidence provided and conducting its legal analysis, the ICO concluded that the DVLA’s identified lawful basis of ‘legal obligation’ was incorrect and that the lawful basis for sharing the data was in fact ‘public task’.
  • This was because the legislation gave the DVLA a power, rather than a legal duty, to disclose vehicle keeper information to car park management companies for the purpose of recovering fines, and the DVLA had the discretion to decline a request in certain circumstances.

Why it matters

  • The ICO found that the DVLA’s actions violated data protection laws, but decided against taking formal enforcement action. The Commissioner reasoned that in this case, the risk of harm to vehicle owners from the DVLA disclosing their information under the ‘legal obligation’ lawful basis rather than ‘public task’ was low.
  • One of the key principles of the GDPR requires that personal data be processed lawfully, fairly, and transparently. Identifying the appropriate legal basis for processing is extremely important for several reasons, including:
    • Organisations are required by the GDPR to identify the appropriate legal basis for processing before it begins, and switching between different bases for the same activity is not permitted.
    • The legal basis identified must be demonstrable and justifiable to all parties, including internal stakeholders, data subjects, and regulatory bodies. An organisation’s privacy notice must clearly inform data subjects of the legal justification for processing their personal information.
    • The legal basis for processing has a significant impact on the way that an organisation responds to data subject rights requests because there are conditions, exceptions, and limitations on requests depending on the legal basis for processing.
  • Ultimately, if the wrong legal basis is identified, it could result in unlawful processing, non-compliant responses to data subject rights requests, reputational damage and even a regulatory fine.
  • The ICO has published a useful lawful basis interactive guidance tool on its website, which gives tailored advice to organisations on which lawful basis is likely to be most appropriate to their processing activities.
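The point about rights requests above can be made concrete. As a simplified illustration (based on the ICO’s published guidance on lawful bases, and not legal advice), the sketch below maps each UK GDPR lawful basis to whether three common data subject rights generally apply; the function name and structure are ours, for illustration only:

```python
# Simplified illustration of how the chosen lawful basis changes which
# data subject rights generally apply (per ICO guidance). Real-world
# assessments require case-by-case legal review.

RIGHTS_BY_LAWFUL_BASIS = {
    # basis:                erasure  portability  object
    "consent":              {"erasure": True,  "portability": True,  "object": False},  # consent can instead be withdrawn
    "contract":             {"erasure": True,  "portability": True,  "object": False},
    "legal obligation":     {"erasure": False, "portability": False, "object": False},
    "vital interests":      {"erasure": True,  "portability": False, "object": False},
    "public task":          {"erasure": False, "portability": False, "object": True},
    "legitimate interests": {"erasure": True,  "portability": False, "object": True},
}

def right_applies(lawful_basis: str, right: str) -> bool:
    """Return whether a given data subject right generally applies
    under the stated lawful basis."""
    return RIGHTS_BY_LAWFUL_BASIS[lawful_basis][right]

# For example, moving from 'legal obligation' to 'public task' (as in the
# DVLA Opinion) changes whether the right to object applies:
print(right_applies("legal obligation", "object"))  # False
print(right_applies("public task", "object"))       # True
```

This is why, as the Opinion highlights, getting the lawful basis wrong does not just affect paperwork: it can lead to rights requests being handled incorrectly.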

Microsoft AI ethics overhaul: framework for responsible AI unveiled

What happened

  • Microsoft’s Chief Responsible AI Officer has outlined the company’s new ‘Responsible AI Standard’, which retires automated tools that claim to infer an individual’s emotional state, or attributes such as gender and age, from facial images.
  • Any company that wants to use the service’s facial recognition features, including those that have already built them into their products, will now need to actively apply for access and demonstrate that it meets Microsoft’s AI ethics standards and that the features benefit the end user and society.
  • Accountability, Transparency, Fairness, Privacy & Security, Reliability, and Inclusiveness are the six main principles of Microsoft’s Standards, which closely align with those set out under the GDPR.
  • The Standards set out the need to work towards ensuring AI systems are responsible by design and aim to break down each of the six principles into key enablers, such as impact assessments, data governance, and human oversight.
  • In support of the Responsible AI Standard, Microsoft has published an Impact Assessment template and guide, and a collection of Transparency Notes.

Why it matters

  • Microsoft’s AI ethics overhaul highlights the increasing need for organisations to take action to preserve privacy in a data-driven world.
  • Earlier this month, we reported that the ICO had launched an AI and data protection risk toolkit, which provides a methodology to audit AI applications and ensure they process personal data fairly.
  • It is recommended that organisations using AI should refer to the toolkit, which sets out practical steps to take in order to reduce, mitigate or manage the risks to individual rights and freedoms when processing personal data through AI.
  • Meanwhile, the EU’s Artificial Intelligence Act is making its way through the European Parliament. The Act intends to establish a uniform regulatory and legal framework for the use of AI, covering how it is developed, what businesses may use it for, and the consequences of failing to adhere to the requirements.

We’d love to hear from you

Natasha is an experienced privacy professional with a proven ability to implement and manage successful data protection compliance programmes. Prior to joining Xcina Consulting, Natasha gained extensive knowledge and experience in dealing with complex privacy challenges across various sectors, including the insurance industry, healthcare, education, and local government. She is a member of the International Association of Privacy Professionals (IAPP), holds a CIPP/E accreditation, and is a certified BCS Practitioner in Data Protection.

To discuss how the areas highlighted in this post, or any other aspect of risk management, information governance or compliance, impact your business, speak with our team. Tell us what matters to you and find out how we can help you navigate complex issues and deliver long-term value.

If you have any questions or comments, or if there’s anything you would like to see covered, please get in touch by emailing Xcina Consulting at info@xcinaconsulting.com. We’d love to hear from you.

Natasha King

Data Protection Consultant

Speak to me directly by Email, or
Telephone: +44 (0)20 3745 7826
