
EdTech companies exploit UK children’s data, say privacy campaigners

Natasha King, Data Protection Consultant at Xcina Consulting, investigates a recent study alleging that suppliers of free educational tools widely used in UK schools are flouting data protection laws by using children’s data for commercial advantage. She also discusses the most recent developments in the long-running US privacy lawsuit against Facebook over its data sharing with Cambridge Analytica, including a potential settlement. Read our full analysis below for a look at what happened and why it matters.

Privacy campaigners claim EdTech providers exploit children’s data for commercial gain

What happened

  • Since the pandemic, there has been a huge increase in the use of educational technology (EdTech) in UK schools to support teaching and the efficient running of educational institutions.
  • When lockdown measures forced the mass closure of schools and universities, institutions were pushed to adapt to remote teaching. EdTech companies were able to take advantage of this opportunity, enjoying accelerated growth and profits.
  • However, privacy advocates argue that in the rush to move to distance learning, the effects of using free software built by large technology companies for educational purposes were not sufficiently considered or controlled.
  • According to a recent report published by the Digital Futures Commission (DFC) in conjunction with digital rights group 5Rights, EdTech businesses are violating data protection regulations and leaving children’s data open to commercial exploitation.
  • The report claims that ClassDojo and Google Classroom, two widely used platforms, are gathering data on students’ academic achievements and profiting from it through advertising and other commercial uses, in ways that likely violate UK data protection laws.
  • As part of the study, an experiment tracked how school children interacted with Google Classroom and revealed how Google and third-party data was tracked when users clicked on external links. Preferences could be inferred from this data, allowing personalised advertising to be directed at the users.
  • The study claims that the same applied to ClassDojo, raising concerns about its behavioural profiling and social scoring, and the potential impact of these practices on children’s education records and future opportunities.
  • From the schools’ viewpoint, EdTech tools like ClassDojo and Google Classroom are practical and, like many other EdTech products, ‘free’. The report asserts, however, that children are in fact paying for these services with their personal data, or by watching targeted advertising.
  • In summary, the report indicated it was highly likely that:
    • In a typical day, personal data is collected from children and used for commercial purposes including developing new products, marketing and advertising.
    • EdTech collects far more personal data than schools or families expect and also processes, shares and profits from the data in many ways that they do not know about.
    • Children lack the knowledge and power to exercise their rights over much of the data collected from them in school, nor can schools do so on their behalf.

Why it matters

  • The publication of the research coincides with efforts by European governments to restrict the use of Big Tech in educational settings. In Germany and Denmark, notably, several regional authorities have already outlawed the use of Google’s products in schools.
  • Due to concerns about children’s privacy, the Dutch data protection authority also decided to place restrictions on the use of Google’s Chrome browser and educational platforms from 2022.
  • A statutory code produced by the ICO, which came into force in September 2020 with a 12-month transition period, sets out 15 standards for the age-appropriate design of online services that process UK children’s data.
  • The code, called the Children’s Code (formally, the Age Appropriate Design Code), aims to create a better internet for children by ensuring that online services likely to be accessed by children respect a child’s rights and freedoms when using their personal data.
  • In particular, it sets out that:
    • privacy settings should be set to high privacy by default;
    • children and their parents/carers should be given more control of the privacy settings;
    • non-essential location tracking should be switched off;
    • children must not be ‘nudged’ by sites through notifications to lower their privacy settings; and
    • clear and accessible tools should be in place to help children exercise their data protection rights (including parental consent tools).
  • Services that do not conform to the code will find it very difficult to demonstrate that they are processing children’s data fairly, or to demonstrate compliance with other data protection requirements and the Privacy and Electronic Communications Regulations (PECR). If a service is found to process a child’s personal data in breach of the GDPR or PECR, the ICO can take regulatory action against it.
  • Any business that has not already done so should review its existing services to establish whether they are subject to the Children’s Code, assess current conformance, and identify and implement any additional measures where necessary.
  • It is understood that the findings of the 5Rights report are due to be presented to the Information Commissioner’s Office and the Department for Education. Xcina Consulting will closely monitor and report on developments in this case as they arise.


Meta’s Facebook agrees to settle Cambridge Analytica data privacy lawsuit

What happened

  • Facebook’s parent company, Meta, has proposed to settle a long-running privacy lawsuit in the Northern District of California over allegations that it shared users’ personal data with third parties in violation of consumer privacy laws.
  • The most well-known of these third parties is the now defunct UK-based political consulting firm Cambridge Analytica, which allegedly used the information to create psychographic profiles of voters to aid the Republican campaign in the 2016 US presidential election. A whistleblower also claimed that the firm used strategies to suppress Black voter turnout.
  • The most recent filing did not include the financial details of the settlement. Lawyers for the plaintiffs and for Facebook asked the judge to stay the lawsuit for 60 days so that the parties could “finalise a written settlement agreement” and submit it to the court for preliminary approval.
  • The potential settlement comes after it was made public last month that Meta’s former COO, Sheryl Sandberg, and its CEO, Mark Zuckerberg, would have to face up to 11 hours of questioning as part of the Cambridge Analytica lawsuit in the Northern District of California.
  • If the “in-principle” deal is agreed, the two will, at least in this lawsuit, avoid being cross-examined in person about their roles in the data scandal, although the District of Columbia Attorney General has already filed a claim against Zuckerberg in a separate Cambridge Analytica privacy lawsuit.

Why it matters

  • The lawsuit was brought by a group of Facebook users after the scandal saw the data of up to 87 million users scraped and shared without their consent.
  • In addition to the data being used during the 2016 US presidential election, it is claimed that Brexit campaigners and Russian disinformation agents also exploited it.
  • In 2018, Meta CEO Zuckerberg was called before the US Congress to testify about the scandal and explain how his company had enabled data harvesting and why it had not policed it properly.
  • Many considered that the CEO answered questions about the business he founded rather evasively, with several of his responses being along the lines of “I do not recall.”
  • In the UK, the ICO investigated the scandal, raiding Cambridge Analytica’s offices in a seven-hour search. Facebook was issued a £500,000 fine, the maximum possible penalty under the UK data protection law applicable at the time.
  • After initially contesting the penalty, Facebook eventually reached a settlement with the ICO on 30 October 2019, agreeing to pay the £500,000 fine levied in 2018 in connection with the processing and sharing of its users’ personal data with Cambridge Analytica, without admitting liability in the matter.
  • Eyebrows were raised when it became apparent that the terms of the settlement gagged the ICO from discussing certain elements of it in public, shutting down scrutiny of the tech giant’s actions in the wake of the scandal.

Xcina Consulting will examine the settlement once its details are disclosed publicly.

We’d love to hear from you

Natasha is an experienced privacy professional with a proven ability to implement and manage successful data protection compliance programmes. Prior to joining Xcina Consulting, Natasha gained extensive knowledge and experience in dealing with complex privacy challenges across various sectors including the insurance industry, healthcare, education, and local government. She is a member of the International Association of Privacy Professionals (IAPP), holding a CIPP/E accreditation and is a certified BCS Practitioner in Data Protection.

To discuss how the areas highlighted in this post, or any other aspect of risk management, information governance or compliance, impact your business, speak with our team. Tell us what matters to you and find out how we can help you navigate complex issues and deliver long-term value.

If you have any questions or comments, or if there’s anything you would like to see covered, please get in touch by emailing Xcina Consulting at info@xcinaconsulting.com. We’d love to hear from you.

Natasha King

Data Protection Consultant

Speak to me directly by email or telephone: +44 (0)20 3745 7826
