Artificial intelligence (“AI”) is advancing rapidly and becoming increasingly intertwined with our personal lives. However, these advances also have the potential to be used in ways that are problematic and intrusive. On 29 November 2021 the Information Commissioner’s Office (“ICO”), the UK’s data protection regulator, issued a provisional £17 million fine to Clearview AI Inc and ordered the company to stop processing the personal data of people in the UK following “alleged serious breaches of the UK’s data protection laws”.
Clearview AI Inc – Facial Recognition
Clearview is an American facial recognition business. Self-described as the ‘World’s Largest Facial Network’, it supplies an app to a range of businesses, educational institutions and, most controversially, law enforcement agencies. It has developed an algorithm capable of matching faces against a database of more than three billion images gathered from across the internet, including images sourced from social media accounts.
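For readers curious about the mechanics, facial recognition search of this general kind typically works by converting each face image into a numerical “embedding” and comparing a probe image against every stored embedding to find the closest matches. The short Python sketch below is a minimal, hypothetical illustration of that generic technique only; it is not Clearview’s actual, proprietary algorithm, and all names and data in it are invented for illustration.

import numpy as np

# Illustrative sketch of generic embedding-based face search.
# NOT Clearview's actual system; all data here is synthetic.

def cosine_similarity(query: np.ndarray, database: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query embedding and many stored embeddings."""
    query_norm = query / np.linalg.norm(query)
    db_norms = database / np.linalg.norm(database, axis=1, keepdims=True)
    return db_norms @ query_norm

def top_matches(query: np.ndarray, database: np.ndarray, k: int = 5):
    """Return the indices and similarity scores of the k closest stored faces."""
    scores = cosine_similarity(query, database)
    top_k = np.argsort(scores)[::-1][:k]
    return [(int(i), float(scores[i])) for i in top_k]

# Hypothetical data: 10,000 stored 512-dimensional face embeddings,
# standing in for a database of images scraped from the web.
rng = np.random.default_rng(0)
stored_embeddings = rng.normal(size=(10_000, 512))
probe_embedding = rng.normal(size=512)

print(top_matches(probe_embedding, stored_embeddings, k=3))

At production scale, a linear scan like this would be replaced by an approximate nearest-neighbour index, but the privacy issue is the same: every face in the scraped database is searchable.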
In January 2020 the New York Times published an article exposing Clearview’s activities, with a particular emphasis on the scale of the social media scraping used to build its database. The article also revealed that US law enforcement agencies had been using the Clearview app to assist with criminal prosecutions.
Clearview maintained that its algorithm was not intended for the general public and was only to be used by law enforcement agencies; however, its marketing material appeared to encourage users to search for friends, family and celebrities. Several civil rights organisations, tech businesses and social media platforms voiced concerns about the app’s facial recognition capabilities. Twitter sent a cease-and-desist letter requesting that Clearview delete all images collected from the platform, citing a contravention of its site policies, and was followed by other platforms including Google, YouTube, Facebook and Venmo.
The ICO’s decision
In July 2020 the ICO and the Office of the Australian Information Commissioner (OAIC) launched a joint investigation into Clearview. The investigation examined Clearview’s personal information handling practices, and specifically its use of data scraping and facial recognition biometrics. Although investigating together, each regulator reached its own conclusions. On 3 November 2021, the OAIC found Clearview to be in breach of Australian privacy laws.
In its 29 November announcement, the ICO specified that it intends to issue a £17 million fine for several reasons. The ICO noted that although Clearview no longer operates in the UK, various UK law enforcement agencies had previously been given a free trial of the service. Moreover, the Clearview database is likely to contain a substantial number of images of people in the UK. The ICO argues that Clearview failed to comply with UK data protection legislation by:
Failing to process information in a way that individuals are likely to expect or that is fair;
Not having procedures in place to prevent indefinite retention of data;
Having no lawful reason to collect the information;
Failing to meet the higher data protection standards required for biometric data as a special category of personal data;
Not informing UK-based individuals about what is happening to their data; and
Asking for unnecessary personal information from individuals who wished to object to data processing, which may have deterred them from doing so.
Next steps and implications
Clearview has the opportunity to respond to the alleged breaches before the ICO makes a final decision, which is expected by mid-2022. Although statements from the UK government indicate that the UK data protection framework may undergo considerable reform in the future, this decision highlights that the ICO takes its regulatory role seriously and is in a position to take action when data subject rights are at risk. Ultimately, with the use of AI becoming more widespread and accessible, regulatory intervention in this area is likely to continue.
If you have any data protection or technology queries about this subject, please contact David Varney in our Data Protection team.
This article was written by David Varney and Laura Craig