On 21 May 2024, the ICO announced that it had concluded its investigation into Snap Inc's ('Snap') launch of the 'My AI' chatbot.
It determined that Snap had taken adequate steps to ensure that 'My AI' complied with applicable data protection law.
Background
The ICO's preliminary investigation found that Snap had not adequately assessed the data protection risks posed by its 'My AI' chatbot prior to launch. The ICO was particularly concerned about users of the chatbot aged 13 to 17, who were flagged as especially vulnerable to the risks arising from the chatbot's processing of their personal data.
Consequently, the ICO issued Snap with a preliminary enforcement notice in October 2023. We reported on this in more depth here.
ICO Decision
In response to the preliminary enforcement notice, Snap took steps to review the risks associated with ‘My AI’ and implement appropriate mitigations. As a result, the ICO has stated that it is satisfied that Snap has undertaken a risk assessment relating to ‘My AI’ that is compliant with data protection law.
The final Commissioner’s decision in this case is due to be published in the upcoming weeks.
The message: a warning shot for industry
The ICO has been clear that its decision does not reflect a loosening of data protection standards; if anything, it indicates the ICO’s commitment to monitoring the development and deployment of generative AI models. Industry must continue to engage with the data protection risks of generative AI, with risk assessments being a crucial part of preparing AI products before bringing them to market.
The ICO will continue to monitor generative AI applications such as ‘My AI’ to ensure that they remain compliant with data protection standards, and has released a series of consultations regarding data protection and AI.
This messaging is reflected in the statement of Stephen Almond, ICO Executive Director of Regulatory Risk:
“Our investigation into ‘My AI’ should act as a warning shot for industry. Organisations developing or using generative AI must consider data protection from the outset, including rigorously assessing and mitigating risks to people’s rights and freedoms before bringing products to market. We will continue to monitor organisations’ risk assessments and use the full range of our enforcement powers – including fines – to protect the public from harm.”
Social media and video sharing platforms should be aware that children's privacy is high on the list of ICO priorities for 2024-2025 (outlined here). Since the introduction of the Children's Code in 2021, the ICO has been clear that it considers privacy protections for children paramount to the operation of online services. Accordingly, online services likely to be accessed by children, such as Snapchat, should reflect on whether their privacy protections are adequate; the ICO has identified default public profiles, targeted advertising, and age assurance technologies as areas of particular scrutiny.
If you have any questions or would otherwise like to discuss any issues raised in this article, please contact Lucy Pegler, David Varney, or a member of the Data Protection Team.
This article was drafted by Victoria McCarron and Liz Smith.