As the Online Safety Bill ("OSB") continues to make its way through Parliament, we examine two new powers that the Government has stated will come into force as part of the legislation. The Joint Select Committee appointed to consider the Government's draft OSB is still hearing evidence from various parties on the potential effects of the OSB and proposed amendments to its provisions, most recently from Facebook, Google and Twitter on the implications of self-regulating their user platforms under the proposed legislation. Discussions within the Joint Select Committee and in Parliament have highlighted two particularly significant new mechanisms for policing harmful online content within the OSB.

  • Two new offences are set to be created for individuals posting harmful online content

The Government has outlined its plan to introduce two new offences for individuals publishing harmful online content, with these set to be added to the OSB in the coming months. These new offences follow recommendations from the Law Commission to widen the scope of offences under the OSB, and individuals posting content with a harmful effect could now face a two-year prison sentence. Two specific offences are likely to be added to the OSB:

  • where the intention of a post is to create a ‘genuine sense of fear’ in the victim ‘that a threat will be carried out’; and
  • where a ‘knowingly false communication’ is posted with the intention of causing ‘emotional, psychological or physical harm to the likely audience’.

The exact form these offences will take, and how they will be received in Parliament, remains to be seen. However, this does represent a gradual shift away from the OSB's original purpose of imposing obligations on companies and platforms to regulate and monitor their content, and a move towards regulating individuals themselves in relation to the online content they produce.

  • New powers will be created for OFCOM to act as the ‘online safety regulator’

One of the Government's main aims behind the Bill was to equip OFCOM with the powers it requires to enforce the online harms legislation against social media platforms and to ensure those providers are held accountable for harmful third-party content posted on their platforms. The current intention is that OFCOM will have the power to:

  • block access to those platforms;
  • fine providers up to 10% of annual turnover or £18 million (whichever is higher); and
  • hold senior managers criminally liable for failures of a duty of care.

These powers would allow OFCOM to monitor the fulfilment of a wide range of new duties set to be imposed on social media platforms, such as a duty to carry out and maintain a risk assessment of any illegal content posted on their sites and to take steps to mitigate and manage any harm caused by such content. Whilst these proposed powers may be subject to change, they demonstrate the Government's clear intention to increase regulation of social media companies.

The primary intention of the OSB is clearly to increase the regulatory obligations on social media platforms to monitor the content posted on their sites. However, as the Bill moves towards its first legislative hurdle, debates centre on issues such as defining online ‘harm’ and the implications of the OSB for free speech online. As the OSB continues to be debated in Parliament, it is unclear to what extent these offences and powers will be diluted from their current form, with Big Tech companies citing the regulatory burden the OSB would place on platforms to monitor their content.

This article was written by Isaac Bedi and David Varney.