
The UK’s Online Safety Bill Is Back

You may have been wondering what became of the Online Safety Bill (once named the Online Harms Bill) and what impact, if any, the change in culture secretary would have on it. The answer came in Monday’s announcement: the Bill is alive and kicking and moving to the next stage of legislative scrutiny, albeit with some quite material amendments. The UK Government claims that the latest draft of the Bill goes further in safeguarding children, protecting free speech, and boosting the transparency and accountability of online services.

We’ve summarised the key (new) amendments below:

Key Changes

  • “Legal but harmful” no more

    - whilst online services will still need to remove content that is illegal (or prohibited by their terms of use), the Bill will no longer cover content that is legal but may be harmful to adults. This represents a U-turn on arguably one of the most controversial aspects of the Bill, and removing the uncertainty over what could be considered “harmful” will be welcomed by those subject to the rules.

  • Power to the people

    - online services must now give adults greater control and choice over the content they see and engage with, and offer them tools to help avoid harmful content. This new obligation is intended to replace the now-scrapped duties to remove lawful but harmful content. Many platforms already offer settings that let users control what they see, but this will make such tools more universal and potentially more standardised.

  • Protecting freedom of speech

    - new duties will explicitly prohibit online services from removing or restricting user-generated content, or suspending or banning user accounts, unless the content or conduct breaches their terms of use or the law. The rationale is to counter the risk of platforms ‘over-deleting’ content in order to avoid liability for leaving it up. Of course, the new position means there are now risks on both sides, making fast decisions on content removal even more complex. If online services do remove any content or ban a user, they will have to offer users an effective right of appeal.

  • Publishing child risk assessments

    - online services must publish more information about the risks their services pose to children. 

  • Age verification and restricting underage access

    - online services will be required to set out in their terms of use how they enforce their minimum age policies, to stop children circumventing authentication methods, and will have to publish details of any enforcement action Ofcom has taken against them.

  • New criminal offences

    - including ‘assisting or encouraging self-harm’ and ‘controlling or coercive behaviour’. These criminal offences will be listed as types of illegal content that platforms must take steps to prevent users from encountering.

Timeline

The Bill will return to the House of Commons on Monday 5 December. While the Bill is still intended to come into force by May 2023, more delays and twists could yet lie ahead. Of course, in the meantime, international platforms are already working hard on implementing the changes required by the EU’s Digital Services Act. For our latest comparison table between the two, see here.

Tags

entertainment & media