The UK government has published the long-awaited Online Safety Bill in draft for pre-legislative scrutiny. It follows the government’s Online Harms White Paper and its December 2020 response, and aims to deliver the Conservative party’s 2019 manifesto commitment “to make the UK the safest place in the world to be online while defending free expression”. The Bill is intended to protect young people and clamp down on racist abuse online, while safeguarding freedom of expression.
The key points in the draft Bill are:
Ofcom will be given the power to fine companies that fail to comply with the new duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher (so, for example, a company with £1 billion of global turnover could face a fine of up to £100 million), and will have the power to block access to websites.
A duty of care is proposed in line with the government’s response to the Online Harms White Paper. All companies in scope will have “a duty of care towards their users so that what is unacceptable offline will also be unacceptable online.”
They will need to consider the risks their sites may pose to the youngest and most vulnerable people and act to protect children from inappropriate content and harmful activity.
They will need to take “robust action” to tackle illegal abuse, including swift and effective action against hate crimes, harassment and threats directed at individuals, and to keep the promises they make to users about their standards.
The largest and most popular social media sites (which will be designated as Category 1 services) will also need to act on content that is lawful but still harmful, such as abuse falling below the threshold of a criminal offence, encouragement of self-harm, and misinformation and disinformation. Category 1 platforms will need to state explicitly in their terms and conditions how they will address these legal harms, and Ofcom will hold them to account.
The draft Bill contains reserved powers for Ofcom to pursue criminal action against named senior managers whose companies do not comply with Ofcom’s requests for information. These powers will be brought into force only if tech companies fail to meet their new responsibilities; a review will take place at least two years after the new regulatory regime is fully operational.
The final legislation, when introduced to the UK Parliament, will contain provisions requiring companies to report child sexual exploitation and abuse content identified on their services. This aims to ensure companies provide law enforcement with the high-quality information it needs to safeguard victims and investigate offenders.
Freedom of expression: the Bill aims to ensure people in the UK can “express themselves freely online and participate in pluralistic and robust debate”. All companies in scope will be required to consider and put in place safeguards for freedom of expression when fulfilling their duties. These safeguards will be set out by Ofcom in codes of practice and might include, for example, having human moderators take decisions in complex cases where context is important.
People using these services will need access to effective routes of appeal where content is removed without good reason, and companies must reinstate content that has been removed unfairly. Users will also be able to appeal to Ofcom, and these complaints will form an essential part of Ofcom’s horizon-scanning, research and enforcement activity.
Category 1 services will have additional duties. They will need to conduct and publish up-to-date assessments of their impact on freedom of expression and demonstrate that they have taken steps to mitigate any adverse effects. These measures aim to guard against the risk that online companies adopt overly restrictive measures or over-remove content in their efforts to meet their new online safety duties, for example AI moderation tools wrongly flagging innocuous content, such as satire, as harmful.
Category 1 services must protect content defined as ‘democratically important’. This will include content promoting or opposing government policy or a political party ahead of a vote in Parliament, election or referendum, or campaigning on a live political issue.
Content on news publishers’ websites is not in scope. This includes both their own articles and user comments on these articles. Category 1 companies will have a statutory duty to safeguard UK users’ access to journalistic content shared on their platforms, whether contributed by professional or citizen journalists.
There is already concern about the draft Bill, particularly in relation to: the notion of harm that is serious but falls short of a criminal offence; the exemptions for democratically important and journalistic activity; whether the Bill will require age verification; and the lack of action on online scams (which the government says it will deal with elsewhere). The finer detail of the Bill is likely to change during pre-legislative scrutiny and the legislative process, so companies should keep a watching brief on developments.
Although we are now using the internet more than ever, over three quarters of UK adults are concerned about going online, and the proportion of parents who feel the benefits of their children being online outweigh the risks fell from 65 per cent in 2015 to 55 per cent in 2019.