Ofcom has launched its second major consultation under the Online Safety Act 2023, which includes draft codes of practice for user-to-user and search services to follow when seeking to protect children under the Act.
The Act places duties on platforms – including social media, search, and pornography services – to protect users in the UK by assessing risks of harm and taking steps to address them.
Ofcom has already consulted on the wider duties under the Act and now turns its attention to protecting children.
What online services must do to protect children
Companies must assess whether children are likely to access their service – or part of it. This involves completing “children’s access assessments”. Ofcom has published draft Children’s Access Assessments Guidance aimed at helping service providers comply. It recognises that platforms will already have carried out risk assessments under the ICO’s Children's Code but says that they will need to carry out these new assessments as well, although they may rely on some of the same research. Last week, Ofcom and the ICO published a joint statement on their collaboration regarding data protection and online safety compliance.
If a platform assesses that it is likely to be accessed by children, it must complete a children’s risk assessment to identify any risks its service(s) may pose to children. This is in addition to the illegal content risk assessments that all services must complete. To facilitate this, Ofcom has published draft Children’s Risk Assessment Guidance, which explains how services should carry out the assessment and evaluate the risks to children. Services must prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography. They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, bullying content, and content promoting dangerous challenges.
A measure of safety
Services must also implement safety measures to mitigate the risks to children. Ofcom is proposing more than 40 safety measures in its draft codes. These fall within the following areas:
- Robust age checks. Ofcom expects much greater use of age assurance, so services know which of their users are children. All services which do not ban harmful content, and those at higher risk of it being shared on their service, should implement highly effective age checks to prevent children from seeing it. “Highly effective” means measures such as facial age recognition, credit card checks and photo ID matching. Ofcom has said that relying on self-declaration, contract terms or debit cards (as you don’t have to be over 18 to have one) is unlikely to be acceptable. An illustrative sketch of this kind of gating appears after this list.
- Safer algorithms. Ofcom says that recommender systems (algorithms which provide personalised recommendations to users) are children’s main pathway to harm online. Under its proposals, any service which operates a recommender system and is at higher risk of harmful content should identify which of its users are children and configure its algorithms to filter out the most harmful content from children’s feeds and reduce the visibility of other harmful content; a sketch of what such a configuration might look like also follows this list.
- Effective moderation. Ofcom also says that all user-to-user services should have content moderation systems and processes that ensure swift action is taken against content harmful to children. Search services should have appropriate moderation systems too, and large search services should apply a “safe search” setting to users they believe to be children, which those users should not be able to turn off and which filters out the most harmful content.
- Strong governance and accountability. Ofcom proposes that platforms should have a named person accountable for compliance with the children’s safety duties; an annual review by a senior governance body of all risk management activities relating to children’s safety; and an employee code of conduct that sets standards for staff around protecting children.
- More choice and support for children. This includes ensuring clear and accessible information for children and carers, with easy-to-use reporting and complaints processes, and giving children tools and support to help them stay safe, such as being able to block other users or turn off comments on their posts.
- Platforms should keep assessments and safety measures under review. Services that are not ‘likely to be accessed by children’ need to carry out children’s access assessments annually, and before any major changes to their services. In addition, services need to keep their children’s risk assessments up to date. Ofcom also suggests that service providers monitor the effectiveness of the safety measures they take or implement, and continually improve them over time.
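To make the age assurance point more concrete, here is a minimal sketch, in Python, of how a service might gate the most harmful content behind a highly effective age check. The AgeCheck labels, the may_show_harmful_content helper and the simple over-18 flag are illustrative assumptions for this article, not anything prescribed by Ofcom’s draft codes.

```python
from enum import Enum, auto

class AgeCheck(Enum):
    # Methods the consultation treats as potentially "highly effective"
    FACIAL_AGE_RECOGNITION = auto()
    PHOTO_ID_MATCHING = auto()
    CREDIT_CARD = auto()
    # Methods Ofcom says are unlikely to be acceptable on their own
    SELF_DECLARATION = auto()
    CONTRACT_TERMS = auto()
    DEBIT_CARD = auto()  # can be held by under-18s

HIGHLY_EFFECTIVE = {
    AgeCheck.FACIAL_AGE_RECOGNITION,
    AgeCheck.PHOTO_ID_MATCHING,
    AgeCheck.CREDIT_CARD,
}

def may_show_harmful_content(check_used: AgeCheck, confirmed_over_18: bool) -> bool:
    """Gate the most harmful content behind a highly effective age check.

    Only a user confirmed as an adult by an acceptable method gets through;
    everyone else is treated as a child by default.
    """
    return check_used in HIGHLY_EFFECTIVE and confirmed_over_18

# A self-declared "adult" is still treated as a child.
assert may_show_harmful_content(AgeCheck.SELF_DECLARATION, True) is False
assert may_show_harmful_content(AgeCheck.PHOTO_ID_MATCHING, True) is True
```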
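Similarly, the following hedged sketch shows one way a recommender system could be configured for identified child users: content in the most harmful categories is removed from the feed entirely, and other harmful content is down-ranked. The Item structure, the harm labels and the DOWNRANK factor are assumptions made for illustration; the draft codes describe outcomes rather than a specific implementation.

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    score: float   # relevance score produced by the recommender
    harm: str      # "primary" (e.g. suicide, self-harm, eating-disorder or
                   # pornographic content), "priority" (e.g. violent or abusive
                   # material) or "none" - labels assumed for illustration

DOWNRANK = 0.2     # illustrative visibility reduction; no figure is prescribed

def build_feed(candidates: list[Item], is_child: bool) -> list[Item]:
    """Return a ranked feed; for identified child users, filter out the most
    harmful content and reduce the visibility of other harmful content."""
    if not is_child:
        return sorted(candidates, key=lambda item: item.score, reverse=True)
    reranked = []
    for item in candidates:
        if item.harm == "primary":
            continue                                   # excluded from the feed
        score = item.score * DOWNRANK if item.harm == "priority" else item.score
        reranked.append((score, item))
    reranked.sort(key=lambda pair: pair[0], reverse=True)
    return [item for _, item in reranked]
```

Calling build_feed(items, is_child=True) would drop anything labelled "primary" before ranking, while "priority" items are pushed down the feed rather than removed.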
Some platforms have already made changes in anticipation of the Online Safety Act coming into force.
The consultation and guidance include several very long documents, so there is a lot for tech companies to digest here, especially as the EU’s Digital Services Act has also come into force with similar, but different, requirements.