Ofcom has recently published illuminating research showing how children are being drawn into the worlds of social media and gaming at an ever younger age.
It shows that 65% of 5-7 year-olds go online to send messages or make voice/video calls, and 50% watch live-streamed content; both figures are up on a year ago.
Similarly, overall use of social media sites or apps among 5-7 year-olds has increased year-on-year, with WhatsApp, TikTok, Instagram and Discord seeing particular growth in this age group. Online gaming among 5-7 year-olds has also seen a significant annual increase, rising to 41% from 34%.
The research also suggests a disconnect between older children’s exposure to potentially harmful content online, and what they share with their parents about their online experiences. Now that the Online Safety Act is in force, Ofcom is:
- consulting in May on its draft Children’s Safety Code of Practice, which will set out the practical steps Ofcom expects tech firms to take to ensure children have safer experiences online; and
- planning an additional consultation later this year on how automated detection tools, including AI, can be used to mitigate the risk of illegal harms and content most harmful to children.
Alongside Ofcom's activities, the ICO has recently published its priorities for protecting children's privacy online, based on four main principles:
- Default privacy and geolocation settings: children's profiles must be private by default and geolocation settings must be turned off by default.
- Profiling children for targeted advertising: unless there is a compelling reason to use profiling for targeted advertising, it should be off by default.
- Using children’s information in recommender systems: these should be designed so that they do not lead children to harmful content, encourage children to spend longer on the platform than they otherwise would, or prompt them to share more personal information with the platform.
- Using the information of children under 13 years old: parental consent is required, so how services obtain that consent, and how they use age assurance technologies to assess a user's age and apply appropriate protections, are important for mitigating potential harms.
The regulators have emphasised their collaboration on protecting children, and it will be interesting to see how Ofcom's draft Code on children's safety compares with the ICO's Children's Code: the ICO's priorities are as relevant to online safety as they are to protecting children's personal information.