If your organisation hosts online forums or other communities where users can interact online, you may need to take action under Ofcom’s new codes on child safety. Charities advocating for online safety for children and women will also be interested in the implications of these codes, as well as Ofcom’s current consultation on protecting women and girls online.

Last month, the hit Netflix mini-series Adolescence sparked a national conversation about the dangers that social media, toxic influencers and the “manosphere” pose to children. While the Prime Minister warned that there was no “silver bullet response” to fix such issues, the Government has pointed to the Online Safety Act 2023 (the “OSA”), which aims to protect children from such harmful content. Ofcom has now published its final child safety measures under the OSA, The Protection of Children Codes and Guidance (the “Codes”). The measures within the Codes are intended to be a “reset for children” and bring about “a safer generation of children online”.

Viewers of Adolescence will be familiar with the online harms that Ofcom is aiming to address with these Codes: in the series, those harms boil over with fatal consequences. While Adolescence is not based on a true story, it highlights very real issues. Research carried out by Ofcom found that three in five teenage children reported encountering potentially harmful content online over a four-week period, and 30% of 13 to 17 year olds said that they encountered online harms while scrolling through their feed or via a ‘For You’ page. Nearly 70% of 11-14 year old boys have been exposed to content promoting misogyny and other harmful views, while Refuge found that as many as 62% of young women have been victims of online abuse.

Ofcom’s child safety measures

As detailed in our previous blog, platforms are already expected to have carried out a children’s access assessment. Now, those platforms which are likely to be accessed by children will have until 24 July 2025 to carry out their children’s risk assessments and then must implement the safety measures from 25 July 2025.

The Codes in their entirety are substantial, with over 900 pages of content for service providers to navigate. Which measures apply to a given provider depends on its size (by UK user base) and the level of risk it poses to children.

The measures which will apply to all user-to-user online services likely to be accessed by children include:

  • Designating an individual to be accountable for compliance with the child safety, reporting and complaints duties;
  • Having terms and conditions and statements regarding the protection of children, which include all the information mandated by the OSA;
  • Having a content moderation function to review and assess suspected content harmful to children, which allows for swift action to be taken;
  • Having transparent complaints processes that are easy to access and use; and
  • Taking appropriate action in response to complaints about content considered harmful to children.

Some of the additional measures which apply to large (with an average user base of more than 7 million monthly active UK users) and/or higher risk user-to-user services include:

  • Having an internal code of conduct for all staff working to protect children online;
  • Ensuring some staff are trained in the child safety, reporting and complaints duties;
  • Implementing age assurance to ensure that children cannot access primary priority content (defined as pornographic, suicide and self-harm, and eating disorder content) and are protected from accessing priority content (defined as abuse and hate, bullying, violent, harmful substances, dangerous stunts and challenges, depression, and body stigma content);
  • Enabling children to give negative feedback on content that is recommended to them, which must then be considered in how content is recommended to them; and
  • Providing children with the option to block and mute other users, turn off comments on their posts, and decline invitations to online groups.

These additional measures will depend on the level of risk on the service (as identified in the children’s risk assessment) and whether it meets other specific risk criteria (e.g. having relevant functionalities).

If online service providers fail to comply with the Codes, Ofcom has enforcement powers to impose significant fines (up to 10% of qualifying worldwide revenue or £18 million, whichever is greater) and, in the most serious cases, to apply for a court order to prevent the service from being available in the UK.

Do these measures go far enough?

Adolescence’s writer, Jack Thorne, personally advocates for a ban of social media and even smartphone use for all under 16s.

Of course, the OSA doesn’t go as far as to limit under 16s’ use of social media altogether – although the child safety duties are intended to ensure that children consume age-appropriate content online.

It is unsurprising then that some campaigners feel that Ofcom’s Codes will not be enough to protect children online. There are concerns that the Codes grant too much discretion to platforms to decide for themselves what content is harmful. Online safety campaigner Ian Russell, whose 14-year-old daughter Molly ended her life after engaging with harmful content online, has said he is “dismayed by the lack of ambition” in the Codes. In the context of trade negotiations with the US, there are also worries that OSA requirements could be watered down.

Ofcom insists that the Codes are “transformational” and that it intends to put in place further rules in due course. The NSPCC responded similarly, noting that the Codes are “an important stepping stone rather than the end solution”. In response to fears about pressure from Trump’s US, the Secretary of State for Science, Innovation and Technology has held firm, reiterating that US platforms “must adhere to British laws” if they operate in the UK.

Wider action from Ofcom

The Codes on child safety are just one of the measures Ofcom is taking to protect people online – our previous blog discusses some of these other measures. Ofcom is also currently consulting on its draft guidance, A safer life online for women and girls: Practical guidance for tech companies. The guidance proposes several actions aimed at tackling four key gender-based harms: online misogyny, harassment and pile-ons (where coordinated individuals target a specific woman, girl or group), online domestic abuse, and image-based sexual abuse.

Ofcom has stressed that it is keen to ensure that the guidance in relation to protecting women and girls online is ambitious. It is calling on tech companies, civil society, researchers and survivors to come to the table and discuss how stakeholders can work together to strengthen the proposals outlined in the guidance. The consultation is open until 5pm on 23 May 2025.

Whether the OSA will adequately push platforms to protect children and others, and what might come next if they don’t, remains to be seen.

If you would like to discuss the child safety measures set out under the Online Safety Act or Ofcom’s consultation on women and girls’ online safety, please contact Louise Sivey or Natasha Davies.

The material in this article is provided for guidance and general information only and is not intended to constitute legal or other professional advice upon which you should rely. In particular, the information should not be used as a substitute for a full and proper consultation with a suitably qualified professional. Please do contact the Bates Wells team if you require further information.