Ofcom Lays Down the Law with Child Safety Rules for Tech Giants
The UK’s digital and comms regulator has announced a new code of practice for tech firms to ensure they meet their child safety obligations under the Online Safety Act.
Published today, the Protection of Children Codes and Guidance was written after a lengthy consultation process that included interviews with tens of thousands of parents and children, as well as feedback from industry, civil society, charities and child safety experts, according to Ofcom.
It features 40 separate measures that website and app providers will need to implement in order to prevent children from seeing harmful content, better support those who have, and give kids greater control over their online experience.
The “safety first” approach demanded by the regulator will require:
- Recommender systems which filter out “harmful” content from children’s feeds
- More effective age checks, or else an age-appropriate experience for younger children
- Processes in place to review, assess and take down harmful content when made aware of it
- More control for kids to indicate what content they don’t like, accept/decline group chat invitations, block and mute accounts, and disable comments on their posts
- A straightforward mechanism for reporting/complaining about content, with terms of service children can understand
- Strong governance, including a named person accountable for children’s safety, and a senior body to annually review risks to children
Ofcom CEO, Melanie Dawes, described the new rules as a “reset” for children online.
“They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content,” she added.
“Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
Big Fines Could Be Coming
That enforcement could come in the form of fines of up to £18m or 10% of global revenue, whichever is greater. The regulator could also apply for a court order to block a site or app for UK users.
Ofcom cited research claiming that 59% of 13-17-year-olds have encountered harmful content over a four-week period, and that 30% of 8-12-year-olds have seen something online they found “worrying” or “nasty.”
Tech firms have long been in the crosshairs of regulators for the way they handle their younger users. In March, the Information Commissioner’s Office (ICO) launched an investigation into TikTok, Reddit and Imgur after expressing concerns over the way the sites use children’s personal information.
TikTok has been fined in the US over similar charges, and last year was hit with a new FTC civil lawsuit.
Following Ofcom’s announcement today, in-scope online service providers must complete and record a risk assessment of their services by July 24, 2025, and ensure the appropriate safety measures set out in the codes are in place by July 25, 2025.