UK watchdogs press Meta, TikTok, Snap and YouTube to keep children off their platforms
Published by Global Banking & Finance Review®
Posted on March 12, 2026
2 min read · Last updated: March 12, 2026
UK regulators Ofcom and the ICO have demanded that Meta, TikTok, Snap, YouTube and others show by April 30 how they will implement stronger age checks and safety features to protect children, under the Online Safety Act's enforcement phase.
LONDON, March 12 (Reuters) - Britain's media and privacy regulators on Thursday demanded that major social media platforms do more to keep children off their services, warning that companies were failing to enforce their own minimum age rules.
Britain has been weighing tougher curbs on children's access to social media, with the government considering barring under 16s from such platforms - mirroring a move by Australia.
Ofcom and the Information Commissioner's Office said they had grown increasingly concerned about algorithmic feeds that expose children to harmful or addictive content.
"These online services are household names, but they're failing to put children's safety at the heart of their products," Melanie Dawes, Ofcom's chief executive, said.
"That must now change quickly, or Ofcom will act."
USE 'MODERN' TECH, COMPANIES TOLD
In the latest implementation phase of Britain's Online Safety Act, Ofcom told Facebook and Instagram - both owned by Meta - as well as Roblox, Snapchat, ByteDance's TikTok and Alphabet's YouTube to show by April 30 how they would tighten age checks, restrict strangers from contacting children, make feeds safer and stop testing new products on minors.
The ICO separately issued an open letter to the same platforms, calling on them to adopt "modern, viable" age-assurance tools to stop those under 13 accessing services not designed for them.
"There's now modern technology at your fingertips, so there is no excuse," Paul Arnold, ICO's chief executive, said.
Ofcom can fine companies up to 10% of their qualifying global revenue, while the ICO can issue fines of up to 4% of a company's global annual turnover.
The privacy watchdog last month fined Reddit nearly 14.5 million pounds for failing to introduce meaningful age checks and for processing children's data unlawfully.
($1 = 0.7439 pounds)
(Reporting by Sam Tabahriti; Editing by Paul Sandle and Tomasz Janowski)
