EU fails to extend rules on child abuse content detection by online platforms
Published by Global Banking & Finance Review®
Posted on March 16, 2026
2 min read
Last updated: March 16, 2026
EU member states and the European Parliament failed on March 16, 2026 to agree on extending the temporary derogation allowing voluntary detection of child sexual abuse material (CSAM) by online platforms, creating a legal gap after the current rules expire on April 3, 2026.
By Foo Yun Chee
BRUSSELS, March 16 (Reuters) - EU countries and lawmakers on Monday failed to agree on an extension of a temporary measure governing how Alphabet's (GOOGL.O) Google, Meta Platforms and other online platforms tackle child sexual abuse material, leaving a legal vacuum on the issue.
The current system of voluntary detection and removal of online child sexual abuse material by companies, which exempts them from strict online privacy rules, has been in place since 2021 and will expire on April 3.
"Regrettably the European Parliament insisted on amending the scope of the interim measure in a way that, in the view of the vast majority of member states, would have made this measure ineffective," a spokesperson for Cyprus, which currently holds the rotating EU presidency, said.
"Today's development creates a vacuum."
Lawmakers last week insisted that the temporary rules should not apply to end-to-end encrypted communications, among other proposed changes.
Europe resorted to a temporary measure after failing to agree on legislation on the issue, which pits advocates of online safety measures against privacy activists worried about surveillance.
The European Commission's draft regulation on child sexual abuse material (CSAM) has been stuck in a quagmire since it was drawn up in 2022, with both sides criticising key elements.
Big Tech has lobbied against any requirement that would force messaging services, app stores and internet access providers to report and remove known and new images and videos, as well as cases of grooming.
(Reporting by Foo Yun Chee; Editing by Hugh Lawson)
Since 2021, voluntary detection and removal of child sexual abuse material by online companies has been allowed, exempting them from strict online privacy rules.
EU lawmakers and member states could not agree on the scope of temporary measures, especially regarding encrypted communications and effectiveness concerns.
The current system will expire on April 3, creating a legal vacuum for online platforms regarding child abuse material detection.
The key issue is balancing online safety with privacy rights, with Big Tech and privacy activists disagreeing over requirements for reporting and content removal.
Big Tech opposes forced reporting and removal, while privacy activists are concerned about surveillance, especially regarding encrypted messages.