UK to impose 48-hour online takedown rule for nonconsensual intimate images
Published by Global Banking & Finance Review®
Posted on February 19, 2026
2 min read · Last updated: February 19, 2026
The UK will require platforms to remove nonconsensual intimate images within 48 hours or face fines of up to 10% of global revenue, or blocking. Ofcom will fast‑track hash‑matching rules, with a decision due in May. ([gov.uk](https://www.gov.uk/government/news/tech-firms-will-have-to-take-down-abusive-images-within-48-hours-under-new-law-to-protect-women-and-girls?utm_source=openai))
LONDON, Feb 18 (Reuters) - Britain will require technology companies to take down intimate images shared online without consent within 48 hours or else face fines of up to 10% of eligible global revenue, and even risk having their services blocked.
The government says the steps will improve safeguards for women and girls amid a global push to curb abuse in a world where images sent privately can be easily shared online and AI-based tools can instantly create sexually explicit images.
Britain said on Thursday it would amend legislation passing through parliament to create a legal duty for major platforms to take down nonconsensual intimate images no more than two days after they are reported.
It is already illegal in Britain to post such images online, but some victims have reported difficulty getting platforms to permanently remove them.
"The online world is the frontline of the 21st century battle against violence against women and girls," Prime Minister Keir Starmer said in a statement.
NONCONSENSUAL IMAGES FUELLING ONLINE SAFETY DEBATE
A surge in nonconsensual images has fed into Britain's wider debate over online safety. Ministers are examining whether to restrict social media access for under 16s, echoing Australia's ban.
Britain said its media regulator Ofcom was considering treating the sharing of illegal intimate images with the same severity as child sexual abuse and terrorist content.
The government said victims would only need to report material once, with platforms expected to remove the same image across services and prevent re-uploads.
Any fines for failing to do so could be applied to a platform's 'Qualifying Worldwide Revenue' - a measure used by Ofcom which covers income generated anywhere in the world from the parts of the service it regulates.
In a separate statement, Ofcom said it would fast-track a decision on new rules requiring platforms to use "hash-matching" tools to block illegal intimate images at source. The decision would come in May, and new measures could come into effect this summer.
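Hash-matching works by comparing a fingerprint of each upload against a shared list of fingerprints of known illegal images, which is how a single report can block re-uploads across a service. A minimal exact-match sketch in Python (all names and logic here are illustrative, not Ofcom's specification; production systems typically use perceptual hashes such as PhotoDNA or PDQ that survive resizing and re-encoding, whereas a cryptographic hash only matches byte-identical files):

```python
import hashlib


def image_hash(data: bytes) -> str:
    """Return a hex digest identifying this exact file's bytes."""
    return hashlib.sha256(data).hexdigest()


class UploadFilter:
    """Hypothetical filter that blocks uploads whose hash is on a blocklist."""

    def __init__(self) -> None:
        self.blocklist: set[str] = set()

    def report(self, data: bytes) -> None:
        # A single victim report adds the hash, so the same image can be
        # blocked on future uploads without being reported again.
        self.blocklist.add(image_hash(data))

    def allow_upload(self, data: bytes) -> bool:
        # Reject at source if the fingerprint matches a reported image.
        return image_hash(data) not in self.blocklist


f = UploadFilter()
reported = b"...reported image bytes..."
f.report(reported)
print(f.allow_upload(reported))        # blocked on re-upload
print(f.allow_upload(b"other image"))  # unknown image passes
```

Because SHA-256 changes completely if even one byte differs, this sketch would miss a cropped or re-compressed copy; that gap is exactly why real deployments favour perceptual hashing.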
(Reporting by Sam Tabahriti; editing by William James)