Meta fixes error that flooded Instagram Reels with violent videos
Published by Global Banking & Finance Review®
Posted on February 27, 2025
2 min read | Last updated: January 25, 2026
Meta resolved an error that caused violent videos to appear in Instagram Reels, affecting users globally. The issue raises concerns about Meta's content moderation policies.
(Reuters) - Meta Platforms said on Thursday it had resolved an error that flooded the personal Reels feeds of Instagram users with violent and graphic videos worldwide.
It was not immediately clear how many people were affected by the glitch. Meta's comments followed a wave of complaints on social media about violent and "not safe for work" content in Reels feeds, despite some users having enabled the "sensitive content control" setting meant to filter such material.
"We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended. We apologize for the mistake," a spokesperson for Meta said.
Meta did not disclose the reason behind the error.
Meta's moderation policies have come under scrutiny after it decided last month to scrap its U.S. fact-checking program on Facebook, Instagram and Threads, three of the world's biggest social media platforms with more than 3 billion users globally.
Violent and graphic videos are prohibited under Meta's policy, and the company usually removes such content to protect users, with exceptions for videos that raise awareness of topics including human rights abuses and conflict.
The company has in recent years been leaning more on its automated moderation tools, a tactic that is expected to accelerate with the shift away from fact-checking in the United States.
Meta has faced criticism for failing to effectively balance content recommendations and user safety, as seen in incidents such as the spread of violent content during the Myanmar genocide, the promotion of eating disorder content to teens on Instagram, and misinformation during the COVID-19 pandemic.
(Reporting by Surbhi Misra and Akash Sriram in Bengaluru; Editing by Saumyadeb Chakrabarty)