Musk's X must face part of lawsuit over child pornography video
Published by Global Banking & Finance Review®
Posted on August 1, 2025
3 min read · Last updated: January 22, 2026
A federal court revived a lawsuit against Musk's X for negligence in reporting a child exploitation video, challenging Section 230 protections.
By Jonathan Stempel
(Reuters) - A federal appeals court on Friday revived part of a lawsuit accusing Elon Musk's X of becoming a haven for child exploitation, though the court said the platform deserves broad immunity from claims over objectionable content.
While rejecting some claims, the 9th U.S. Circuit Court of Appeals in San Francisco said X, formerly Twitter, must face a claim it was negligent by failing to promptly report a video containing explicit images of two underage boys to the National Center for Missing and Exploited Children (NCMEC).
The case predated Musk's 2022 purchase of Twitter. A trial judge had dismissed the case in December 2023. X's lawyers did not immediately respond to requests for comment. Musk was not a defendant.
One plaintiff, John Doe 1, said he was 13 when he and a friend, John Doe 2, were lured on Snapchat into providing nude photos of themselves to someone John Doe 1 thought was a 16-year-old girl at his school.
The Snapchat user was actually a child pornography trafficker who blackmailed the plaintiffs into providing additional explicit photos. Those images were later compiled into a video that was posted on Twitter.
Twitter took nine days after learning about the content to take it down and report it to NCMEC, by which time the video had been viewed more than 167,000 times, court papers showed.
Circuit Judge Danielle Forrest said Section 230 of the federal Communications Decency Act, which protects online platforms from liability over user content, did not shield X from the negligence claim once it learned about the pornography.
"The facts alleged here, coupled with the statutory 'actual knowledge' requirement, separates the duty to report child pornography to NCMEC from Twitter's role as a publisher," she wrote for a three-judge panel.
X must also face a claim that its infrastructure made it too difficult to report child pornography.
The platform was found immune from claims that it knowingly benefited from sex trafficking and that it created search features that "amplify" child pornography posts.
Dani Pinter, a lawyer at the National Center on Sexual Exploitation, which represented the plaintiffs, said in a statement: "We look forward to discovery and ultimately trial against X to get justice and accountability."
The case is Doe 1 et al v Twitter Inc et al, 9th U.S. Circuit Court of Appeals, No. 24-177.
(Reporting by Jonathan Stempel in New York; Editing by Cynthia Osterman)
Child pornography refers to any visual depiction of sexually explicit conduct involving a minor. It is illegal and considered a serious crime in many jurisdictions.
Negligence is a failure to take proper care in doing something, which results in damage or injury to another person. In legal contexts, it often involves a breach of duty.
The Communications Decency Act is a U.S. law whose Section 230 provides immunity to online platforms from liability for user-generated content, with certain exceptions, particularly regarding child exploitation.
The National Center for Missing and Exploited Children is a nonprofit organization in the U.S. that provides resources and support to prevent child abduction and exploitation.