• Meta terminated its 2020 "Project Mercury" study after finding causal evidence of mental health harm
  • Internal documents show week-long Facebook deactivation reduced depression and anxiety levels
  • School districts nationwide file suit alleging company concealed known risks from public

MENLO PARK, CA (TDR) - Meta Platforms shut down internal research into the mental health effects of Facebook and Instagram after discovering causal evidence that its products harmed users, according to unredacted court filings in a nationwide class-action lawsuit filed by school districts against major social media companies.

The 2020 research project, code-named "Project Mercury," involved Meta scientists working with survey firm Nielsen to measure the impact of deactivating Facebook and Instagram. Internal documents revealed that people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.

Rather than publishing these findings or conducting additional research, Meta terminated the project and internally declared the negative results were tainted by the "existing media narrative" surrounding the company.

Court filings reveal internal contradictions

The revelations emerged from court documents filed Friday by Motley Rice, a South Carolina-based law firm representing school districts in litigation against Meta, Google, TikTok and Snapchat. The plaintiffs argue these companies have deliberately concealed internally recognized risks from users, parents and educators.

Despite the company's public dismissal of the findings, internal staff privately assured Nick Clegg, Meta's then-head of global public policy, that the research conclusions were valid. One unnamed researcher wrote that the Nielsen study demonstrated causal impact on social comparison.

"The Nielsen study does show causal impact on social comparison."

Another employee raised ethical concerns about suppressing the data, warning that withholding negative findings would be similar to the tobacco industry conducting research, knowing cigarettes were harmful but concealing that information.

Congressional testimony contradicted internal findings

Court filings claim Meta told Congress it had no ability to quantify whether its products were harmful to teenage girls, despite the company's own work documenting a causal link between its platforms and negative mental health effects.

The documents allege Meta intentionally designed youth safety features to be ineffective and rarely used, while blocking testing of safety features that might harm growth. Court filings also claim Meta required users to be caught 17 times attempting to traffic people for sex before removal from its platform.

CEO prioritized other initiatives over child safety

The filing reveals that in a 2021 text message, CEO Mark Zuckerberg stated he wouldn't say child safety was his top concern when he had other areas to focus on, like building the metaverse. Court documents indicate Zuckerberg repeatedly dismissed or ignored requests from Clegg to better fund child safety initiatives.

Meta recognized that optimizing its products to increase teen engagement resulted in serving them more harmful content, but proceeded anyway, according to the filings. The company also stalled internal efforts to prevent child predators from contacting minors for years due to growth concerns.

Company disputes allegations

Meta spokesman Andy Stone disputed the allegations, stating the Project Mercury study was halted due to flawed methodology. He emphasized the company's commitment to improving product safety over the years.

"The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens."

Stone dismissed the allegations as resting on cherry-picked quotes and misinformed opinions, adding that Meta's safety measures are broadly effective. He said the company's current policy is to remove accounts as soon as they are flagged for sex trafficking.

The underlying Meta documents cited in the filing remain sealed, and Meta has filed a motion to strike the documents from the public record. A hearing on the matter is scheduled for Jan. 26, 2026, in federal district court in Northern California.

Growing legal pressure on social media platforms

The lawsuit is part of a growing legal movement against social media platforms. More than 2,172 claims have been filed in the Adolescent Social Media Addiction multidistrict litigation, with 33 states filing lawsuits against Meta for its role in the youth mental health crisis.

School districts cite increased costs for mental health services and disruptions to education as students struggle with attention spans and emotional wellbeing. Arkansas districts including Springdale, Fayetteville, Pulaski County Special, Siloam Springs and Conway have joined the litigation.
