In newly unsealed U.S. court documents, Meta Platforms stands accused of suppressing internal research that reportedly found a direct causal link between its social media platform and mental-health problems among young users. The filings, part of a large lawsuit by school districts against major tech firms, provide fresh insight into how Meta allegedly stalled study results and then misled regulators and the public.
The central claim concerns a 2020 initiative code-named “Project Mercury” in which Meta teamed with survey firm Nielsen Holdings to test what happened when users deactivated Facebook for a week. According to the filings, participants reported fewer feelings of depression, anxiety, loneliness and social comparison after stepping away from the platform. Rather than publish the findings or pursue additional research, Meta halted the project, according to the documents. Company records reportedly described the discovery as tainted by the “existing media narrative” around its products.
Internally, some researchers stood by the results and compared the company’s response to the tobacco industry’s handling of evidence. One wrote that the study “does show causal impact on social comparison.” Another expressed concern: “It would be akin to doing research and knowing cigarettes were bad and then keeping that info to themselves.”
Still, when testifying before Congress and in public documents, Meta allegedly told regulators it lacked data quantifying harm to teen users—despite apparently conducting research to the contrary. The plaintiffs’ filing argues the discrepancy reflects a pattern of burying known risks while prioritizing user growth.
These filings come as part of a broader class action brought by U.S. school districts. The suit accuses Meta, along with other tech firms, of knowingly concealing platform risks to teenagers—such as exposure to harmful content, encouragement of under-age use, and design decisions that favored engagement over safety.
Meta responded to the allegations by stating that the study was stopped because of flawed methodology and that it has implemented numerous safety features over the past decade. The company’s spokesperson, Andy Stone, said that Meta continues to invest in youth safety and product protections. The company has also filed motions to keep key documents sealed, arguing that disclosure could expose sensitive business information.
The legal battle is scheduled for a hearing on January 26, 2026, in Northern California. Industry analysts note that, should the claims succeed, Meta could face serious reputational damage, regulatory scrutiny, and potential financial penalties. Public trust may also suffer, as parents and educators demand greater transparency and stronger protections for young social-media users.
The case may set a precedent in holding digital-platform companies legally responsible for the mental-health impacts of their products, especially on younger users. As regulators globally tighten oversight of Big Tech, firms now face growing pressure to balance innovation, growth and user safety with greater accountability.