Newly unsealed court filings paint a troubling picture inside Meta — and let’s be honest, nobody is shocked that the same company preaching about “connecting the world” might have conveniently tucked away findings showing its platforms hurt people. Because when Silicon Valley giants insist they’re “here to help,” we all know to keep one eyebrow permanently raised.
According to internal documents revealed in a nationwide lawsuit brought by U.S. school districts, Meta quietly shut down a 2020 research project called Project Mercury after it produced causal evidence — yes, their word, not mine — that Facebook and Instagram were linked to increased depression, anxiety, loneliness, and toxic social comparison. Participants who deactivated their accounts for just one week reported feeling better. Imagine that.
Rather than continue the research or share the results, Meta allegedly dismissed the study as tainted by the “existing media narrative.” Convenient timing, considering the findings would have added fuel to a fire the company was desperate to put out. Meanwhile, staff privately assured top executive Nick Clegg that the research was valid. One internal researcher even compared Meta’s behavior to Big Tobacco hiding evidence about cigarettes — and when your own employees are making that comparison, you’ve got a real problem.
Publicly, however, Meta maintained it had no way to determine whether its platforms harmed teens — especially teen girls. The filings allege that the company told Congress exactly that, all while sitting on evidence suggesting the opposite.
Meta spokesman Andy Stone responded by insisting the research was flawed and proclaiming that Meta has spent “over a decade” listening to parents and improving safety. Sure — and I listen to gym ads every January; doesn’t mean I actually go.
The lawsuit, filed by hundreds of school districts, accuses Meta, along with TikTok, Google, and Snapchat, of intentionally hiding the risks their platforms pose to children. Allegations include encouraging underage use, failing to tackle child sexual exploitation, and even trying to pay child-focused organizations to publicly defend them. TikTok, for example, allegedly bragged internally that a sponsorship deal with the National PTA meant the group would “do whatever we want going forward.” Sounds about right.
But Meta gets the most heat, with internal documents claiming the company:
- Designed youth safety features to be ineffective and blocked testing that might slow growth.
- Allowed users to attempt sex-trafficking behavior 17 times before removing them — a threshold employees themselves called “very, very, very high.”
- Chose to serve teens more harmful content because it increased engagement.
- Delayed efforts to stop predators from contacting minors due to growth concerns.
- Had leadership — including Mark Zuckerberg — downplay child safety concerns while prioritizing the metaverse.
Stone denied all of it, saying the accusations rely on “cherry-picked quotes.” Of course, when quotes are this specific, that’s some pretty impressive cherry-picking.
The internal documents aren’t yet public, and Meta is fighting to keep them sealed. A hearing is scheduled for January 26 in federal court.
As more sunlight hits Big Tech’s inner workings, one thing becomes clear: parents, teachers, and communities have every right to demand transparency. And while Meta may try to hide behind PR lines and polished statements, accountability has a funny way of catching up. Even in cases like this, there’s hope that the truth comes out — and that real protections for kids finally take priority over corporate growth charts.