The story
In 2021, a Facebook whistleblower leaked internal company research that confirmed what many parents already suspected: Instagram was making teen mental health worse, and Meta knew it. Internal slides showed that one in three teen girls said Instagram made their body image issues worse. But instead of fixing the problem, the company explored launching a version of Instagram for kids under 13.
It's not just Meta. TikTok's algorithm pushes users toward increasingly extreme content. Snapchat's streaks create anxiety about breaking them. YouTube's autoplay leads kids down rabbit holes. These features aren't bugs; they're designed to maximize time on the app, which means more ad revenue. The lawsuits argue these companies made a deliberate choice: addict children to their platforms and deal with the consequences later.
School districts are also suing, saying they're overwhelmed by the mental health fallout.