Evil Meta
- Quick Savant
- Mar 24
- 5 min read
The following was compiled to extend a summary of Sarah Wynn-Williams’s book Careless People.
- To gather more information about vulnerable teenagers and to sell them products, Meta designs algorithms that prey on teens concerned about their looks or weight, despite the heightened risk of suicide and other forms of harm.
- Meta sometimes obtains information about its users without their consent and makes it available to the highest bidder, as happened in the Cambridge Analytica scandal.
- Meta cooperates with authoritarian, human-rights-abusing regimes to gain access to countries where its platforms are restricted, for example by granting those regimes access to sensitive private data on millions of people.
- Meta allowed hate speech that fueled violence, as in Myanmar, where it helped fuel the genocide of the Rohingya Muslims by Buddhist extremists.
- Meta tolerates a culture of male dominance in which misogyny and sexual harassment of women are commonplace.
States Sue Meta
As PBS News reported on October 24, 2023, more than 40 U.S. states, including California and New York, have sued Meta Platforms Inc., alleging that the company’s social media platforms, Instagram and Facebook, are intentionally designed to addict children and harm their mental health. A federal lawsuit filed by 33 states in California accuses Meta of violating federal law by collecting data on children under 13 without parental consent, while nine additional attorneys general are pursuing separate lawsuits in their own states, bringing the total to 41 states plus Washington, D.C.
The states claim that Meta has exploited powerful technologies to hook young users for profit, misleading the public about the dangers of its platforms and concealing how they manipulate vulnerable teens and children. New York Attorney General Letitia James emphasized that Meta’s manipulative features fuel addiction and erode self-esteem, while California Attorney General Rob Bonta called for an end to the company’s harmful practices. The lawsuits seek financial damages, restitution, and a halt to what the states describe as Meta’s illegal practices.
Meta countered that it has implemented over 30 tools to ensure safe, positive online experiences for teens and expressed disappointment that the attorneys general opted for litigation rather than collaboration on industry-wide standards. The legal action follows 2021 reports from The Wall Street Journal and whistleblower Frances Haugen, whose leaked documents revealed Meta’s awareness of Instagram’s negative impact on teen mental health, including body image issues and suicidal thoughts.
With social media use nearly universal among U.S. teens—about a third of whom use it almost constantly, according to the Pew Research Center—the states argue that Meta knowingly flouts the Children’s Online Privacy Protection Act. While other platforms like TikTok and Snapchat face similar criticism, this lawsuit targets Meta specifically, with Washington D.C. Attorney General Brian Schwalb labeling it “the worst of the worst” for addicting teens. Amid growing concerns, the U.S. Surgeon General has urged immediate action to protect kids from social media’s harms, a call these lawsuits aim to enforce.
A growing number of families are taking legal action against Meta, the social media giant formerly known as Facebook, Inc., alleging that its platforms harm children and teens. Research has increasingly linked social media use to negative behaviors in young people, including eating disorders, loneliness, self-harm, and suicidal tendencies. Among these cases, the parents of a teenage girl from New York have filed a lawsuit claiming that their daughter’s compulsive Instagram use led to an eating disorder, self-harming behaviors, and suicidal thoughts. According to the suit, the girl created an Instagram account at age eleven—below Meta’s minimum age of 13—without her parents’ knowledge, and she now faces an ongoing struggle to maintain her recovery.
Meta, a tech titan that generates billions through advertising, has faced mounting scrutiny over its practices, particularly after former employee Frances Haugen testified before a U.S. Senate committee that the company knowingly pushes harmful content, such as anorexia-related material, to young users. The Facebook Files, a trove of internal documents reported by The Wall Street Journal, further revealed that Instagram worsened body image issues for one in three teen girls and that Meta staff discussed concealing how their apps promote anorexia and self-harm, yet the company took no meaningful action. These revelations have spurred investigations by multiple state attorneys general.
The lawsuits highlight a range of adverse effects tied to Meta’s platforms, including anxiety, depression, eating disorders, self-harm, suicide attempts, and suicidal ideation. Critics question the ethics of Meta’s algorithms targeting children, amplifying concerns about the company’s responsibility for the mental health crisis among youth.
Individual Legal Cases
Several legal cases have been filed by parents against Meta Platforms, Inc., alleging that the company’s social media platforms, particularly Instagram and Facebook, contributed to their children’s suicides through addictive design and harmful content. Below is a summary of notable cases based on available information:
Rodriguez v. Meta Platforms Inc. (January 2022)
Tammy Rodriguez, a mother from Connecticut, sued Meta and Snap Inc. following the suicide of her 11-year-old daughter, Selena, in July 2021. The lawsuit claims that Selena developed an extreme addiction to Instagram and Snapchat, leading to depression, sleep deprivation, and poor self-esteem. It alleges that the platforms’ defective design and lack of safeguards exposed Selena to sexual exploitation, exacerbating her mental health struggles. Filed in the U.S. District Court for the Northern District of California, the case asserts that Meta knowingly designed addictive features without warning parents or protecting young users.
Dawley v. Meta Platforms Inc. (April 2022)
Donna Dawley of Wisconsin filed a lawsuit against Meta and Snap after her son, CJ, died by suicide. The complaint alleges that CJ became addicted to Meta’s platforms, resulting in sleep deprivation and an obsession with body image. It claims Meta’s algorithms encouraged excessive use, contributing to a mental decline that culminated in his posting a cryptic message on Facebook before taking his life. The case, filed in the U.S. District Court for the Eastern District of Wisconsin, emphasizes the platforms’ failure to warn about known risks.
Unnamed California Case (December 2022)
Parents in California sued Meta after their teenage daughter took her life following exposure to a hanging video on one of its platforms. Reported by Fox News, the lawsuit contends that Meta’s failure to moderate harmful content and its addictive features directly contributed to the girl’s death. Specific details, such as the victim’s name and exact filing date, are less documented, but the case aligns with broader claims of negligence and inadequate safety measures.
Broader Context: Multi-Plaintiff Litigation
Beyond individual cases, Meta faces lawsuits from hundreds of parents and guardians as part of consolidated actions, such as “In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation” (Case No. 4:22-md-03047-YGR). These lawsuits, including some suicide-related claims, argue that Meta’s platforms caused addiction and mental health crises in teens, often citing internal documents from whistleblower Frances Haugen showing the company’s awareness of these harms. While not all specify suicide, the allegations overlap with individual cases.
These lawsuits commonly argue that Meta’s platforms use manipulative algorithms and fail to protect minors from harmful content, violating product liability principles and consumer protection laws. Outcomes vary: some cases remain ongoing, while others face challenges under Section 230 of the Communications Decency Act, which shields tech companies from liability for user-generated content. However, judicial rulings, like a 2024 decision allowing 34 states’ claims against Meta to proceed, suggest growing legal traction for claims focused on design flaws rather than content alone. Specific resolutions for the named cases are not fully detailed in public records as of March 24, 2025, indicating many are still in litigation.

https://amzn.to/3FO2crQ $16.99 Amazon ebook
https://amzn.to/4bBxQ85 $17.71 Amazon Audiobook
http://tiny.cc/2kyd001 $2.99 Summary Ebook
http://tiny.cc/9kyd001 $4.99 Summary audiobook
https://amzn.to/420Fsg1 $Varies The Great Gatsby