The Appleton Times

Truth. Honesty. Innovation.

Instagram and Facebook owner Meta ordered to pay £280m for knowingly harming children

By Jessica Williams

11 days ago

A New Mexico jury has ordered Meta to pay $375 million in damages for concealing child sexual exploitation on its platforms, marking a landmark victory for accountability in social media safety. The company plans to appeal, amid a surge of similar lawsuits from over 40 U.S. states alleging addictive designs harm youth mental health.

SANTA FE, New Mexico — In a groundbreaking verdict that could reshape the landscape of social media accountability, a jury in New Mexico has ordered Meta, the parent company of Facebook, Instagram, and WhatsApp, to pay $375 million (£280 million) in damages for knowingly harming children's mental health through its platforms. The decision, handed down after a seven-week trial in Santa Fe, marks the first time a U.S. jury has ruled against the tech giant on claims related to child sexual exploitation and the addictive design of its apps. Prosecutors argued that Meta concealed internal knowledge of these dangers while prioritizing profits, violating the state's Unfair Practices Act.

The case stemmed from a 2023 lawsuit filed by New Mexico Attorney General Raúl Torrez, who accused Meta of making false or misleading statements about the safety of its platforms and engaging in unconscionable trade practices that exploited children's vulnerabilities and inexperience. Central to the prosecution's evidence was an undercover investigation conducted by state agents, who created social media accounts posing as minors to document instances of sexual solicitation and Meta's inadequate responses. According to court records, these operations revealed a pattern in which harmful interactions persisted despite the company's awareness.

Jurors deliberated for several days before finding for the state on all counts, concluding that Meta's actions contributed to real-world harm for young users. "This is a watershed moment," said Sacha Haworth, executive director of the watchdog group The Tech Oversight Project, in a statement following the verdict. "Meta's house of cards is beginning to fall. For years, it's been glaringly obvious that Meta has failed to stop sexual predators from turning online interactions into real world harm."

Campaigners and affected families hailed the ruling as a rare victory in the ongoing battle against Big Tech. ParentsSOS, a coalition of families who have lost children to harms linked to social media, described the outcome as "a momentous milestone in the years-long fight to hold Big Tech accountable for the dangers their products pose to our kids." The group, representing parents who have endured the "unimaginable — the death of a child because of social media harms," emphasized that the verdict underscores the urgent need for stronger protections.

Meta, however, vehemently disputed the jury's findings and vowed to appeal. A company spokesperson stated, "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content." The spokesperson added, "We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online." During the trial, Meta's lawyers argued that the company had disclosed known risks to users and invested significantly in safety measures, even if some harmful content occasionally slipped through.

In closing arguments, Meta's attorney Kevin Huff told jurors, "Evidence shows not only that Meta invests in safety because it's the right thing to do but because it is good for business." Huff further contended that the platforms are "designed to help people connect with friends and family, not to try to connect predators." The company has maintained that while "problematic use" of social media exists, it does not endorse the concept of addiction and aims to ensure users feel positive about their time online.

This New Mexico trial is part of a broader wave of legal challenges facing Meta across the United States. More than 40 state attorneys general have filed similar lawsuits, alleging that features on Instagram and Facebook — such as infinite scrolling, notifications, and algorithmic recommendations — are deliberately engineered to be addictive, exacerbating a mental health crisis among young people. These suits claim that Meta's internal research, leaked in previous whistleblower cases like that of former employee Frances Haugen in 2021, showed executives were aware of the platforms' negative impacts on teens as early as 2019 but chose not to act decisively.

The verdict arrives amid heightened scrutiny of social media's role in youth well-being. Just weeks ago, jurors in a separate high-profile trial in Los Angeles began deliberating similar addiction claims against Meta and other tech firms, a case spotlighting how algorithms may contribute to anxiety, depression, and body image disorders. In the UK, regulators have also ramped up pressure, with recent announcements that would permit under-13s on WhatsApp only with parental consent, reflecting global concerns over age-appropriate access.

New Mexico's investigation, which began in late 2022, uncovered what prosecutors described as systemic failures. Agents posing as 14-year-olds on Instagram and Facebook reported receiving unsolicited sexual messages within hours of account creation, with Meta's moderation tools often failing to intervene promptly. One documented account received more than 100 such solicitations in a single week, and flagged content sometimes remained online for days or even weeks before removal. Torrez's office argued that these lapses were not isolated but part of a profit-driven model in which ad revenue from engaged young users outweighed safety investments.

Experts testifying for the state pointed to Meta's own research, which reportedly found that 32% of teen girls said Instagram made them feel worse when they already felt bad about their bodies. While Meta acknowledged these studies during the trial, it emphasized ongoing efforts like parental controls, age verification tools, and AI-driven detection systems rolled out in recent years. The company reported removing millions of pieces of child exploitation content annually, though critics argue these measures are reactive rather than preventive.

The financial implications of the ruling are significant, though the $375 million award is relatively modest compared to Meta's $134 billion in annual revenue. Still, it sets a precedent that could influence the dozens of pending cases. Legal analysts suggest that if upheld on appeal, the decision might encourage more states to pursue aggressive enforcement under consumer protection laws, potentially leading to stricter federal regulations.

Beyond the courtroom, the verdict has sparked discussions in policy circles. U.S. lawmakers, including members of the Senate Judiciary Committee, have cited similar concerns in pushing for bills like the Kids Online Safety Act, which aims to mandate safety standards for minors on tech platforms. Internationally, the European Union's Digital Services Act, effective since 2023, imposes hefty fines — up to 6% of global turnover — for failing to protect users from harmful content, a framework that could amplify pressure on companies like Meta.

For families like those in ParentsSOS, the ruling offers a glimmer of hope amid personal tragedies. One parent, speaking anonymously through the group, recounted how their 13-year-old daughter's exposure to toxic online interactions on Instagram contributed to severe mental health struggles, culminating in a suicide attempt. Such stories, woven into the trial testimony, humanized the abstract legal arguments and swayed jurors toward accountability.

As Meta prepares its appeal, expected to be filed with a higher New Mexico court within weeks, the company faces mounting public relations challenges. Its stock dipped slightly on the news, though analysts predict minimal long-term impact unless similar verdicts pile up. Meanwhile, advocacy groups are mobilizing for the next phase, urging Congress to intervene where states alone cannot.

The New Mexico case underscores a pivotal shift in how society views social media: no longer just a tool for connection, but a potential vector for harm that demands rigorous oversight. With trials ongoing and regulators worldwide on alert, the coming months could determine whether this verdict is an isolated win or the start of transformative change for the tech industry.
