The Appleton Times

Truth. Honesty. Innovation.

Science

Is this social media's 'tobacco' moment?

By Michael Thompson

6 days ago


U.S. courts are seeing over 2,000 lawsuits against social media companies like Meta and TikTok, drawing parallels to 1990s tobacco trials by targeting addictive platform designs rather than user content. Key cases involve a young woman's mental health struggles, deaths from viral challenges, and teen sextortion suicides, with companies defending their safeguards amid calls for reform.


In a Los Angeles courtroom this week, Meta CEO Mark Zuckerberg faced pointed questions about the design of social media platforms, as a high-stakes trial unfolded accusing tech giants of contributing to mental health crises among young users. The case, centered on a 20-year-old Californian plaintiff identified only as KGM, marks one of the first major challenges to social media companies under a novel legal strategy that sidesteps traditional defenses. With more than 2,000 lawsuits pending in U.S. courts alleging harm from platforms like Instagram, TikTok, Snapchat, and YouTube, legal experts are drawing parallels to the tobacco industry's reckoning in the 1990s, when companies were held accountable for knowingly addictive and harmful products.

The ongoing trial in Los Angeles federal court focuses not on specific user-generated content, but on the platforms' addictive features, such as infinite scrolling and photo filters, which KGM's lawyers claim exacerbated her anxiety, depression, and body image issues. This approach aims to bypass Section 230 of the Communications Decency Act, a 1996 law that has long shielded tech companies from liability for what users post. By targeting product design, plaintiffs hope to establish precedent for holding companies responsible for foreseeable harms, much like the tobacco trials that resulted in billions in settlements and regulatory changes starting in the mid-1990s.

Snapchat and TikTok reached out-of-court settlements with KGM before the trial began, but Meta and Alphabet, YouTube's parent company, are vigorously defending their platforms. During his testimony on Wednesday, Zuckerberg emphasized Meta's mission-driven approach, stating, "The company's philosophy has always been to try to build useful services that people connect to." He denied that Meta set internal targets for user engagement time, adding, "If something is valuable, people will do it more because it's useful to them."

Zuckerberg also addressed efforts to protect young users, noting that Meta has worked to remove underage accounts. "Any implication to the contrary was not true," he said, though he conceded, "I don't think we identified every single person who tried to get around restrictions." Toward the end of his appearance, he turned to the families of affected teens in the courtroom and apologized, saying, "I'm sorry for everything you have all been through."

Instagram head Adam Mosseri, who testified last week, pushed back on claims of clinical addiction, describing excessive use as "problematic" rather than addictive in a medical sense. Meta, in a statement to reporters, framed the case around KGM's broader life challenges: "The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff's mental health struggles. The evidence will show she faced many significant, difficult challenges well before she ever used social media." If successful, KGM's case could set benchmarks for compensation in similar suits and potentially force redesigns to curb addictive elements.

Another closely watched case, filed in Delaware federal court, involves five British parents suing TikTok over the deaths of their children, who they allege were influenced by the platform's "blackout challenge"—a dangerous trend involving self-asphyxiation. The plaintiffs, including Lisa Kenevan, whose 13-year-old son Isaac died in 2022, argue that TikTok's algorithm promoted an endless stream of harmful content to impressionable young users. Kenevan, speaking to reporters last month, described her son as a cheerful boy with no prior mental health issues and asked, "How the hell do you, as a parent, get your head around that?"

The parents' lawsuit, still in early stages with an update expected before mid-April, does not target the videos themselves but the algorithm's role in amplifying risks. TikTok has expressed sympathy while defending its practices, stating, "Our deepest sympathies remain with these families. We strictly prohibit content that promotes or encourages dangerous behaviour." The company highlighted its enforcement, claiming, "Using robust detection systems and dedicated enforcement teams to proactively identify and remove this content, we remove 99% that's found to break these rules before it is reported to us." A victory here could compel TikTok to overhaul its recommendation system, particularly for minors.

In a third significant lawsuit, the family of 16-year-old Murray Dowey from Scotland is taking Meta to court in the U.S., alleging that sextortion on Instagram contributed to his suicide in 2023. Joined by the mother of 16-year-old Levi from the U.S., who died under similar circumstances, the case marks the first time a U.K. family has sued a social media company directly over such exploitation, shifting focus from perpetrators to platform vulnerabilities. Murray's family contends that Instagram's recommendation algorithms and data practices enabled scammers to target him, despite Meta's default restrictions for users under 16.

The suit challenges Meta's safeguards for older teens, whom the family says remain at risk. A Meta spokesperson outlined recent measures, saying, "Since 2021, we've placed teens under 16 into private accounts when they join Instagram in the U.S." The spokesperson added: "We work to prevent accounts showing suspicious behaviour from following teens and avoid recommending teens to them. We also take other precautionary steps, like blurring potentially sensitive images sent in DMs and reminding teens of the risks of sharing them, and letting people know when they're chatting to someone who may be in a different country."

These cases represent a bundled effort by lawyers who have consolidated thousands of individual claims into stronger test suits, aiming to streamline the litigation wave that has swelled since 2021. The sheer volume—over 2,000 active cases—signals growing scrutiny of Big Tech's role in youth mental health, with plaintiffs citing studies linking heavy social media use to increased rates of depression and anxiety among teens. In the 1990s tobacco litigation, Philip Morris and the industry's other major manufacturers ultimately agreed to the $206 billion Master Settlement Agreement in 1998, after internal documents revealed deliberate manipulation of nicotine levels to sustain addiction.

Social media defenders, including industry groups, argue that platforms foster connection and that personal responsibility plays a key role in usage. Yet, as these trials progress, the outcomes could echo the tobacco era by exposing internal research on engagement tactics. For instance, leaked documents in past suits have shown tech firms tracking teen behavior to boost retention, fueling accusations of prioritizing profits over safety.

Beyond the courtroom, the litigation wave is prompting voluntary changes. TikTok recently enhanced parental controls, while Meta expanded teen account limits in Europe under regulatory pressure. However, plaintiffs' attorneys warn that without judicial mandates, reforms may remain superficial. The KGM trial, expected to wrap arguments soon, could deliver the first jury verdict on these design-focused claims as early as this summer.

If any of these cases succeed, the ripple effects could extend globally, influencing regulations like the European Union's Digital Services Act, which imposes stricter content moderation on large platforms. In the U.S., lawmakers from both parties have introduced bills to amend Section 230, though progress has stalled amid free speech concerns. For families like the Kenevans and the Doweys, the suits offer not just potential compensation but a push for systemic accountability.

As Zuckerberg's testimony concluded, the packed courtroom reflected the emotional stakes: grieving parents, young survivors, and executives navigating a shifting legal landscape. With trials unfolding in California, Delaware, and potentially beyond, the question lingers whether social media's "tobacco moment" has arrived, forcing a reckoning on how digital products shape young lives.
