The Appleton Times

Truth. Honesty. Innovation.

Technology

Senators ask Meta why it waited so long to make teen accounts private by default

By Jessica Williams

1 day ago


A bipartisan group of senators has questioned Meta CEO Mark Zuckerberg on delays in implementing private-by-default accounts for teens, citing 2019 internal decisions that allegedly prioritized engagement over safety. The inquiry, stemming from unsealed court documents in a child safety lawsuit, demands details on research suppression and content moderation by March 6.

WASHINGTON — A bipartisan group of U.S. senators has demanded answers from Meta CEO Mark Zuckerberg about why the company waited years to implement privacy protections for teenage users on its social media platforms, citing newly unsealed court documents that allege internal decisions prioritized user engagement over safety.

The letter, sent in February and addressed directly to Zuckerberg, was signed by Sens. Brian Schatz, a Democrat from Hawaii; Katie Britt, a Republican from Alabama; Amy Klobuchar, a Democrat from Minnesota; James Lankford, a Republican from Oklahoma; and Christopher Coons, a Democrat from Delaware. It raises serious questions about Meta's handling of risks to young users, particularly in light of a massive multidistrict lawsuit over child safety on social media platforms.

According to the senators, Meta considered making all teen accounts private by default as early as 2019 but shelved the idea after internal analysis showed it would "likely smash engagement." The company did not roll out the feature on Instagram until September 2024, with extensions to Facebook and Messenger following. The delay, the lawmakers argue, may have exposed millions of minors to heightened risks of harm, including exposure to inappropriate content and predators.

The unredacted court filing, part of a lawsuit brought by dozens of states against Meta and other tech giants, includes testimony and documents suggesting the company was aware of these dangers but chose to downplay them. "We are deeply concerned by allegations that Meta was not only aware of these risks, but may have delayed product design changes or prevented public disclosure of these findings," the senators wrote in their letter, emphasizing the potential prioritization of profits over user well-being.

Meta has faced mounting scrutiny over its platforms' impact on youth mental health and safety. In recent years, internal leaks and whistleblower accounts, such as those from former employee Frances Haugen in 2021, have painted a picture of a company grappling with but often ignoring research showing Instagram's negative effects on teens, particularly girls. The senators' inquiry builds on this history, probing whether Meta halted studies that produced unfavorable results.

In the letter, the lawmakers asked Zuckerberg to detail which internal teams were involved in the 2019 decision to forgo private teen accounts and why the feature was ultimately delayed until 2024. They also sought clarification on broader accusations in the court documents, including claims that Meta suppressed research into user well-being when it conflicted with business goals.

One particularly alarming revelation in the filing involves Meta's policies on child sexual abuse material (CSAM) and content related to sex trafficking. According to testimony from Antigone Davis, Meta's former head of global safety and well-being, the company would only suspend accounts after accumulating 17 violations for "prostitution and solicitation." The senators expressed outrage over this threshold, questioning whether it adequately protects vulnerable users and demanding a full accounting of Meta's content moderation practices.

The lawsuit from which these documents emerged, filed in federal courts across the country, accuses Meta, along with ByteDance (TikTok's parent), Snap and Alphabet (Google's parent), of designing addictive features that harm children and failing to implement sufficient safeguards. Filed initially in 2023, the case has seen waves of discovery, with the unredacted filings surfacing late last year amid ongoing battles over evidence.

Meta has not yet publicly responded to the senators' letter, but in past statements, the company has defended its safety measures. A spokesperson for Meta told reporters in September 2024, when the teen privacy features launched, that the updates represented "the strongest protections we’ve ever built for young people on Instagram," including default private accounts and restricted messaging for users under 16. However, critics, including child safety advocates, argue these changes came far too late.

The bipartisan nature of the letter underscores the issue's cross-party appeal. Sen. Schatz, who chairs a Senate subcommittee on communications, has long advocated for stronger online protections for children, co-sponsoring bills like the Kids Online Safety Act. Meanwhile, Sen. Britt, a member of the Senate Appropriations Committee, has focused on combating online exploitation, drawing from her background as a former federal prosecutor.

Sens. Klobuchar and Coons, both Democrats with records on tech regulation, have previously grilled Zuckerberg in congressional hearings, including a 2018 session on data privacy. Sen. Lankford, a Republican known for his work on cybersecurity and human trafficking, adds a conservative voice to the chorus calling for accountability from Big Tech.

Broader context reveals a regulatory landscape shifting toward greater oversight of social media. The Federal Trade Commission has an ongoing case against Meta over antitrust issues, while the European Union enforces the Digital Services Act, which mandates risk assessments for minors. In the U.S., states like California and New York have passed laws requiring age verification and parental controls, but federal action remains stalled in Congress.

The senators have given Meta until March 6 to provide detailed responses, including documents and internal communications. Failure to comply could lead to subpoenas or hearings, potentially escalating the matter into a full congressional investigation. As one Senate aide, speaking anonymously, noted, "This isn't just about one feature; it's about a pattern of behavior that puts kids at risk for the sake of ad revenue."

Child safety organizations have welcomed the letter as a step forward. Emma Llansó, director of the Free Expression Project at the Center for Democracy & Technology, said in a statement, "Transparency from companies like Meta is crucial to understanding how design choices affect young users, and lawmakers are right to demand it." On the other side, tech industry groups like NetChoice have cautioned against overregulation, arguing that private companies should handle safety innovations without government mandates.

Looking ahead, the inquiry could influence pending legislation and shape Meta's approach to future product rollouts. With teen social media use at all-time highs—over 95% of U.S. teens ages 13-17 report using platforms like Instagram and Facebook, per a 2023 Pew Research Center survey—the stakes are high. As platforms evolve with AI-driven features, the balance between engagement and protection remains a flashpoint in the national conversation on digital safety.

For families in communities like Appleton, Wisconsin, where local reports have highlighted rising cyberbullying and online predation cases among youth, these developments hit close to home. School districts nationwide have reported increased incidents tied to social media, prompting calls for both corporate responsibility and community education. As the March deadline approaches, all eyes will be on Meta's response and whether it addresses the senators' concerns head-on.
