WASHINGTON — As Section 230 of the Communications Decency Act marks its 30th anniversary on Thursday, the landmark law that laid the foundation for the modern internet is confronting what may be its most significant challenges yet. Signed into law by President Bill Clinton on February 8, 1996, as part of the broader Telecommunications Act, Section 230 has shielded online platforms from liability for user-generated content, fostering the growth of social media and countless digital services. But today, with a wave of lawsuits and legislative proposals threatening to erode its protections, the statute's future hangs in the balance.
The core of Section 230 consists of just 26 words: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This provision, often called “the twenty-six words that created the internet,” allows platforms to host third-party content without being sued as if they authored it. A companion “Good Samaritan” clause permits companies to moderate harmful material in good faith, such as blocking obscene or harassing posts, without fear of civil lawsuits. Notably, the law offers no protection against criminal charges.
Over three decades, Section 230 has weathered the dot-com bubble's burst in the early 2000s and a 1997 Supreme Court ruling in Reno v. ACLU that invalidated much of the surrounding Communications Decency Act for violating the First Amendment. Yet its resilience is now tested by a bipartisan push in Congress to overhaul or eliminate it. Senators Dick Durbin (D-IL) and Lindsey Graham (R-SC) have introduced a bill to sunset Section 230 in two years, aiming to compel tech companies and lawmakers to negotiate reforms addressing online harms like child exploitation and addiction.
At a press conference last week in Washington, D.C., former House Minority Leader Dick Gephardt (D-MO), who voted for the law in 1996, joined actor Joseph Gordon-Levitt and parents bereaved by online tragedies to rally support for the sunset proposal. Gephardt reflected on his vote, saying, “As minority leader in the House in 1996, I voted for it because social media platforms told us that without that protection, America would never have an internet economy. They also said that the platforms were just a dumb pipe that just carried content produced by others.” He added that lawmakers back then had no inkling of algorithms’ power to captivate users, calling them tools that could “brainwash” people. Armed with new knowledge about the technology, Gephardt said, it is time to “correct the action that I and many others made 30 years ago.”
The event highlighted personal stories of loss linked to social media. Kristin Bride, whose 16-year-old son Carson died by suicide in 2020 after cyberbullying on the anonymous messaging app Yolo — integrated with Snapchat — described the devastating impact of Section 230 on her quest for justice. “I had wanted discovery, a jury, a trial, and an opportunity to face the creator of Yolo, Gregoire Henrion, and look him in the eyes and let him know how much his priorities for making fast money over kids’ online safety have destroyed our family,” Bride said. An appeals court allowed her lawsuit against Yolo for product misrepresentation to proceed, but years of Section 230-related appeals left the app a defunct shell, denying her the full reckoning she sought. The “second darkest day” of her life, she recounted, was when an attorney told her that the law barred recourse against the platforms.
Critics argue that Section 230, once a safeguard for a nascent industry, now unduly protects profitable giants like Meta and Google from accountability for harms facilitated on their sites, including sextortion, fentanyl-related deaths, and addictive designs. Dani Pinter, chief legal officer of the National Center on Sexual Exploitation (NCOSE), spoke at the conference, asserting, “Even with the language of 230 how it is now, I don’t believe they should be given immunity in some of the cases they are.” She blamed judicial interpretations for allowing the law to evolve unchecked, saying, “I think part of it is judges and lawyers don’t necessarily really get how these tech companies function.” Pinter advocated for scrapping and rewriting the statute: “I think we need to take 230 away, rewrite it to restart the clock.”
Senator Durbin echoed this frustration during the event, declaring, “The only business enterprise in America which is held harmless from their own wrongdoing is Big Tech.” While affirming his commitment to the Constitution and free expression, he noted, “there are limits.” The lawmakers' bill seeks to end the status quo, forcing negotiations on issues like child safety and content moderation.
Defenders of Section 230, however, warn that repealing or weakening it could stifle free speech and harm smaller voices online. Senator Ron Wyden (D-OR), a co-author of the provision alongside former Rep. Chris Cox (R-CA), called the current moment “the worst possible time to repeal Section 230” in a phone interview with The Verge. Wyden cautioned that such a move would empower figures like former President Donald Trump and his allies to reshape online speech laws, likening it to “handing [Trump] a grenade launcher pointed right at people who want to have a voice.” He emphasized the law's role in protecting platforms like Bluesky, Wikipedia, and activists monitoring U.S. Immigration and Customs Enforcement (ICE), saying, “the law stands for exactly the same thing, which is: Are you going to stand up for people who don’t have power, don’t have clout, and are looking for a way to be heard?” Wealthy entities, he noted, would always find ways to amplify their messages regardless.
Wyden recounted the law's origins over a casual lunch with Cox in a congressional cubbyhole, aimed at countering court rulings that penalized platforms for moderating objectionable content while letting passive hosts off the hook. Today, he and other proponents argue Section 230 encourages essential moderation to prevent sites from becoming “instant cesspools” and avoids pressuring companies to censor government-disfavored speech — a concern heightened by tech executives' recent interactions with the Trump administration, including multimillion-dollar lawsuit settlements and policy shifts post-inauguration.
Amy Bos, vice president of government affairs at NetChoice — whose members include Meta, Google, Pinterest, and Reddit — questioned the internet’s fate without the law: “What would the internet look like without 230? It would force platforms, websites to remove third-party content. This is content created by everyday Americans.” Bos emphasized that the protections are what allow platforms to moderate user-generated material from ordinary people at scale.
A series of high-profile court cases this year could further test Section 230’s boundaries, potentially reaching the Supreme Court. New Mexico’s attorney general has sued Meta, alleging the platform facilitates child predators. Separate suits by individuals and school districts claim social media’s designs are addictive and harm youth. These trials will examine which platform decisions qualify as negligence versus protected third-party speech or First Amendment rights. A prior case against Snap, involving its speed filter that allegedly encouraged reckless driving, succeeded in piercing Section 230 immunity, an outcome Wyden endorsed as a model for targeted reforms aimed at product design without broadly curbing speech or moderation.
Wyden expressed openness to specific changes, provided they “can’t target constitutionally protected speech, [and] can’t discourage moderation.” He criticized existing bills for violating these principles. The law’s last major update came in 2018 with the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which stripped protections for platforms promoting prostitution or sex trafficking. The change led to the closure of Backpage.com, seen as a win against exploitation, but sex workers reported facing greater dangers without a safe space to vet clients. A 2021 U.S. Government Accountability Office report found that FOSTA had been invoked in only a handful of cases.
Both Section 230’s creators and critics concur on one frontier: artificial intelligence. In a 2023 Fortune op-ed, Cox and Wyden wrote, “The law plainly states that it does not protect anyone who creates or develops content, even in part — and generative AI applications such as ChatGPT, by definition, create content.” As AI booms, lawmakers revisit 1990s debates on nurturing innovation while curbing risks. Ahead of the anniversary, a coalition of child safety and AI advocacy groups urged Senate leaders against new laws preempting state AI regulations, warning they could mirror Section 230’s unintended dynamics.
Looking ahead, the interplay of legislation, litigation, and technological evolution will shape Section 230's legacy. Proponents of reform see the sunset bill as a catalyst for accountability, while guardians of the status quo fear it could fragment the open web. With trials unfolding and congressional debates intensifying, the law that enabled the internet's explosive growth now stands at a crossroads, its protections — once unassailable — under unprecedented scrutiny.
