The bill represents the culmination of a legislative trend on child safety that traces back to the fall of 2021.
Members of Congress love to complain about social media platforms. The spectacle of tech executives being berated in front of congressional committees has, over the past few years, become a semi-regular feature of American politics, and lawmakers have introduced piles of legislation aimed at regulating technology companies.
It’s far rarer, though, for a bill to garner enough support to actually make it through one chamber of Congress—let alone two. Since the beginning of the “techlash,” Congress has passed only one piece of legislation on the subject, carving out a loosely defined exception in 2018 to Section 230’s liability shield for online content related to sex trafficking.
But in recent weeks, both the House and the Senate have shown surprising initiative in moving forward with tech legislation. In the House, lawmakers passed a bill that would mandate the divestment of TikTok by its owner, the China-based company ByteDance. In the Senate, the Kids Online Safety Act (KOSA), which would impose new responsibilities on tech platforms for looking after their minor users, has gathered a whopping 62 co-sponsors—including Senate Majority Leader Chuck Schumer (D-N.Y.)—and seems likely to make it through the chamber. Neither bill has a clear shot at becoming law: Schumer has been noncommittal about the TikTok legislation, and KOSA lacks a companion bill in the House. Still, given how Congress has struggled to move forward on tech legislation, this sudden burst of activity is striking.
There’s a great deal to be said about the TikTok legislation—both its legal and policy merits as well as the politics behind it. Here, though, I want to focus on KOSA. Of the many proposals to regulate platforms’ content moderation processes that have been discussed in recent years, the legislation has come the furthest. It’s worth taking a close look at the bill, both to evaluate its merits and to understand what the legislation says about the path that tech policymaking has taken.
In many ways, KOSA represents the culmination of a legislative trend that traces back to fall 2021, when the Wall Street Journal released a series of blockbuster reports on Facebook’s profound failures to moderate its platform effectively. Frances Haugen, the whistleblower who had provided the Journal with the underlying documents behind the investigation, testified before the Senate about the dangers posed by Facebook’s (now Meta’s) products to child safety. In the intervening years, proposals for tech regulation focusing on child safety specifically have grown increasingly prominent, particularly at the state level: According to a report from the University of North Carolina’s Center on Technology Policy, 23 new child safety laws were passed in 13 states over the course of 2023 alone. Now, with KOSA, the federal government has gotten in on the action.
The bill’s flagship provision is a “duty of care” that requires platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” a set of particular “harms to minors,” including specified “mental health disorders” such as depression, “patterns of use that indicate or encourage addiction-like behaviors,” violence, drug use, and deceptive marketing or financial harms. The bill also sets out certain “safeguards” that platforms must adhere to—these are, essentially, specific design features, such as establishing default privacy settings for minors at the highest level and allowing parents some ability to oversee their children’s use of social media.
KOSA has been through a fair amount of revision since Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.) introduced their proposal in February 2022. Early drafts drew sharp criticism from internet freedom groups and LGBTQ+ advocates, who worried that the bill’s establishment of a vaguely defined duty of care could counterintuitively encourage platforms to cut off access to crucial information for vulnerable youth. Key to this concern was how the legislation would allow the duty of care to be enforced through litigation brought by state attorneys general—particularly given a political climate in which Republican politicians, especially at the state level, were increasingly attacking even age-appropriate LGBTQ+ resources as “grooming.” “Online services would face substantial pressure to over-moderate, including from state Attorneys General seeking to make political points about what kind of information is appropriate for young people,” warned a November 2022 open letter signed by a range of organizations including the Center for Democracy and Technology (CDT), the American Civil Liberties Union (ACLU), and other internet freedom and LGBTQ+ groups.
The concern here isn’t just that a red-state attorney general might use KOSA to bring a lawsuit against, say, Meta for allowing a 13-year-old to browse information about gender identity. It’s also that risk-averse platforms could preemptively remove or close off access to this information to avoid getting sued in the first place. After Congress expanded platforms’ liability for content related to sex trafficking with the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), many technology companies sought to limit their legal exposure by over-moderating: shutting down services and removing content that had nothing to do with trafficking. FOSTA operated by carving out an exception to Section 230’s immunity shield, which KOSA leaves untouched. But the underlying dynamic—increasing platforms’ legal responsibilities for what happens on their services—is similar.
Though advocacy groups focused on KOSA’s potential effects on LGBTQ+ youth, it’s not hard to imagine how this increased legal responsibility could lead platforms to constrain other kinds of speech as well—including in ways that would cut in the opposite direction politically. Consider, for example, a blue-state attorney general suing Elon Musk’s X for allowing minors to access material questioning the usefulness of coronavirus vaccines.
But the focus on KOSA’s effects on LGBTQ+ youth was fueled in part by comments from the bill’s own supporters on the right. As senators worked through revisions to the bill in spring 2023, the right-wing Heritage Foundation trumpeted on Twitter that KOSA should be used to block minors’ access to material about transgender identity, writing, “Keeping trans content away from children is protecting kids.” Months later, video surfaced of Blackburn appearing to suggest in an interview that KOSA would aid in “protecting minor children from the transgender in this culture.” (In response to reporting on the video, Blackburn’s legislative director stated on Twitter that “KOSA will not—nor was it designed to—target or censor any individual or community.”)
The rollout of the most recent draft of the legislation this February seems designed to address these concerns. The biggest change from the original bill has to do with the duty of care provision, which now requires platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” specifically delineated “harms to minors,” including several mental health disorders as defined by the “Diagnostic and Statistical Manual of Mental Disorders.” (An FAQ released by Blumenthal and Blackburn alongside the new draft emphasizes that the duty of care now “only applies to a fixed and clearly established set of harms.”) Not only is the duty of care defined more narrowly, but its enforcement is no longer up to state attorneys general—instead, the Federal Trade Commission (FTC) is responsible. State attorneys general do, however, still retain the power to enforce several other sections of the bill, which impose requirements on platforms for instituting and disclosing the availability of certain design features (giving minors the ability to control personalized recommendations, for example) and mandate yearly transparency reports.
The new version of KOSA specifies that the duty of care doesn’t cover searches made by minors, as opposed to content recommended to them by platforms—which would run up against Section 230’s protections against platform liability for third-party content. At least in theory, according to Blumenthal and Blackburn, Google, for example, wouldn’t risk liability by serving up resources to a transgender teenager searching for information about gender-affirming health care. The duty also doesn’t preclude platforms from displaying information “providing resources for the prevention or mitigation” of the “harms to minors” listed in the bill—the idea being that Google also wouldn’t have any reason to block our hypothetical teenager from, say, the National Suicide Prevention Lifeline.
These changes seem to have convinced at least some critics of the bill to change their tune. Blumenthal’s office pointed to a letter from seven LGBTQ+ organizations, including GLAAD and the Human Rights Campaign, dropping their opposition to the bill on the grounds that the most recent changes to KOSA “significantly mitigate the risk of it being misused to suppress LGBTQ+ resources or stifle young people’s access to online communities.”
But other groups, such as the Electronic Frontier Foundation and the trade organization NetChoice, reiterated their opposition. Meanwhile, the CDT, the ACLU, and the digital rights organization Fight for the Future all released statements expressing approval of the revisions to the bill but voicing continued concern over the legislation’s potential negative effects.
KOSA’s flagship duty of care provision remains a topic of contention for critics. Yes, state attorneys general will no longer have the power to enforce the duty of care, which limits possibilities for politicization. But what if a combative FTC in a second Trump term takes the view that LGBTQ+ content boosted by algorithmic recommendations—that is, by a design feature covered by the duty of care—contributes to youth mental illness? Along similar lines, while the Blumenthal and Blackburn FAQ suggests that the bill’s protections for helpful resources would cover “LGBTQ youth centers,” nothing in the bill text specifies this—leaving open the possibility that platforms could be pursued by the FTC for allowing minors to access such material.
As an agency led by Senate-confirmed officials, the FTC is more constrained than state attorneys general, the vast majority of whom are popularly elected and more responsive to local political pressures. Still, providing this authority to the agency could be risky given the stated plans of Trump allies to aggressively leverage the administrative state for far-right ends under a second Trump administration.
Consider, too, Section 301 of the legislation, which is new to this latest version. This section explicitly establishes that KOSA preempts conflicting state measures—but specifies that the legislation does not prohibit states “from enacting a law, rule, or regulation that provides greater protection to minors than the protection provided by the provisions of this Act.” Despite this added language, the provision is still vague. Just what does “greater protection to minors” mean? That’s not clarified—which means that state legislatures across the political spectrum can interpret it however they wish. It’s not hard to imagine a red state crafting legislation that would allow its attorney general to sue platforms over some of the same design choices that earlier drafts of KOSA would have let state attorneys general litigate directly. In that respect, this provision seems a bit like a legislative pressure valve, shunting the possibility of malicious enforcement from one portion of the bill to another.
While the duty of care has received the most attention, KOSA’s critics have also voiced concern over just how platforms will be able to distinguish which users are children and which are adults. This is not exactly straightforward, either technically or legally. Commentary on the initial 2022 KOSA draft suggested that the bill could de facto force platforms to adopt some type of age verification to avoid legal liability, but more recent versions of the legislation specify that the law does not require platforms “to implement an age gating or age verification functionality.” With that in mind, it’s not quite clear how platforms are meant to put the legislation’s requirements into practice. That’s particularly true given that the most recent version of KOSA states that platforms’ responsibilities to minors kick in only if the platforms have “actual knowledge or knowledge fairly implied on the basis of objective circumstances” that the user in question is underage. How will platforms understand their obligations under this requirement?
Other provisions of the legislation have also been tweaked. A provision that originally allowed for independent researcher access to platform data has vanished from recent drafts, replaced with a requirement that the National Academies of Sciences, Engineering, and Medicine conduct a series of studies on “the risk of harms to minors by use of social media and other online platforms.” According to Tech Policy Press, the provision on researcher access had “faced industry opposition and conservative pushback related to concerns about censorship.”
What’s particularly striking, though, is something else that isn’t in the bill: anything to do with Section 230.
The provision, which protects platforms from liability for third-party content, has been at the heart of most conversations about tech policy legislation for the past few years. Both Democrats and Republicans have called for its repeal or revision, introducing dozens upon dozens of bills on the subject. In fact, the last major piece of technology policy Congress passed into law—FOSTA in 2018—constituted the first major change to Section 230 since the statute became law in 1996. Both Blumenthal and Blackburn have themselves sponsored legislation that would chip away at Section 230’s protections, and Blumenthal has called the law “outdated and outmoded.”
Yet not only does KOSA not touch Section 230’s liability shield, but the FAQ from Blumenthal and Blackburn actually seems to frame that absence as a selling point protecting against potential abuse of KOSA by the FTC. Because “the Kids Online Safety Act does not amend Section 230 of the Communications Decency Act,” the FAQ explains, “any lawsuits brought by the FTC over the content that online platforms host are likely to be quickly tossed out of court.”
This shift of focus away from Section 230 is a welcome one. As much as critics of big technology platforms like to blame the statute for the internet’s ills, the fact is that many of the proposed reforms to Section 230 either would be completely ineffective in addressing the problem in question or would likely make the issue even worse. (If your complaint is that social media companies remove too many of your posts, for example, scrapping the statute’s immunity shield and opening platforms up to liability for user content is likely to push companies to remove even more of your posts, not fewer.)
Part of this shift may simply have to do with a change in focus toward child safety following Haugen’s 2021 testimony before Congress. Alongside this surge of interest in online child safety came a drop in attention to Section 230. Together with Slate and the Center on Technology Policy, Lawfare has worked to monitor new proposals to reform the statute introduced in Congress. Over more than two years, our tracker has seen a notable—if not precipitous—slowdown, with 18 reform proposals introduced in both 2020 and 2021; 11 in 2022; 13 in 2023; and none in 2024 so far.
There is, however, another possible explanation. Put simply, Section 230 reform is hard. Because platforms have relied so heavily on the law’s immunity shield for so long, it’s difficult to fine-tune any revisions to the statute without also creating a cascade of unintended consequences. Law professor Blake Reid describes this as “interpretive debt”: For two decades, Section 230’s liability shield obviated the need for courts to develop a jurisprudence of platform liability, such that “any substantial narrowing of Section 230’s scope would force courts to bridge across the discontinuity in every case no longer covered by Section 230” by essentially starting from scratch. This is not an appealing prospect either for courts—which may be why the Supreme Court ultimately declined to weigh in on the scope of Section 230 in the Gonzalez v. Google case in 2023—or for members of Congress, who probably don’t want to be held responsible for breaking the internet if their reform attempts go haywire.
And what exactly were those reform attempts trying to do, anyway? Both Democrats and Republicans enjoyed attacking Section 230, but—examined more closely—that apparent bipartisan consensus quickly fell apart. Roughly speaking, Republicans, who complained about supposed censorship of conservative speech, wanted platforms to leave more content up; Democrats, who complained about hate speech and disinformation, wanted platforms to take more content down. Even a successful reform in one direction would inevitably leave the other side unsatisfied. That’s not exactly a recipe for legislative success.
In this context, child safety makes for an appealing alternative focus for tech policymaking in Congress. The subject offers another way for lawmakers to appear tough on Big Tech, and legislators from both sides of the aisle can get behind the importance of keeping children safe. State legislatures had already built up momentum on the issue. And this new approach also allows policymakers to set aside the seemingly impossible task of untangling all the complexities of Section 230.
But as KOSA shows, those complexities are hard to get away from. When it comes to mechanics, the child safety legislation runs into one of the major difficulties that also plagues attempts to reform Section 230: It’s very difficult to game out how any of this will work in practice. Some of this has to do with the uncertainties of calibrating how risk-averse platforms will respond to potential new forms of liability. Will they curate their platforms more carefully in a way that leads to a healthier internet, or will they remove anything potentially offensive and leave the online world a barren, sanitized wasteland?
Some of it also has to do with the absence of data on how platform design shapes human behavior, which leaves lawmakers in the dark when it comes to what changes are actually necessary and how to achieve them. In the case of FOSTA, for example, lawmakers confidently moved forward with revisions to Section 230 that they claimed would help combat sex trafficking but seem to have done anything but. When it comes to KOSA, it remains unclear, for example, how and to what extent internet use actually affects children’s mental health; the existing research is far more complicated and unsettled than political talking points often allow. In this respect, KOSA’s requirements for research on the topic are welcome, but the rest of the bill remains something of a shot in the dark. While KOSA requires the National Academies to look into the effects of internet use on youth mental health, it doesn’t extend that research mandate to examining how the bill itself might shape platform behavior.
Proposals to fix the internet by amending Section 230 inevitably ran into the small problem of the First Amendment, which protects platforms’ judgments about what material to allow on their services whether or not they receive immunity for it. However badly members of Congress would like to pass a law banning hate speech or LGBTQ+ content from Facebook, in other words, that proposal would go nowhere. By focusing on platform design instead, KOSA moves away from government efforts to directly regulate content. But the First Amendment is not so easily avoided, and the legislation seems likely to face constitutional scrutiny if it ever does become law. Critics have argued that KOSA would chill speech and is effectively a content-based restriction on particular kinds of material that lawmakers view as harmful to children, even if it’s framed in terms of design. At the state level, online child safety laws in both California and Arkansas have been blocked in court on similar grounds.
The promise of bipartisanship has proved slippery, too, even once Section 230 is set aside. “Child safety” sounds like a slogan agreeable to all—until you dig more deeply into which children and what they’re being protected from. The right’s obsession with “grooming” snapped this lack of agreement into focus. The result is a compromise bill that can’t entirely resolve these tensions, seeking to wall off potential avenues for abuse that worry Democrats but also opening up new potential concerns.
This story isn’t over yet. If KOSA passes in the Senate, the bill will still have to make it through the House—potentially a tall order given the ongoing dysfunction in that chamber. Meanwhile, reporting suggests that the bill’s path toward a Senate vote has already been delayed thanks to jockeying among senators seeking to move forward their own, separate legislative proposals for tech regulation and child safety. In that sense, at least, all of this is politics as usual.
– Quinta Jurecic is a fellow in Governance Studies at the Brookings Institution and a senior editor at Lawfare. She previously served as Lawfare’s managing editor and as an editorial writer for the Washington Post. Published courtesy of Lawfare.