
Social Media Regulation at the Supreme Court

By G.S. Hans, Associate Clinical Professor of Law

The Supreme Court will issue major rulings in several prominent free speech and technology cases over the next year, perhaps answering some long-standing questions in First Amendment doctrine while undoubtedly raising new ones. Speech and technology topics have come to the Court intermittently over the last three decades; the dominance of technology companies in the economy will likely bring such cases to the Court more frequently in the years ahead. The unique aspects of social media present challenges for both regulators and judges seeking to apply existing doctrine to a new context. While the Supreme Court has avoided answering many of the most challenging questions thus far, it cannot do so forever.

Just before the Supreme Court term began, the Court granted review in two long-running cases, NetChoice v. Moody and NetChoice v. Paxton. The NetChoice cases implicate the power of the state to regulate social media companies’ content practices. Much recent attention has focused on Section 230, the federal statute that provides technology companies broad immunity from liability stemming from hosting the content of their users. The NetChoice cases involve a different question: whether and how governments can regulate the editorial choices of social media companies, and whether they can compel disclosure of those companies’ editorial practices.

Both Texas and Florida enacted laws that imposed requirements on how social media companies could moderate content posted on their platforms. Both laws also mandated disclosure of the companies’ content moderation practices, such as the standards used in making takedown or banning decisions. Under the Texas and Florida regimes, companies would need to disclose not only why they made specific content moderation decisions (such as removing content or banning a user), but also aggregate statistics about those decisions.

Because private entities have at least some First Amendment protections over what speech they publish or promulgate, NetChoice, a trade organization that represents some of the companies targeted by these laws, filed federal lawsuits in both states to prevent the laws from going into effect. In both cases, NetChoice asserted multiple First Amendment claims against the states. NetChoice claimed that the laws improperly interfere with the companies’ First Amendment rights to determine what content appears on their services. It also argued that the laws’ transparency requirements fail to comport with Supreme Court precedent on compelled disclosure of information.

NetChoice largely prevailed in the Eleventh Circuit (while at my last institution, I filed an amicus brief on behalf of First Amendment law professors supporting its position in the case), but lost in the Fifth Circuit. The cases took longer than some expected to reach the Court, though many observers assumed the Court would take them up given the importance of the constitutional questions. The Court called for the views of the Solicitor General, who recommended that the Court grant review on the questions involving regulation of content moderation standards and requirements for individualized explanations of specific moderation decisions, but not on the general information disclosure requirements. The Court agreed with the Solicitor General’s office, and thus we should soon have some answers.

The provisions that regulate content moderation practices implicate a First Amendment doctrine loosely known as “editorial discretion.” NetChoice and the states fundamentally disagree on how best to interpret that doctrine, which specifies when and how a private entity has leeway to determine, without government interference, what content it hosts or publishes. The foundational Supreme Court case in this area, Miami Herald v. Tornillo, concerned a Florida law that mandated a right of reply for political candidates whom a newspaper had criticized. The Miami Herald argued that the Florida law unconstitutionally infringed upon its ability to choose what content to publish. The Supreme Court agreed, holding that the law interfered with the function of editors to decide what content they would publish.

Proponents of the Florida law at issue in Tornillo pointed to the concentration of the newspaper industry as a justification for allowing a “right of access” to the Herald and other papers. Media consolidation, the argument went, makes it harder for opposing views to be heard; a newspaper could choose to propound only a narrow range of perspectives, and dissenters would struggle to find an audience. Under this view, the right of access furthers the goals of the First Amendment by allowing a broader array of voices to find a place in the marketplace of ideas.

The Court rejected this view of the First Amendment in Tornillo and in more recent cases. It has consistently held that the First Amendment speaks only to freedom from governmental actions that restrict speech and association, rather than empowering the state to provide affirmative access. The Court held that even if the right of access cost newspapers nothing and would not require them to cut other articles, it still improperly interfered with editorial functions. NetChoice and its allies rely heavily upon Tornillo, even though newspapers and social media companies seem quite different in their practices. But, NetChoice contends, the core principles are analogous.

Subsequent Supreme Court cases expanded on the editorial discretion doctrine. Two cases, PG&E v. Public Utilities Commission of California and Hurley v. Irish-American Gay, Lesbian, and Bisexual Group of Boston, extended Tornillo’s holding to a public utility’s newsletter and a parade, respectively. In PG&E, the California Public Utilities Commission had required that PG&E, a public utility company, distribute a publication from another source in the monthly billing envelope that contained PG&E’s own newsletter. PG&E successfully contended that the requirement resembled the Florida law in Tornillo by mandating the inclusion of content that PG&E did not want to carry and did not agree with.

Hurley involved a challenge to a Massachusetts anti-discrimination law that mandated the inclusion of an LGBTQ group in an Irish-American parade. The parade organizers argued that requiring them to allow the LGBTQ group to march against their wishes constituted an improper imposition on their speech interests. As in Tornillo and PG&E, the Court agreed, given the nature of the parade as an expressive event that communicated a message to the public.

But two other cases, Pruneyard Shopping Center v. Robins and Rumsfeld v. FAIR, declined to apply Tornillo to a shopping mall and to law school recruiting practices, holding that the challenged regulations did not interfere with any expressive interests. The Court reasoned that the mall in Pruneyard had to host protesters exercising their California constitutional rights because the mall owners had not claimed that they necessarily disagreed with the protesters, and the mall’s expressive interests were limited, if they existed at all. In FAIR, multiple law schools challenged a federal law that required them to host military recruiters or lose federal funding. The FAIR Court held that the law did not require the schools to say anything or prevent them from speaking; their speech was not affected by hosting recruiters, just as in Pruneyard.

The Court’s editorial discretion cases have created a bit of a Rorschach test. One’s regulatory or policy goals can shape a reading of these five cases that adopts the language of the decisions supporting those goals while distinguishing the ones that don’t. For those who support the social media companies, as I do to a degree, Pruneyard and FAIR don’t apply, given the companies’ expressive choices about what types of content they want to host. For those who support the states, the social media companies’ reliance on automated review and their hosting of a wide range of third-party content mean that the companies don’t have as strong a claim to First Amendment protections.

The two sides in the NetChoice litigation have taken opposing but equally expansive interpretations of how the editorial discretion cases should apply to the Florida and Texas laws. It seems more likely, therefore, that the Court will seek a middle ground. A decision that allows for some regulation of social media companies, particularly transparency reporting, but strikes down these specific laws would thread the needle, protecting constitutional interests while ensuring that states can indeed regulate social media companies in ways that don’t directly limit their speech rights.

Whether or not the Court takes that approach, it will have to balance a few of the competing impulses that have motivated some of its recent decisions. Most of the Justices have promoted a maximalist reading of the First Amendment’s speech protections, which benefits the social media companies’ arguments. The Court has also ruled frequently in favor of corporate interests, particularly when it comes to regulatory interventions. But some on the Court have concerns about the power of technology companies specifically, especially given the perception that those companies demonstrate a political bias against conservative speakers. These conflicting dynamics make predicting an outcome in these cases challenging, at least in advance of oral argument. When the Court finally issues a decision, we will hopefully have more guidance for the many cases that will almost certainly arise in the coming years as states and Congress turn more actively to regulating the Internet.