Should Social Media Platforms Be Treated Like Public Squares?

Should dominant social media platforms be required to carry all speech, the way government-owned public forums must?

The Supreme Court in 2017 called social media "the modern public square." But does that mean platforms should be legally required to operate like public squares — hosting all speech without viewpoint discrimination?

The Case for More Speech

The Supreme Court itself acknowledged the reality in Packingham v. North Carolina (2017), describing social media as "the modern public square" and "the principal sources that govern both private and public discourse." When the primary venues of democratic participation are owned by a handful of private corporations, the legal fiction that they are merely private editorial operations fails to capture their public function.

Network effects create functional monopolies that eliminate meaningful alternatives. Facebook, YouTube, and Twitter/X are valuable precisely because everyone else is already there, which makes switching platforms costly in a way that switching newspapers never was. Telling a banned speaker to "use a different platform" ignores the reality that exclusion from dominant platforms is functional exclusion from public conversation — not analogous to a magazine declining to publish an op-ed.

The state action doctrine should evolve with technological reality. Marsh v. Alabama (1946) held that a privately owned company town could not exclude speakers because it functioned as a public forum. The principle — that sufficiently public private spaces carry public obligations — has not been applied to platforms, but the underlying logic remains compelling as platforms assume a quasi-governmental role in structuring public discourse.

Justice Thomas's concurrence in Biden v. Knight First Amendment Institute (2021) is significant. Thomas argued directly that dominant platforms may be subject to common carrier or public accommodation obligations that would limit their discretion to exclude users. His view that platforms "behave more like common carriers" — like telephone companies — signals that at least some justices are open to rethinking the private actor framework.

Viewpoint discrimination by dominant platforms is qualitatively different from editorial discretion. When a newspaper declines to publish an op-ed, the writer can go elsewhere. When dominant platforms simultaneously de-platform a speaker, there is no meaningful elsewhere. Treating this as ordinary editorial discretion misunderstands the structural power at stake.

The Case for Restriction

Private companies have First Amendment rights of their own — rights that include editorial discretion over what speech they host and amplify. Moody v. NetChoice (2024) recognized that platforms engage in protected editorial activity when they curate content. Forcing platforms to carry all speech would itself be a First Amendment violation — compelled hosting of unwanted content — exactly the kind of government-mandated speech that the Court struck down in Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston (1995) and Miami Herald Publishing Co. v. Tornillo (1974).

The public square analogy is descriptively evocative but legally imprecise. The First Amendment's public forum doctrine applies to government-owned property opened for public expression — parks, sidewalks, public buildings. Private companies, even dominant ones, are not governments. The state action doctrine exists precisely to maintain this distinction. Collapsing it would convert every large private actor into a quasi-governmental entity subject to constitutional obligations.

Common carrier status requires affirmative legislative action, not judicial extension. Telephone companies and railroads became common carriers by statute, after democratic deliberation about the appropriate scope of public obligations. Extending common carrier-like obligations to platforms by judicial decision would bypass this process. If the public believes platforms should carry all speech, the remedy is legislation — not constitutional revision by courts.

Mandatory platform hosting would harm marginalized communities most. Platforms' ability to remove harassment, threats, and coordinated abuse campaigns protects the people who are most often targeted by such campaigns — women, minorities, and other historically marginalized groups. A neutrality mandate that requires platforms to host all speech would eliminate these protections.

Historical Context

The question of when powerful private intermediaries must carry speech they would prefer to exclude has a long history in American law. In the late 19th and early 20th centuries, railroads and telephone companies were regulated as common carriers obligated to serve all comers on non-discriminatory terms. The rationale was their natural monopoly status: high fixed costs and network effects made it impractical for meaningful competitors to emerge, so regulation substituted for competition.

Broadcast television and radio were regulated differently under the FCC's fairness doctrine (1949–1987), which required licensees to present balanced coverage of controversial issues. The Court upheld the fairness doctrine in Red Lion Broadcasting Co. v. FCC (1969) on the grounds that spectrum scarcity gave broadcasters a privileged position that justified public obligations. After the FCC abandoned the doctrine in 1987, broadcast media, talk radio in particular, quickly became more partisan, suggesting that the regulatory choice had real speech effects.

The internet was deliberately structured to avoid both models. Section 230 of the Communications Decency Act (1996) created a framework in which platforms were neither publishers (subject to editorial liability) nor common carriers (subject to neutrality obligations) — a legal hybrid that enabled rapid growth but left unresolved questions about accountability. The current debate is essentially a renegotiation of that 1996 settlement in light of a platform ecosystem its architects could not have imagined.

First Amendment Context

The foundational tension is between two First Amendment principles that pull in opposite directions. On one side: Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston (1995) established that private entities cannot be compelled to carry messages they disagree with — a parade organizer's selection of participants is protected expression. On the other side: PruneYard Shopping Center v. Robins (1980) held that California could require a private shopping center to allow petition circulators without violating the First Amendment — suggesting some accommodation obligations are constitutionally permissible.

Moody v. NetChoice (2024) is the most recent and directly relevant case. The Supreme Court vacated lower court rulings on Texas and Florida must-carry laws and sent them back for further analysis, while signaling that platforms likely engage in protected editorial activity. The Court did not resolve the ultimate constitutional question of whether any platform neutrality requirement could survive First Amendment scrutiny, leaving substantial uncertainty.

Packingham v. North Carolina (2017) — which struck down a North Carolina law barring registered sex offenders from social media — contains the Court's most expansive recognition of platforms as essential speech venues, but was decided on narrower grounds than the public square question requires. The doctrinal framework for resolving this debate has not yet been fully articulated by the Court.

Internet & AI Implications

AI moderation systems have changed the nature of platform editorial discretion from occasional human decisions to continuous automated curation affecting billions of pieces of content daily. This creates a new dimension to the public square debate: when content removal is done by algorithmic systems at scale, the "editorial discretion" framing — borrowed from the newspaper metaphor — fits less well. An AI system that removes 10 million posts per day is not exercising the kind of considered editorial judgment that the First Amendment has traditionally protected.

Conversely, AI-generated content at scale has increased the volume of spam, disinformation, and coordinated manipulation that platforms must manage — strengthening the case for robust moderation tools and weakening the case for neutrality mandates. The same AI capabilities that make platform editorial discretion harder to analogize to traditional editing also make the platform's moderation function more essential to the health of digital discourse.

Free Speech Atlas Editorial View

The public square analogy captures something real and important: when a handful of private companies control most of the world's public discourse infrastructure, their decisions about who can speak have democratic consequences that the ordinary private-actor framework does not fully address.

But the answer is almost certainly not mandatory hosting requirements — the constitutional and practical objections are too strong. Platforms have real interests in editorial discretion, and forcing them to host all content would degrade the value of their platforms for most users while benefiting bad actors.

The more workable approach is transparency and accountability without content mandates: requiring platforms to disclose their moderation criteria, provide meaningful appeals processes, report systematically on enforcement, and disclose when government agencies request content removal. These measures address the democratic concern without the First Amendment costs of must-carry rules — and they preserve the public's ability to evaluate and ultimately hold platforms accountable for their speech choices.