Supreme Court puts content moderation on solid legal ground

Civic and tech groups say a ruling this week indicates a growing consensus that social media moderation is free speech.
Chris Marchese, director of NetChoice's litigation center, speaks to the press outside the Supreme Court in Washington, D.C. on Feb. 26, 2024. (Photo by ANDREW CABALLERO-REYNOLDS/AFP via Getty Images)

The Supreme Court handed a major, if partial, victory to tech companies this week by ruling that content moderation falls within the First Amendment rights of online platforms. 

Conservative critics of major social media companies have seized on content moderation — particularly around election- and COVID-related topics — as unacceptable censorship that tramples on American speech rights. But on Monday, the high court said efforts by states to restrict platforms’ ability to decide what content appears on their services represent an infringement of the companies’ rights. 

“States (and their citizens) are of course right to want an expressive realm in which the public has access to a wide range of views,” Justice Elena Kagan wrote on behalf of the court. “But the way the First Amendment achieves that goal is by preventing the government from ‘tilt[ing] public debate in a preferred direction,’ not by licensing the government to stop private actors from speaking as they wish and preferring some views over others.”

The case — Moody v. NetChoice — centers on laws passed by two states, Florida and Texas, that sought to restrict the ability of social media platforms to moderate content, ban users and remove or filter posts.


Trade associations representing major tech and social media firms challenged those laws, arguing that they infringed on the platforms’ First Amendment right to free expression, including the ability to moderate undesirable content.

Monday’s ruling sends the case back to the lower courts to carry out a more detailed factual analysis of exactly what services the Florida and Texas laws would cover. 

Kagan wrote that lower courts failed to consider how the state laws might apply not just to social platforms like Facebook and YouTube, but to a range of other virtual services. Florida’s law, for instance, defines a social media platform as having either 100 million monthly users or $100 million in annual gross revenue. Texas’ law sets the limit at 50 million monthly users for services that are used “to communicate with other users for the primary purpose of posting information, comments, messages, or images.”

Those definitions are overly expansive, Kagan wrote, with the potential to impact “how an email provider like Gmail filters incoming messages, how an online marketplace like Etsy displays customer reviews, how a payment service like Venmo manages friends’ financial exchanges, or how a ride-sharing service like Uber runs.”

The decision strikes a blow against efforts by conservative lawmakers to treat social media sites as public squares, subject to the same First Amendment constraints around moderating speech as governments. By banning users, removing posts or using algorithms to prioritize certain posts over others, proponents have argued, these companies are engaging in censorship of public speech and violating their users’ right to free expression.


Last week, the court rejected an effort by two states, Missouri and Louisiana, to limit the federal government’s ability to communicate with social media companies and share information about how to limit disinformation and foreign influence networks on their platforms.

In Texas, Attorney General Ken Paxton has railed against the content moderation practices of social media companies, likening them to unconstitutional censorship. Reacting to the ruling on X, Paxton called the issue “one of the biggest threats to free public discourse and election integrity” and vowed not to give up.

“I will keep fighting for our law that protects Texans’ voice. No American should be silenced by Big Tech oligarchs,” Paxton wrote.

But this week’s ruling is unlikely to be the last word on the matter.

Corbin Barthold, director of appellate litigation at the think tank TechFreedom, said during a live Spaces event on X that typically in this scenario, the Supreme Court would send the case back to the appellate courts, which would then send it back to lower district courts to reexamine how the laws would affect other kinds of services and whether they would pass First Amendment muster.


In this case, the only guarantee is that “there is going to be a lot of maneuvering” by lower courts, particularly the Fifth Circuit, whose rulings helped propel the NetChoice case to the Supreme Court. That court has historically leaned to the right, and six of its judges were appointed by former President Donald Trump.

“They will have an opportunity to have a say, to shape things” for a future court challenge, Barthold said.

But even a revamped law or court ruling will have to contend with a Supreme Court where a majority of justices signaled broad unease with the concept of government regulation of how private social media companies moderate speech on their own platforms.

Kagan wrote that even if considered on the merits, laws like the one passed by Texas are “unlikely to withstand First Amendment scrutiny.” “It prevents a platform from compiling the third-party speech it wants in the way it wants and thus from offering the expressive product that most reflects its own views and priorities,” Kagan wrote.

Even as the court sidestepped a definitive ruling on the merits, Eric Segall, a constitutional scholar at Georgia State University’s College of Law, told CyberScoop that the substance of the opinion undermines the idea that state regulation of the moderation choices of private entities aligns with the First Amendment.


“The court is very clear you can’t regulate this way,” Segall said.

Tech and civic groups lauded the decision as a victory for free speech.

“Removing these protections would have fundamentally changed how we communicate and interact online, created a less safe user experience, endangered lawful online expression, and weakened U.S. innovation,” said Linda Moore, CEO and president of the industry group TechNet.

Nora Benavidez, senior counsel and director of digital justice and civil rights at the advocacy group Free Press, said the ruling sent an important message to social platforms, many of which have deprioritized content moderation and slashed their trust and safety teams ahead of the 2024 elections in the face of relentless legal and political challenges from the right.

“Today’s ruling should send a message to the likes of Mark Zuckerberg and Elon Musk: Your commitment to platform integrity is protected under the First Amendment,” Benavidez said. “While there could be further court proceedings based on [Monday’s] procedural holding, the Court sent a clear signal that unconstitutional efforts to regulate content moderation will face withering scrutiny.”

Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. Prior to that, he has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.