Supreme Court skeptical of social media laws that bar content removal


February 27, 2024

A majority of the Supreme Court seemed broadly skeptical Monday that state governments have the power to set rules for how social media platforms curate content, with both liberal and conservative justices inclined to stop Texas and Florida from immediately implementing laws that ban the removal of certain controversial posts or political content.

Even as justices expressed concern about the power of social media giants that have become the dominant modern public forum, a majority of the court seemed to think the First Amendment prevents state governments from requiring platforms such as Facebook and YouTube to host certain content.

The high court’s decision in the two cases, likely to come near the end of the term in June, will have a significant impact on the operation of online platforms that are playing an increasingly important role in U.S. elections, democracy and public discussion.

The justices were reviewing a challenge from two tech industry associations, whose members include YouTube, Facebook and X, to Texas and Florida laws passed in 2021 in response to concerns from conservatives who said their voices are often censored by the editorial decisions of tech companies.

At issue for the court is whether the First Amendment protects the editorial discretion of large social media platforms or prohibits censorship of unpopular views. Social media posts have the potential to spread extremism and election disinformation, but taking down controversial views can silence discussion of important political issues.

A key question, Chief Justice John G. Roberts Jr. said during almost four hours of argument Monday, is whether the power to decide who can or cannot speak on a particular platform belongs to the government, or to social media companies.

“The First Amendment restricts what the government can do, and what the government is doing here is saying, you must do this, you must carry these people; you’ve got to explain if you don’t,” said Roberts, a conservative. “That’s not the First Amendment.”

Justice Sonia Sotomayor, a liberal, also called the Florida and Texas laws problematic, saying they are “so broad that they stifle speech just on their face.”

But many justices also seemed unconvinced that the First Amendment protects all aspects or types of digital platforms. Some suggested that sections of the state laws prohibiting the removal of certain content or users could be constitutional as applied to e-commerce and communications sites such as Uber and Gmail.

Justice Samuel A. Alito Jr. asked whether Gmail, for instance, has a First Amendment right to delete the email accounts of conservative commentator Tucker Carlson or liberal commentator Rachel Maddow if Google does not agree with one or the other’s viewpoints. Justice Ketanji Brown Jackson raised similar concerns about Facebook’s messaging feature.

A majority of justices seemed to agree, however, that the First Amendment protects the right of Facebook and YouTube to rank and moderate posts on their platforms, just as newspapers can make editorial decisions and bookstores and theaters may choose which content to promote.

Justice Amy Coney Barrett asked whether Florida could enact a law “telling bookstores that they have to put everything out by alphabetical order and that they can’t organize or put some things closer to the front of the store that they think, you know, their customers will want to buy?”

When platforms choose to remove misinformation about elections or take down content from anti-vaccination advocates or insurrectionists, Justice Elena Kagan suggested, they are exercising judgments “about the kind of speech they think they want on the site and the kinds of speech that they think is intolerable.”

Justice Brett M. Kavanaugh also pushed back on the assertion by Florida’s solicitor general, Henry Whitaker, that the First Amendment is designed to prevent suppression of speech by private entities. “You left out what I understand to be three key words,” Kavanaugh said, emphasizing the amendment’s inclusion of the words “by the government.”

State government officials argued that regulations are needed to ensure the public has access to diverse sources of information. Unlike traditional media, the platforms make money not from speaking themselves, they said, but from attracting users to their platforms to speak, and therefore are more akin to utilities such as phone companies that must provide open access to all.

Tech companies “contend that they possess a broad First Amendment right to censor anything they host on their sites, even when doing so contradicts their own representations to consumers” that their platforms are neutral forums for free speech, Whitaker said.

Noting that millions of Americans rely on social media to work or socialize with family and friends, Texas Solicitor General Aaron Nielson said allowing those platforms to remove problematic content would mean “there will be no public square to speak of.”

The hearing gave a rare glimpse into how the nine justices — who have joked that they are not the world’s foremost internet experts — use technology themselves. Justice Clarence Thomas appeared to suggest he was not a social media user, saying he was “not on any” when pressing the lawyer for the trade association NetChoice about how the companies’ algorithms functioned. Some justices appeared familiar with the workings of popular tech services, with Barrett describing Etsy as an online “flea market” and Alito asking repeated questions about Gmail.

Thomas and Alito, two of the court’s most conservative justices, sharply questioned the companies’ claims that they are engaging in editorial discretion when they take down objectionable posts or remove users. Alito pressed NetChoice to define the term “content moderation,” asking whether the term was “anything more than a euphemism for censorship.”

“If the government’s doing it, then content moderation might be a euphemism for censorship,” said attorney Paul Clement, representing NetChoice. “If a private party is doing it, content moderation is a euphemism for editorial discretion.”

Thomas and Alito also questioned how that stance squared with decades in which the companies argued against changes to Section 230 of the 1996 Communications Decency Act, the provision that immunizes the platforms from lawsuits over posts that users share on their services. In making those arguments, Thomas said, the companies described their services as “merely a conduit” for those making the posts. On Monday, he continued, they described themselves as engaged in “expressive conduct,” effectively taking on the role of a publisher that would traditionally be liable for the content it hosts.

“Either it’s your message or it’s not your message. I don’t understand how it can be both,” Alito added. “It’s your message when you want to escape state regulation, but it’s not your message when you want to escape liability.”

But Clement disputed the characterization, focusing instead on the aspect of Section 230 that protects companies from lawsuits over their decisions to remove content from their websites. He argued that “the whole point” of the provision was to allow online platforms to “essentially exercise editorial discretion” in removing harmful content without fear that it would expose them to liability as a publisher of user speech they don’t moderate. If the Texas and Florida laws were to take effect, Clement said, platforms would be forced to carry the type of content that Congress was trying to prevent when it drafted Section 230 nearly 30 years ago.

Throughout the marathon arguments, the justices struggled to identify a specific path for resolving the challenges to the state laws. They seemed interested in suggestions from Solicitor General Elizabeth B. Prelogar, representing the Biden administration, who urged them to rule narrowly that the laws interfering with content placement decisions are unconstitutional, while leaving open for another day questions about other aspects of the laws.

Even if state officials have concerns about a social media company’s dominance, she said, the government cannot take over a private party’s judgment about how to present a product. But Prelogar acknowledged legitimate concerns about the kind of power and influence that social media platforms wield.

“It’s not like the government lacks tools to deal with this,” she added, pointing to “a whole body of government regulation that would be permissible that would target conduct, things like antitrust laws that could be applied or data privacy or consumer protection, things that we think wouldn’t come into any conflict with the First Amendment at all.”

The Supreme Court decided to take up the issue after two appeals courts issued conflicting rulings, both written by judges nominated by former president Donald Trump. In Florida, a unanimous panel of the U.S. Court of Appeals for the 11th Circuit held that the restrictions of that state’s law probably violate the First Amendment. A divided panel of the U.S. Court of Appeals for the 5th Circuit, however, upheld the Texas law that bars companies from removing posts based on political ideology.

At its core, the First Amendment protects against government infringement on speech. Courts have also held that the First Amendment protects the right of private companies, including newspapers and broadcasters, to control the speech they publish and disseminate. It also includes the right of editors not to publish something they don’t want to publish.

In the 11th Circuit ruling, Judge Kevin Newsom said social media platforms are distinct from other communications services and utilities that carry data from point A to point B, and their “content-moderation decisions constitute the same sort of editorial judgments” entitled to First Amendment protections when made by a newspaper or other media outlet.

Judge Andrew Oldham of the 5th Circuit ruled the other way, saying social media companies had turned the First Amendment on its head by suggesting that a corporation has an “unenumerated right to muzzle speech” by banning users or removing certain posts. Oldham compared social media platforms to “common carriers” such as telephone companies.

Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University, said it was difficult to determine from the Supreme Court argument on Monday how the court would rule.

“It was very clear at today’s hearing that the platforms want a First Amendment that immunizes them from regulation altogether,” he said. “And the states think the First Amendment shouldn’t be relevant here at all. The court should really reject both of these arguments. Whether it will, I guess we’ll see.”

The cases are NetChoice v. Paxton and Moody v. NetChoice.
