
Supreme Court examines whether government can combat disinformation online

In a major case testing the role of the First Amendment in the internet age, the U.S. Supreme Court on Monday hears arguments focused on the federal government’s ability to combat what it sees as false, misleading or dangerous information online.

Last September, the 5th U.S. Circuit Court of Appeals, the most conservative federal appeals court in the U.S., issued a broad ruling that barred key government officials from contacts with social media companies. Among the personnel targeted in the order were officials at the White House, the Centers for Disease Control and Prevention, the Office of the Surgeon General, the FBI and an important cybersecurity agency.

The appeals court said that individuals at those agencies likely violated the First Amendment by seeking to coerce social media platforms into moderating or changing their content about COVID-19, foreign interference in elections and even Hunter Biden’s laptop. The Supreme Court has put that ruling on hold while it examines the tricky issues in the case.

afraid_of_zombies ,

I wonder how Pope Francis will rule.

GrymEdm , (edited )

Ah, the age-old safety vs freedom debate. Do you let your citizens be dangerously misinformed about topics like medicine (e.g. people being harmed or perhaps even dying from off-label Ivermectin use), or do you give the government the power to decide what constitutes misinformation?

This timing is especially interesting given allegations that AIPAC and associated interest groups (who are trying to reverse public opinion on the humanitarian disaster that is Gaza, especially among under-40s) are the driving force behind the recent “give us control or be banned” TikTok ruling in Congress. If the Supreme Court decides that the government can combat misinformation, what’s preventing politicians from deciding that the news coming from places like Gaza is “misleading” or, to quote Trump, “Fake News!”?

SnotFlickerman , (edited )

to quote Trump, “Fake News!”?

Can we admit that “fake news”, like “woke” and “politically correct” and every other word the Right steals, turns on its head, and makes into a slur, didn’t start with these chucklefucks?

It’s like how everyone suddenly forgot that “woke” actually originated in black culture and was a positive phrase.

Before Trump ran with it, several news commentators, Hillary Clinton herself, and a Washington Post article all used the term “fake news” in reference to pro-Trump lunacy spreading online. Trump stole the phrase, ran with it, and it’s become a fucking shitshow.

So, at the very least, if we’re going to reference “fake news,” can we not attribute it to this thoughtless loser motherfucker, who definitely couldn’t come up with something on his own because he is devoid of creativity? He only mimics and copies.

washingtonpost.com/…/793903b6-8a40-4ca9-b712-716a…

This was posted in November 2016. Trump would first Tweet “Fake News!” on December 10th, 2016. This was the major news story that promoted the term “fake news.” I remember because I remember valid skepticism around the PropOrNot organization. There is still a disclaimer on this article related to PropOrNot. (Interestingly enough, the genesis with PropOrNot supports your point that fact-checkers can be biased, and that can lead to issues that we would rather avoid)

Anyway, fuck Trump, can we not give him shit like this? He didn’t coin it, he’s a fucking idiot.

presidency.ucsb.edu/…/tweets-december-10-2016

WarmSoda ,

Wasn’t it that Skeleton lady and the press secretary with the lazy eye that started saying things were alternative truth and fake news?

Ranvier , (edited )

This isn’t anything with the force of law though, so saying it would “give government the power to decide what constitutes misinformation” is misleading. Communicating important information about public health and national security issues is part of the government’s job. The current ruling, if left to stand, essentially places a gag order on the US government, barring it from communicating important information to any private organization.

This is just agencies sending information to social media networks to help them make decisions. It’s still up to the social media networks to decide. If you dig into the case, you’ll see the government was even totally ignored in two-thirds of cases. Luckily it appears the Supreme Court is likely to overturn the prior order based on their comments today, and allow the US government to communicate again.

If in the future there was an attempt to make an actual law to give the government a power to regulate misinformation, it would have to be extremely narrow and well defined or else would be quickly overturned on first amendment grounds.

The TikTok bill is also a separate issue. And again, I think the statement “give us control or be banned” would be misleading. That implies it’s trying to force a US government takeover of TikTok, which is not at all the case; it’s trying to force a sale to domestic private ownership (not under the control of the US government). Though I personally disagree with that bill and find it problematic. I’m not sure we should be following China’s example here and forcing international companies to divest assets to domestic ones. I think a more nuanced and more effective position would be following the EU’s lead and regulating how all big tech companies handle data.

GrymEdm , (edited )

it’s trying to force a sale to domestic private ownership (not under the control of the US government)

The two (private corporations and the government) are so intertwined that you and I are going to have to disagree. I firmly believe a lot of governance happens at the behest of big business, and vice versa. The evidence is in far-reaching, long-lasting decisions like expensive private health care, military spending, environmental choices (especially corporate regulations), and federal minimum wage where it’s clear that at least some decisions are being made on behalf of business over public opinion. There is also definitely a measure of governmental content control involved in the TikTok decision - e.g. here’s Ted Cruz talking during one of the TikTok hearings and mentioning anti-Israel content as a primary concern. Just because the sale is to a private entity doesn’t mean it’s not about control of what people see.

The rest of your points are well made. To rebut: as I’ve made obvious, I think cases like this fit well within safety vs. freedom discussions. Also please bear in mind the line from OP’s article: “The appeals court said that individuals at those agencies likely violated the First Amendment by seeking to coerce social media platforms into moderating or changing their content about COVID-19, foreign interference in elections and even Hunter Biden’s laptop.” So as per the article, First Amendment law is being considered relevant, and coercion of social media is also an issue on the table.

Ranvier , (edited )

“Coerce” is, I think, what is at issue in the case. I still think saying this ruling would give the US government “the power to define misinformation” is misleading. No one, even those arguing it was “coercion,” is arguing the government was using any legal powers to enforce its recommendations. The evidence shows that in most cases it was being ignored, if anything. And the appeals court was relying on some very faulty factual findings at the trial court level, much of which was pointed out in the Supreme Court hearing today. The previous rulings used many out-of-context quotes, or even just portions of sentences, to create something that wasn’t there. They also overstepped by instituting a very broad gag order across all government communications that has been very damaging. I’ll be surprised if the Supreme Court doesn’t rule in favor of the government here, especially based on their comments today, and don’t worry: the government still won’t have any legal powers to enforce its definitions of misinformation if the Supreme Court rules in its favor.

I think the much greater threat to free speech comes in the form of the other Supreme Court cases heard last month (NetChoice v. Paxton and Moody v. NetChoice), which would use the force of law to prevent social media companies from having any editorial discretion over what is posted on their sites. Despite what the Republicans pushing those laws claim, they are not in favor of free speech. The effect of those laws is, to quote the ACLU, “Under the guise of ‘prohibiting censorship,’ these laws seek to replace the private entities’ editorial voice with preferences dictated by the government.” That’s the one that’s not only clearly coercive but comes with the force of law, and I would be very worried from a free speech perspective if the Supreme Court upheld those laws.

aclu.org/…/aclu-urges-supreme-court-to-uphold-pre…

grue ,

This isn’t even that, though: this is the “safety and freedom vs. deliberate endorsement of disinformation by a compromised judiciary” debate.

If you believe the 5th Circuit actually ruled on this case with objectivity and sound legal principles – as opposed to thinking of the politically advantageous outcome they wanted and working backwards – I’ve got a bridge to sell you.

SkyNTP ,

Social media platforms are not havens of free speech. There’s nothing free about how the algorithms influence what sorts of information people are exposed to. The idea that companies get to have “free speech” is a cancer on society.
