A federal judge’s decision this week restricting the government’s communications with social media platforms could have widespread side effects, according to researchers and groups that combat hate speech, online abuse and disinformation: it could further hinder efforts to curb harmful content.
Alice E. Marwick, a researcher at the University of North Carolina at Chapel Hill, was one of several disinformation experts who said Wednesday that the ruling could hamper efforts to stop false claims about vaccines and voter fraud from spreading.
She said the order followed other efforts, largely by Republicans, that are “part of a coordinated campaign to push back against the whole idea of disinformation.”
Judge Terry A. Doughty issued a preliminary injunction on Tuesday saying that the Department of Health and Human Services and the Federal Bureau of Investigation, along with other parts of the government, must stop contacting social media companies for “the purpose of urging, encouraging, pressuring, or inducing in any manner the removal, deletion, suppression, or reduction of content containing protected free speech.”
The ruling stems from a lawsuit by the attorneys general of Louisiana and Missouri, who accused Facebook, Twitter and other social media sites of censoring right-leaning content, sometimes in concert with the government. They and other Republicans cheered the ruling, issued in the U.S. District Court for the Western District of Louisiana, as a victory for the First Amendment.
But some researchers said the government’s work with social media companies was not problematic as long as it did not coerce them into removing content. Instead, they said, the government has historically notified companies about potentially dangerous messages, such as lies about election fraud or misleading information about the coronavirus pandemic. Most misinformation or disinformation that violates a social platform’s policies is reported by researchers, nonprofit organizations, or the platform’s own personnel and software, according to a professor of communication affiliated with the Center for Information Technology and Society at the University of California, Santa Barbara.
Of even greater concern, the researchers said, is the ruling’s potential chilling effect. The judge’s decision barred certain government agencies from communicating with some research organizations, such as the Stanford Internet Observatory and the Election Integrity Partnership, about the removal of social media content. Some of those groups have already been targeted in a Republican-led legal campaign against universities and think tanks.
Such provisions, their colleagues said, could discourage younger academics from pursuing disinformation research and deter donors from funding crucial grants.
Bond Benton, an associate professor of communication at Montclair State University who studies disinformation, said the ruling “could be a bit of a Trojan horse.” Although it is limited to the government’s relationship with social media platforms, he said, it carries the message that misinformation qualifies as speech and that its removal amounts to the suppression of speech.
“Previously, platforms could just say, ‘No shirts, no shoes, no service; we don’t want to host it,’” Dr. Benton said. “This ruling will probably make platforms a little more cautious in that regard.”
In recent years, platforms have come to rely heavily on automated tools and algorithms to detect harmful content, limiting the influence of complaints from people outside the companies. Victoria Wilk, director of digital safety and free expression at PEN America, a nonprofit that supports free expression, said academics and anti-disinformation organizations had often complained that platforms were unresponsive to their concerns.
“Platforms are good at ignoring civil society organizations and our requests for assistance, requests for information, or escalation of individual cases,” she said. “They don’t feel comfortable ignoring the government.”
Several disinformation researchers feared that the ruling could give cover to social media platforms, some of which have already scaled back their efforts to curb misinformation, to let their guard down further before the 2024 elections. They said it was also unclear how relatively new government initiatives that had responded to researchers’ concerns and suggestions, such as the White House Task Force on Online Harassment and Abuse, would fare.
For Imran Ahmed, chief executive of the Center for Countering Digital Hate, Tuesday’s decision underscored other problems: the United States’ “particularly toothless” approach to dangerous content compared with places like Australia and the European Union, and the need to update the rules governing the responsibilities of social media platforms. Tuesday’s ruling noted that the center had given a presentation to the surgeon general’s office about its 2021 report on online anti-vaccine activists, “The Disinformation Dozen.”
“It’s bananas that you can’t show a nipple at the Super Bowl, but Facebook can still broadcast Nazi propaganda, empower stalkers and harassers, undermine public health, and facilitate extremism,” Mr. Ahmed said. “This court’s ruling exacerbates the sense of impunity under which social media companies operate, despite the fact that they are a primary vector for hate and disinformation in society.”