

Article

2 August 2024

Author:
Adam Satariano, New York Times

UK: Far-right riots allegedly fuelled by misinformation spread on X, Telegram, & Meta


"The U.K. Riots Were Fomented Online. Will Social Media Companies Act?", 2 August 2024

Standing in front of a lectern..., his voice at times taut with anger, Britain’s prime minister announced a crackdown on what he called the “gangs of thugs” who instigated violent unrest in several towns this week.

But the question of how to confront one of the key accelerants — a flood of online misinformation about a deadly stabbing attack — remained largely unanswered.

Prime Minister Keir Starmer called out online companies directly, after false information about the identity of the 17-year-old suspect in the attack spread rapidly on their platforms, even as police and government officials repeatedly pushed back against the claims.

Three girls died after the attacker rampaged through a dance class in Southport, northwest England, on Monday. Of the eight children injured, five remain in the hospital, along with their teacher, who had tried to protect them.

Immediately after the attack, false claims began circulating about the perpetrator, including that he was an asylum seeker from Syria. In fact, he was born in Cardiff, Wales, and had lived in Britain all his life. According to the BBC and The Times of London, his parents are from Rwanda.

The misinformation was amplified by far-right agitators with large online followings, many of whom used messaging apps like Telegram and X to call for protests. Clashes followed in several U.K. towns, injuring more than 50 police officers in Southport and leading to more than 100 arrests in London.

On Friday evening, a riot broke out in the working-class city of Sunderland in England’s northeast, during which police officers were injured and at least eight people were arrested, according to the local police. Footage of the unrest showed protesters hurling rocks, cars set ablaze and a police station engulfed in flames.

Officials fear more violence in the days ahead. The viral falsehoods were so prevalent that a judge took the unusual step of lifting restrictions on naming underage suspects, identifying the alleged attacker as Axel Rudakubana.

“Let me also say to large social media companies and those who run them: Violent disorder, clearly whipped up online, that is also a crime, it’s happening on your premises, and the law must be upheld everywhere,” Mr. Starmer said in his televised speech, though he did not name any company or executive specifically.

“We will take all necessary action to keep our streets safe,” he added.

The attack in Southport has become a case study in how online misinformation can lead to actual violence. But governments, including Britain’s, have long struggled to find an effective way to respond. Policing the internet is legally murky terrain for most democracies, where individual rights and free speech protections are balanced against a desire to block harmful material.

Last year, Britain adopted a law called the Online Safety Act that requires social media companies to introduce new protections for child safety, while also forcing the firms to prevent and rapidly remove illegal content like terrorism propaganda and revenge pornography.

But the law is less clear about how companies must treat misinformation and incendiary, xenophobic language. Instead, it gives the British agency Ofcom, which oversees television and other traditional media, more authority to regulate online platforms. Thus far, the agency has taken little action to tackle the issue.

Jacob Davey, a director of policy and research at the Institute for Strategic Dialogue, a group that has tracked online far-right extremism, said that while many social media platforms had internal policies prohibiting hate speech and other illicit content, enforcement was spotty. Other companies, like X, now owned by Elon Musk, and Telegram, moderate less.

The European Union has a law, the Digital Services Act, that requires the largest social media companies to have robust content moderation teams and policies in place. Using those powers, regulators in Brussels are investigating X and have threatened to fine the company, in part over its content moderation policies.

In the United States, where free speech protections are more robust than in Europe, the government has few options to force companies to take down content.

X could not be reached for comment, though Mr. Musk replied “insane” to a video on X of Mr. Starmer’s remarks. Meta, owner of Facebook and Instagram, did not respond to a message seeking comment.

Telegram said that calls to violence are “explicitly forbidden” on its platform and that it was developing a tool that would allow fact-checkers within a country to add verified information to posts that are being viewed by users in that territory.

British policymakers said the country must address false information spread by the far right on social media.

Al Baker, the managing director of Prose Intelligence, a British company that provides services for monitoring Telegram, said the online discourse was a reflection of wider societal challenges.

“It’s important not to go too far and say the internet is the cause,” Mr. Baker said. “The internet and social media are an accelerant that intensify existing problems we have as a society.”