

Article

17 Oct 2023

Author:
Paige Collings & Jillian C. York, EFF

EFF urges social media platforms to improve their handling of misinformation in times of crisis & provides specific recommendations

"Social Media Platforms Must Do Better When Handling Misinformation, Especially During Moments of Conflict", 17 October 2023

... in the wake of Hamas’ deadly attack on southern Israel last weekend—and Israel’s ongoing retributive military attack and siege on Gaza—misinformation has been thriving on social media platforms.

It can be difficult to parse out verified information from information that has been misconstrued, misrepresented, or manipulated.

... there are steps that social media platforms can take to increase the likelihood that their sites are places where reliable information is available—particularly during moments of conflict.

Platforms should:

  • have robust trust and safety mechanisms in place that are proportionate to the volume of posts on their site to address misinformation, and vet and respond to user and researcher complaints; 
  • ensure their content moderation practices are transparent, consistent, and sufficiently resourced in all locations where they operate and in all relevant languages; 
  • employ independent, third-party fact-checking, including for content posted by states and government representatives;
  • urge users to read articles and evaluate their reliability before boosting them through their own accounts; 
  • subject their systems of moderation to independent audits to assess their reliability; and
  • adhere to the Santa Clara Principles on Transparency and Accountability in Content Moderation and provide users with transparency, notice, and appeals in every instance, including misinformation and violent content. 

... without adequate guardrails for users and robust trust and safety mechanisms, this will not be the last instance where unproven allegations have such dire implications—both online and offline.
