Facebook establishes new policy to remove misinformation that could lead to violence
Facebook... said... that it would begin removing misinformation that could lead to people being physically harmed. The policy expands Facebook’s rules about what type of false information it will remove, and is largely a response to episodes in Sri Lanka, Myanmar and India in which rumors that spread on Facebook led to real-world attacks on ethnic minorities...

“We have identified that there is a type of misinformation that is shared in certain countries that can incite underlying tensions and lead to physical harm offline,” said Tessa Lyons, a Facebook product manager. “We have a broader responsibility to not just reduce that type of content but remove it.”
Facebook has been roundly criticized over the way its platform has been used to spread hate speech and false information that prompted violence... In Myanmar, Facebook has been accused by United Nations investigators and human rights groups of facilitating violence against Rohingya Muslims... In Sri Lanka, riots broke out after false news pitted the country’s majority Buddhist community against Muslims...

In an interview published Wednesday by the technology news site Recode, Mark Zuckerberg, Facebook’s chief executive, [said]... “I think that there’s a terrible situation where there’s underlying sectarian violence and intention... It is clearly the responsibility of all of the players who were involved there.”

Under the new rules, Facebook said it would create partnerships with local civil society groups to identify misinformation for removal. The new rules are already being put into effect in Sri Lanka... The company has started identifying posts that are categorized as false by independent fact checkers. Facebook will “downrank” those posts... so that they are not highly promoted across the platform.
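The article does not explain how “downranking” works internally, so the following is only a loose sketch of the general idea, not Facebook’s actual system: a feed ranker can apply a score penalty to posts that independent fact checkers have rated false, so they sort lower and surface less often. Every name, structure and number below is hypothetical.

```python
# Illustrative sketch of "downranking" flagged posts in a feed.
# Not Facebook's real system; all names and the penalty factor are hypothetical.
from dataclasses import dataclass

# Hypothetical multiplier applied to posts rated false by fact checkers.
FALSE_RATING_PENALTY = 0.2

@dataclass
class Post:
    post_id: str
    base_score: float          # engagement-based relevance score
    rated_false: bool = False  # set when independent fact checkers rate the post false

def ranking_score(post: Post) -> float:
    """Return the feed score for a post, penalizing fact-checked-false posts."""
    if post.rated_false:
        return post.base_score * FALSE_RATING_PENALTY
    return post.base_score

posts = [
    Post("a", base_score=0.9, rated_false=True),
    Post("b", base_score=0.5),
]

# Sort the feed by adjusted score: the flagged post drops below the unflagged one
# (0.9 * 0.2 = 0.18 < 0.5) rather than being removed outright.
feed = sorted(posts, key=ranking_score, reverse=True)
print([p.post_id for p in feed])  # ['b', 'a']
```

The design choice this toy example captures is the one the article describes: posts rated false are demoted rather than deleted, while the separate new policy covers outright removal of misinformation that could cause physical harm.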