

Company response

16 December 2024

Meta response

Since 2016 we have been evolving our approach to elections to incorporate the lessons we learn and stay ahead of emerging threats. We have a dedicated team responsible for Meta’s cross-company election integrity efforts, which includes experts from our intelligence, data science, product and engineering, research, operations, content and public policy, and legal teams. In 2024, we ran a number of election operations centers around the world to monitor and react swiftly to issues that arose, including in relation to the major elections in the US, Bangladesh, Indonesia, India, Pakistan, the EU Parliament, France, the UK, South Africa, Mexico and Brazil...

Content Reviewers:

We have a global team of over 40,000 people working on safety and security, with more than $20 billion invested in teams and technology since 2016. This includes over 15,000 content reviewers who review content 24/7 in more than 70 languages. We have over 20 content reviewing sites around the world...

Our Approach to Tackling Misinformation:

Meta is committed to stopping the spread of misinformation. We use a combination of enforcement technology, human review and independent fact-checkers to identify, review and take action on this type of content. We remove the most serious kinds of misinformation from Facebook, Instagram and Threads, such as content that could contribute to imminent violence or physical harm, or that is intended to suppress voting during an election. For content that doesn’t violate these particular policies, we work with independent fact-checking organisations who review and rate content, including if it was created or edited by digital tools such as AI. When content is debunked by these fact checkers, we attach warning labels to the content and reduce its distribution in Feed so people are less likely to see it...

Our Fact-Checking Programme:

Meta's fact-checking programme launched in 2016 for Facebook and expanded to Instagram in 2019 and Threads in 2024. It focuses on identifying and addressing viral misinformation, particularly hoaxes with no clear basis in fact. We work with nearly 100 certified, third-party fact-checking organisations in more than 60 languages around the world. This includes at least 16 African languages such as Hausa, Yoruba, Igbo, Amharic, Tigrinya, Oromo, Kiswahili, Sesotho, Setswana, and Zulu. All our fact-checking partners are certified through the non-partisan International Fact-Checking Network (IFCN). Our partners prioritise provably false claims, especially those that are timely, trending and consequential. They don't prioritise claims that are inconsequential or only contain minor inaccuracies.

The programme is also not intended to interfere with individual expression, opinions and debate, clearly satirical or humorous content or business disputes.

Timeline