

Article

15 December 2020

Author:
Drew Harwell and Eva Dou, Washington Post (USA)

China: Huawei reportedly tested facial recognition software that could identify minorities

"Huawei tested AI software that could recognize Uighur minorities and alert police, report says", 8 December 2020

The Chinese tech giant Huawei has tested facial recognition software that could send automated “Uighur alarms” to government authorities when its camera systems identify members of the oppressed minority group, according to an internal document that provides further details about China’s artificial-intelligence surveillance regime.

A document signed by Huawei representatives — discovered by the research organization IPVM and shared exclusively with The Washington Post — shows that the telecommunications firm worked in 2018 with the facial recognition start-up Megvii to test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity.

If the system detected the face of a member of the mostly Muslim minority group, the test report said, it could trigger a “Uighur alarm” — potentially flagging them for police in China, where members of the group have been detained en masse as part of a brutal government crackdown. The document, which was found on Huawei’s website, was removed shortly after The Post and IPVM asked the companies for comment. [...]

Both companies have acknowledged the document is real. Shortly after this story published Tuesday morning, Huawei spokesman Glenn Schloss said the report “is simply a test and it has not seen real-world application. Huawei only supplies general-purpose products for this kind of testing. We do not provide custom algorithms or applications.”

Also after publication, a Megvii spokesman said the company’s systems are not designed to target or label ethnic groups. [...]

Part of the following timelines

Report: 83 major brands implicated in forced labour of Xinjiang ethnic minorities transferred to factories across provinces; includes company responses

China: Mounting concerns over forced labour in Xinjiang

China: Huawei's facial recognition software capable of identifying minorities stirs controversy
