

Article

10 Feb 2020

Author:
Caroline Haskins, Ryan Mac & Logan McDonald, BuzzFeed News

The ACLU slammed a facial recognition company that scrapes photos from Instagram & Facebook

Clearview AI... has been telling prospective law enforcement clients that a review of its software based on “methodology used by the American Civil Liberties Union” is stunningly accurate... But the ACLU said that claim is highly misleading and noted that Clearview's effort to mimic the methodology of its 2018 facial recognition study was a misguided attempt at “manufacturing endorsements.” “The report is absurd on many levels and further demonstrates that Clearview simply does not understand the harms of its technology in law enforcement hands,” [said] ACLU Northern California attorney Jacob Snow.

... Clearview, which claims to be working with more than 600 law enforcement agencies, has also been sued and publicly denounced by critics including New Jersey Attorney General Gurbir Grewal, who ordered a moratorium on the state’s use of the technology... Facebook, YouTube, LinkedIn, and PayPal had all sent cease-and-desist letters to Clearview... 

... “The ACLU is a highly-respected institution that conducted its own widely distributed test of facial recognition for accuracy across demographic groups,” Ton-That told BuzzFeed News. “We appreciated the ACLU’s efforts to highlight the potential for demographic bias in AI, which is why we applied their test and methodology to our own technology.”

Timeline