

Article

12 Jun 2018

Author:
Russell Brandom, The Verge

Amazon urged to disclose public bias testing data for its facial recognition software; co. didn't respond

"Amazon needs to come clean about racial bias in its algorithms", 23 May 2018

[A]mazon’s quiet Rekognition program became very public, as new documents obtained by the ACLU of Northern California showed the system partnering with the city of Orlando and police camera vendors like Motorola Solutions for an aggressive new real-time facial recognition service. Amazon insists that the service is a simple object-recognition tool and will only be used for legal purposes. But even if we take the company at its word, the project raises serious concerns, particularly around racial bias... Facial recognition systems have long struggled with higher error rates for women and people of color — error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasn’t shared any data on the issue, if it’s collected data at all... ACLU-NC’s Matt Cagle [said]: “Face recognition is a biased technology. It doesn’t make communities safer. It just powers even greater discriminatory surveillance and policing.”... In the most basic terms... facial recognition systems pose an added threat of wrongful accusation and arrest for non-white people... I asked Amazon directly if the company has any data on bias testing for Rekognition, but so far, nothing has turned up...

Timeline