Article

16 Jul 2022

Author:
Mara Hvistendahl, The Intercept

Facial recognition tool "PimEyes" could allegedly contribute to child exploitation; incl. co. comments

"Facial recognition search engine pulls up “potentially explicit” photos of kids", 16 July 2022

The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, raising a host of alarming possible uses, an Intercept investigation has found.

PimEyes, often called the Google of facial recognition, returns search results that include images the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has sparked an explosion of images of abuse.

“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”

While The Intercept searched for fake faces due to privacy concerns, the results contained many images of actual children pulled from a wide range of sources, including charity groups and educational sites.

PimEyes ... in December 2021 was purchased by Georgian international relations scholar Giorgi Gobronidze...

In a ... video interview ... Gobronidze offered a vague and sometimes contradictory account of the site’s privacy protections.

He said that PimEyes was working to develop better safeguards for children, though he offered varying responses when asked what those might entail. “It’s a task that was given already to our technical group, and they have to bring me a solution,” he said. “I gave them several options.”

“Designed for Stalkers”

On its website, PimEyes maintains that people should only use the tool to search for their own faces, claiming that the service is “not intended for the surveillance of others and is not designed for that purpose.” But the company offers subscriptions that allow people to perform dozens of unique searches a day... People who shell out for the premium service can set alerts for up to 500 different images or combinations of images, so that they are notified when a particular face shows up on a new site.

Gobronidze claimed that many of PimEyes’s subscribers are women and girls searching for revenge porn images of themselves, and that the site allows multiple searches so that such users can get more robust results.

“We Do Not Want to Be a Monster Machine”

Alarmingly, search results for AI-generated kids also include images that PimEyes labels as “potentially explicit.” The backgrounds in the labeled images are blurred, and since clicking through to the source URLs could contribute to the exploitation of children...

Gobronidze was vague on how he might limit abuse of children on the site.