

Article

28 June 2023

Author:
Angus Crawford & Tony Smith, BBC News

BBC exposes illegal trade in AI-generated child sexual abuse images; incl. company comments

"Illegal trade in AI child sex abuse images exposed", 28 June 2023

Paedophiles are using artificial intelligence (AI) technology to create and sell life-like child sexual abuse material, the BBC has found.

Some are accessing the images by paying subscriptions to accounts on mainstream content-sharing sites such as Patreon.

The National Police Chiefs' Council said it was "outrageous" that some platforms were making "huge profits" but not taking "moral responsibility".

And GCHQ, the government's intelligence, security and cyber agency, has responded to the report, saying: "Child sexual abuse offenders adopt all technologies and some believe the future of child sexual abuse material lies in AI-generated content."

The makers of the abuse images are using AI software called Stable Diffusion, which was intended to generate images for use in art or graphic design.

The Stable Diffusion software allows users to describe, using word prompts, any image they want - and the program then creates the image.

But the BBC has found it is being used to create life-like images of child sexual abuse, including of the rape of babies and toddlers.

UK police online child abuse investigation teams say they are already encountering such content.

Freelance researcher and journalist Octavia Sheepshanks has been investigating this issue for several months. She contacted the BBC via children's charity the NSPCC in order to highlight her findings.

The National Police Chiefs' Council (NPCC) lead on child safeguarding, Ian Critchley, said it would be wrong to argue that, because no real children were depicted in such "synthetic" images, no-one was harmed.

He warned that a paedophile could "move along that scale of offending from thought, to synthetic, to actually the abuse of a live child".


Abuse images are being shared via a three-stage process:

  • Paedophiles make images using AI software
  • They promote the pictures on platforms such as Pixiv, a Japanese picture-sharing website
  • These accounts have links to direct customers to their more explicit images, which people can pay to view on accounts on sites such as Patreon

Some of the image creators are posting on a popular Japanese social media platform called Pixiv, which is mainly used by artists sharing manga and anime.

But because the site is hosted in Japan, where sharing sexualised cartoons and drawings of children is not illegal, the creators use it to promote their work in groups and via hashtags, which index topics using keywords.

A spokesman for Pixiv said it placed immense emphasis on addressing this issue. It said on 31 May it had banned all photo-realistic depictions of sexual content involving minors.

The company said it had proactively strengthened its monitoring systems and was allocating substantial resources to counteract problems related to developments in AI.

Ms Sheepshanks told the BBC her research suggested users appeared to be making child abuse images on an industrial scale.

Different pricing levels

Many of the accounts on Pixiv include links in their biographies directing people to what they call their "uncensored content" on the US-based content sharing site Patreon.

Fans can support creators by taking out monthly subscriptions to access blogs, podcasts, videos and images - paying as little as $3.85 (£3) per month.

But our investigation with Octavia Sheepshanks found Patreon accounts offering AI-generated, photo-realistic obscene images of children for sale, with different levels of pricing depending on the type of material requested.

The BBC sent Patreon one example, which the platform confirmed was "semi realistic and violates our policies". It said the account was immediately removed.

Patreon said it had a "zero-tolerance" policy, insisting: "Creators cannot fund content dedicated to sexual themes involving minors."

The company said the increase in AI-generated harmful content on the internet was "real and distressing", adding that it had "identified and removed increasing amounts" of this material.

Stability AI told the BBC it "prohibits any misuse for illegal or immoral purposes across our platforms, and our policies are clear that this includes CSAM (child sexual abuse material)."

As AI continues developing rapidly, questions have been raised about the future risks it could pose to people's privacy, their human rights or their safety.

The NPCC's Ian Critchley said he was also concerned that the flood of realistic AI or "synthetic" images could slow down the process of identifying real victims of abuse.

Children's charity the NSPCC called on Wednesday for tech companies to take notice.

"Tech companies now know how their products are being used to facilitate child sexual abuse and there can be no more excuses for inaction."

A spokesman for the government responded: "The Online Safety Bill will require companies to take proactive action in tackling all forms of online child sexual abuse including grooming, live-streaming, child sexual abuse material and prohibited images of children - or face huge fines."