

Article

10 December 2024

Author:
Clare Duffy, CNN

USA: Families sue Character.AI over alleged role in encouraging violence, self-harm and sexual content


"An autistic teen’s parents say Character.AI said it was OK to kill them. They’re suing to take down the app", 10 December 2024

Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. The lawsuit asks a court to shut down the platform until its alleged dangers can be fixed.

Brought by the parents of two young people who used the platform, the lawsuit alleges that Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” according to a complaint filed Monday in federal court in Texas.

For example, it alleges that a Character.AI bot implied to a teen user that he could kill his parents for limiting his screen time...

Following [an] earlier lawsuit, Character.AI said it had implemented new trust and safety measures over the preceding six months, including a pop-up directing users to the National Suicide Prevention Lifeline when they mention self-harm or suicide...

Character.AI is a “defective and deadly product that poses a clear and present danger to public health and safety,” the complaint states...

...[H]ead of communications at Character.AI, said the company does not comment on pending litigation but that “our goal is to provide a space that is both engaging and safe for our community.”...

In addition to requesting a court order to halt Character.AI’s operations until its alleged safety risks can be resolved, the lawsuit also seeks unspecified financial damages and requirements that the platform limit collection and processing of minors’ data. It also requests an order that would require Character.AI to warn parents and minor users that the “product is not suitable for minors.”
