USA: Families sue Character.AI over alleged role in encouraging violence, self-harm and sexual content
"An autistic teen’s parents say Character.AI said it was OK to kill them. They’re suing to take down the app", 10 December 2024
Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. The lawsuit asks a court to shut down the platform until its alleged dangers can be fixed.
Brought by the parents of two young people who used the platform, the lawsuit alleges that Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” according to a complaint filed Monday in federal court in Texas.
For example, it alleges that a Character.AI bot implied to a teen user that he could kill his parents for limiting his screen time...
Following [an] earlier lawsuit, Character.AI said it had implemented new trust and safety measures over the preceding six months, including a pop-up directing users to the National Suicide Prevention Lifeline when they mention self-harm or suicide...
Character.AI is a “defective and deadly product that poses a clear and present danger to public health and safety,” the complaint states...
...[The h]ead of communications at Character.AI said the company does not comment on pending litigation but that “our goal is to provide a space that is both engaging and safe for our community.”...
In addition to requesting a court order to halt Character.AI’s operations until its alleged safety risks can be resolved, the lawsuit seeks unspecified financial damages and a requirement that the platform limit its collection and processing of minors’ data. It also requests an order that would require Character.AI to warn parents and minor users that the “product is not suitable for minors.”