USA: TikTok must face lawsuit over 10-year-old girl's death, court rules
"TikTok must face lawsuit over 10-year-old girl's death, US court rules", 28 August 2024
A U.S. appeals court has revived a lawsuit against TikTok by the mother of a 10-year-old girl who died after taking part in a viral "blackout challenge" in which users of the social media platform were dared to choke themselves until they passed out.
While a federal law typically shields internet companies from lawsuits over content posted by users, the Philadelphia-based 3rd U.S. Circuit Court of Appeals... ruled that the law does not bar Nylah Anderson's mother from pursuing claims that TikTok's algorithm recommended the challenge to her daughter.
U.S. Circuit Judge Patty Shwartz, writing for the three-judge panel, said that Section 230 of the Communications Decency Act of 1996 only immunizes information provided by third parties and not recommendations TikTok itself made via an algorithm underlying its platform.
She acknowledged the holding was a departure from past rulings by her court and others, which had held that Section 230 immunizes an online platform from liability for failing to prevent users from transmitting harmful messages to others.
But she said that reasoning no longer held after a U.S. Supreme Court ruling in July on whether state laws designed to restrict the power of social media platforms to curb content they deem objectionable violate their free speech rights.
In those cases, the Supreme Court held that a platform's algorithm reflects "editorial judgments" about "compiling the third-party speech it wants in the way it wants." Under that logic, Shwartz said, content curation using algorithms is speech by the company itself, which is not protected by Section 230.
TikTok did not respond to requests for comment.