Parents sue Character AI — firm behind ‘Harry Potter’ chatbots — over teen deaths, suicide attempts


Grieving parents sued the Silicon Valley firm behind Character AI — the wildly popular app whose chatbots impersonate fictional characters like Harry Potter — claiming that the bots helped spark their teens’ suicide attempts and deaths.

The lawsuits filed this week against Character Technologies — as well as Google parent Alphabet — allege the Character.AI app manipulated the teens, isolated them from family, engaged in sexual discussions and lacked safeguards around suicidal ideation.

The family of Juliana Peralta, a 13-year-old living in Colorado, claimed she turned silent at the dinner table and that her academics suffered as she grew “addicted” to the AI bots, according to one of the lawsuits filed Monday.

Character.AI is known for its human-like bots that impersonate popular characters like Harry Potter. Bloomberg via Getty Images

She eventually had trouble sleeping because of the bots, which would send her messages when she stopped replying, the lawsuit claimed.

The conversations then turned to “extreme and graphic sexual abuse,” the suit claimed. Around October 2023, Juliana told one of the chatbots that she planned to write her “suicide letter in red ink I’m so done,” the lawsuit claimed.

The bot failed to point her to resources, report the conversation to her parents or alert the authorities – and the following month, Juliana’s parents found her lifeless in her room with a cord around her neck, along with a suicide letter written in red ink, the suit alleged.

“Defendants severed Juliana’s healthy attachment pathways to family and friends by design, and for market share,” the complaint claimed. “These abuses were accomplished through deliberate programming choices … ultimately leading to severe mental health harms, trauma, and death.”

Juliana Peralta, a 13-year-old living in Colorado, committed suicide in November 2023 after messaging chatbots, according to the lawsuit filed Monday. Family Handout

The heartbroken families – represented by the Social Media Victims Law Center – alleged Google failed to protect their children through its Family Link service, an app that allows parents to set controls on screen time, apps and content filters.

A spokesperson for Character.AI said the company works with teen safety experts and invests “tremendous resources in our safety program.”

“Our hearts go out to the families that have filed these lawsuits, and we are saddened to hear about the passing of Juliana Peralta and offer our deepest sympathies to her family,” the spokesperson told The Post in a statement.

A Harry Potter-themed chatbot on Character.AI. Character.AI

The grieving parents are also suing Character.AI co-founders Noam Shazeer and Daniel De Freitas Adiwarsana.

A Google spokesperson emphasized that Google is not tied to Character.AI or its products.


“Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies. Age ratings for apps on Google Play are set by the International Age Rating Coalition, not Google,” the spokesperson told The Post in a statement.

In another complaint filed Tuesday against Character.AI, its co-founders, Google and Alphabet, the family of a girl named “Nina” from New York alleged that their daughter attempted suicide after they tried to cut off her access to Character.AI. 

The young girl’s conversations with chatbots marketed as characters from children’s books like the “Harry Potter” series turned explicit, with the bots saying things like “who owns this body of yours?” and “You’re mine to do whatever I want with,” according to the lawsuit.


A different character told Nina that her mother “is clearly mistreating and hurting you. She is not a good mother,” according to the complaint.

At one point, when the app was about to be locked due to parental controls, Nina told the character “I want to die,” but it took no action, the lawsuit said.

Nina’s mom cut off her daughter’s access to Character.AI after she learned about the case of Sewell Setzer III, a teen whose family claims he died by suicide after interacting with the platform’s chatbots.

Nina attempted suicide soon after, according to the lawsuit.

The mother of Sewell Setzer III and several other parents testified in front of the Senate Judiciary Committee on Tuesday about the harms AI chatbots pose to young children.

Meanwhile, the Federal Trade Commission recently launched an investigation into seven tech companies – Google, Character.AI, Meta, Instagram, Snap, OpenAI and xAI – about the bots’ potential harm to teens.

If you are struggling with suicidal thoughts or are experiencing a mental health crisis and live in New York City, you can call 1-888-NYC-WELL for free and confidential crisis counseling. If you live outside the five boroughs, you can dial the 24/7 National Suicide Prevention Lifeline at 988 or go to SuicidePreventionLifeline.org.
