Parents in multiple states have agreed to settle lawsuits against AI startup Character.ai and Google that alleged their children harmed or killed themselves after using the startup’s chatbots, according to court documents made public on Tuesday and Wednesday.

Key Facts
- The companies agreed to settle lawsuits brought by parents in Florida, New York and Texas, as well as two families in Colorado, according to court documents, but details of the settlements were not immediately known.
- Character.ai was founded by two former Google engineers in 2021, but the search giant rehired the founders and licensed the startup’s technology in 2024, and both companies were named in the lawsuits.
- The lawsuits were filed after several teenage suicides made headlines, including the death of a 14-year-old in Florida who had an explicit relationship with a chatbot mimicking the “Game of Thrones” character Daenerys Targaryen, and the death of a 13-year-old in Colorado who turned to the app while struggling with bullying.
- Representatives for Character.ai declined to comment at this time, and Google did not immediately return a request for comment from Forbes.
Key Background
The lawsuits against Character.ai came as chatbot providers faced increasing scrutiny over how their products interact with consumers, especially vulnerable users such as children. Character.ai announced sweeping changes to its platform in October, banning underage users from having “open-ended” chats and limiting their time on the app to two hours per day.
Tangent
OpenAI is facing a lawsuit from the parents of a California teenager who say the chatbot served as a “suicide coach” for their son. OpenAI has since argued in court that the teenager “misused” the chatbot. The company is also facing a wrongful death lawsuit from the family of a 23-year-old Texas college graduate, who messaged with the popular chatbot just moments before taking his own life, CNN reported. In December, the company was hit with another wrongful death suit from the estate of an 83-year-old woman who was killed by her adult son before he took his own life. The lawsuit alleges the chatbot “validated and magnified” the son’s paranoid beliefs.
This story was originally published on forbes.com.