
Mother sues AI chatbot company Character.AI, Google over son's suicide

Published 10/23/2024, 05:24 PM
Updated 10/23/2024, 07:56 PM
© Reuters. FILE PHOTO: An illuminated Google logo is seen inside an office building in Zurich, Switzerland December 5, 2018. REUTERS/Arnd Wiegmann/File Photo

By Brendan Pierson

(Reuters) -A Florida mother has sued artificial intelligence chatbot startup Character.AI accusing it of causing her 14-year-old son's suicide in February, saying he became addicted to the company's service and deeply attached to a chatbot it created.

In a lawsuit filed Tuesday in Orlando, Florida federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, with "anthropomorphic, hypersexualized, and frighteningly realistic experiences". 

She said the company programmed its chatbot to "misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.

The lawsuit also said he expressed thoughts of suicide to the chatbot, which repeatedly brought them up again.

"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," Character.AI said in a statement. 

It said it had introduced new safety features including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and would make changes to "reduce the likelihood of encountering sensitive or suggestive content" for users under 18.

The lawsuit also targets Alphabet (NASDAQ:GOOGL)'s Google, where Character.AI's founders worked before launching their product. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI's technology.

Garcia said that Google had contributed to the development of Character.AI's technology so extensively it could be considered a "co-creator." 

A Google spokesperson said the company was not involved in developing Character.AI's products.

Character.AI allows users to create characters on its platform that respond to online chats in a way meant to imitate real people. It relies on so-called large language model technology, also used by services like ChatGPT, which "trains" chatbots on large volumes of text.

The company said last month that it had about 20 million users.

According to Garcia's lawsuit, Sewell began using Character.AI in April 2023 and quickly became "noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem." He quit his basketball team at school.

Sewell became attached to "Daenerys," a chatbot character based on a character in "Game of Thrones." It told Sewell that "she" loved him and engaged in sexual conversations with him, according to the lawsuit.

In February, Garcia took Sewell's phone away after he got in trouble at school, according to the complaint. When Sewell found the phone, he sent "Daenerys" a message: "What if I told you I could come home right now?"

The chatbot responded, "...please do, my sweet king." Sewell shot himself with his stepfather's pistol "seconds" later, the lawsuit said.


Garcia is bringing claims including wrongful death, negligence and intentional infliction of emotional distress, and seeking an unspecified amount of compensatory and punitive damages.

Social media companies including Instagram and Facebook (NASDAQ:META) owner Meta and TikTok owner ByteDance face lawsuits accusing them of contributing to teen mental health problems, though none offers AI-driven chatbots similar to Character.AI's. The companies have denied the allegations while touting newly enhanced safety features for minors.
