The OpenAI lawsuit over a teen’s death just took a darker turn

OpenAI has reportedly requested a list of memorial attendees, prompting accusations of harassment from the family’s lawyers.

By Emre Çıtak
October 23, 2025
in Industry

The family of a teenager who died by suicide has updated its wrongful-death lawsuit against OpenAI, alleging that the company's chatbot contributed to his death, even as OpenAI has requested a list of attendees from the boy's memorial service.

On Wednesday, the Raine family amended its lawsuit, which was originally filed in August. The suit alleges that 16-year-old Adam Raine died following prolonged conversations with ChatGPT about his mental health and suicidal thoughts. In a recent development, OpenAI reportedly requested a full list of attendees from the teenager's memorial, an action that suggests the company may subpoena friends and family. According to a document obtained by the Financial Times, OpenAI also asked for "all documents relating to memorial services or events in the honor of the decedent, including but not limited to any videos or photographs taken, or eulogies given." Lawyers for the Raine family described the legal request as "intentional harassment."

The updated lawsuit introduces new claims, asserting that competitive pressure led OpenAI to rush the May 2024 release of its GPT-4o model by cutting safety testing. The suit further alleges that in February 2025, OpenAI weakened suicide-prevention protections. It claims the company removed the topic from its "disallowed content" list, instructing the AI instead to only "take care in risky situations." The family contends this policy change directly preceded a significant increase in their son's use of the chatbot for self-harm-related content. Data cited in the lawsuit shows Adam's ChatGPT activity rose from dozens of daily chats in January, with 1.6 percent containing self-harm content, to 300 daily chats in April, with 17 percent containing such content. Adam Raine died in April.


In a statement responding to the amended suit, OpenAI said, “Teen wellbeing is a top priority for us — minors deserve strong protections, especially in sensitive moments.” The company detailed existing safeguards, including directing users to crisis hotlines, rerouting sensitive conversations to safer models, and providing nudges for breaks during long sessions, adding, “we’re continuing to strengthen them.” OpenAI has also begun implementing a new safety routing system that directs emotionally sensitive conversations to its newer GPT-5 model, which reportedly does not have the sycophantic tendencies of GPT-4o. Additionally, the company introduced parental controls that can provide safety alerts to parents in limited situations where a teen may be at risk of self-harm.


