NYT lawsuit forces OpenAI to keep 20M ChatGPT chats

OpenAI argued that the order presents significant engineering challenges and conflicts with international data protection regulations like GDPR.

by Kerem Gülen
November 13, 2025
in Industry

OpenAI is contesting a U.S. court order compelling it to indefinitely retain 20 million randomly sampled ChatGPT conversations as part of the copyright infringement lawsuit brought against it by The New York Times.

The preservation order, issued on May 13 and affirmed by District Judge Sidney Stein on June 26, forces OpenAI to hold user data indefinitely, a move that directly conflicts with the company’s standard 30-day deletion policy for unsaved chats. The order affects data from December 2022 through November 2024 for ChatGPT Free, Plus, Pro, and Team subscribers, as well as API customers without Zero Data Retention agreements. Enterprise, Edu, and ZDR customers are excluded.
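
To make the scope of the order concrete, the sketch below is a purely hypothetical Python illustration of the eligibility rules described above. The plan names, ZDR exclusion, and date window come from the order as reported here; the function and field names are invented for illustration and say nothing about how OpenAI actually stores or filters this data.

```python
# Hypothetical sketch only: which conversations would fall under the
# preservation order's stated scope. Not OpenAI's actual implementation.
from datetime import date

COVERED_PLANS = {"free", "plus", "pro", "team"}   # consumer and team tiers named in the order
EXCLUDED_PLANS = {"enterprise", "edu"}            # tiers explicitly excluded
WINDOW_START = date(2022, 12, 1)                  # December 2022
WINDOW_END = date(2024, 11, 30)                   # November 2024

def falls_under_preservation_order(plan: str, has_zdr_agreement: bool, chat_date: date) -> bool:
    """Return True if a conversation matches the order's reported scope."""
    if plan in EXCLUDED_PLANS or has_zdr_agreement:
        return False
    in_window = WINDOW_START <= chat_date <= WINDOW_END
    covered = plan in COVERED_PLANS or plan == "api"
    return covered and in_window

# Example: an API customer without a Zero Data Retention agreement, chat from mid-2023
print(falls_under_preservation_order("api", False, date(2023, 6, 15)))  # True
```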

OpenAI has stated that it has implemented restricted access protocols for the preserved data, limiting access to a small legal and security team, and has affirmed that the data will not be used for training or turned over to external parties at this time.

The New York Times filed the lawsuit in December 2023, alleging that OpenAI illegally used millions of its articles to train its models. The suit seeks the destruction of all models trained on the Times' works and potentially billions of dollars in damages.

OpenAI argued that the order presents significant engineering challenges and conflicts with international data protection regulations like GDPR. Judge Stein rejected these arguments, emphasizing that OpenAI’s terms of service allow data preservation for legal requirements.

A recent modification to the order on September 26, 2025, provided limited relief, ending the requirement for OpenAI to preserve all new chat logs moving forward. However, the company must retain the data already saved and any information from ChatGPT accounts flagged by The New York Times.

Security practitioners warn the case shatters assumptions about data deletion in AI interactions. OpenAI CEO Sam Altman suggested the situation accelerates the need for an “AI privilege” concept, similar to attorney-client privilege. The litigation also raises concerns for enterprise users regarding compliance with regulations like HIPAA and GDPR.

In a statement regarding security, OpenAI CISO Dane Stuckey said, “Only serious misuse and critical risks—such as threats to someone’s life, plans to harm others, or cybersecurity threats—may ever be escalated to a small, highly vetted team of human reviewers.”


Tags: OpenAI, The New York Times
