OpenAI Resists NYT's Demand for 20 Million Chats, Announces Privacy Roadmap
Executive Summary
OpenAI is publicly opposing a legal demand from The New York Times (NYT) to turn over 20 million private ChatGPT user conversations as part of an ongoing lawsuit. The company frames the demand as a severe violation of user privacy and is asking the court to reject it. In response to this challenge, OpenAI is also announcing an accelerated security and privacy roadmap, which includes the development of client-side encryption to make user conversations inaccessible to any third party, including OpenAI itself.
Key Takeaways
* Core Conflict: The New York Times is legally demanding a random sample of 20 million private consumer ChatGPT conversations from December 2022 to November 2024 as part of its lawsuit against OpenAI.
* OpenAI's Stance: The company is actively fighting the demand in court, labeling it an overreach that disregards user privacy and is irrelevant to the lawsuit's core claims.
* Affected Users: The demand covers a random sample of consumer ChatGPT conversations within the specified timeframe. It does not affect ChatGPT Enterprise, Edu, Business, or API customers.
* Accelerated Security Roadmap: OpenAI is fast-tracking new privacy features, most notably client-side encryption, to ensure conversations remain private and inaccessible to outside parties (a brief illustrative sketch follows this list).
* Reduced Human Review: The company is developing fully automated systems to detect safety issues, aiming to limit human review of conversations to only the most critical risks, such as threats of harm.
* Immediate Mitigation: For the data in question, OpenAI is running a de-identification process to scrub Personally Identifiable Information (PII) and pushing for any review by the NYT to occur in a secure, legally controlled environment (a toy de-identification example also appears below).
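To make the client-side encryption idea concrete, here is a minimal sketch of symmetric encryption where the key stays with the user and the server only ever stores ciphertext. It uses the `cryptography` library's Fernet API; the function names and key-handling flow are illustrative assumptions, not OpenAI's actual design.

```python
# Illustrative sketch only: client-side symmetric encryption where the key
# lives on the user's device, so the provider (or any third party) sees
# only ciphertext. Hypothetical flow, not OpenAI's implementation.
from cryptography.fernet import Fernet

def generate_user_key() -> bytes:
    # In a real client-side scheme this key would be derived and stored
    # on the user's device (or wrapped by a passphrase), never uploaded.
    return Fernet.generate_key()

def encrypt_conversation(key: bytes, plaintext: str) -> bytes:
    # The ciphertext is what the server would store; it is unreadable
    # without the user's key.
    return Fernet(key).encrypt(plaintext.encode("utf-8"))

def decrypt_conversation(key: bytes, ciphertext: bytes) -> str:
    return Fernet(key).decrypt(ciphertext).decode("utf-8")

if __name__ == "__main__":
    key = generate_user_key()
    blob = encrypt_conversation(key, "user: draft an email about my appointment")
    print(decrypt_conversation(key, blob))
```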
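For the de-identification step, the sketch below shows a toy regex-based pass that replaces obvious PII patterns with placeholder tokens. Production pipelines of the kind described would rely on named-entity recognition models and far broader rules; the patterns here are simplified assumptions for illustration only.

```python
# Illustrative sketch only: a toy de-identification pass that replaces
# obvious PII (emails, phone numbers) with bracketed placeholders.
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b(?:\+?\d{1,2}[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
}

def deidentify(text: str) -> str:
    # Replace each matched PII span with its label, e.g. [EMAIL].
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(deidentify("Reach me at jane.doe@example.com or 415-555-0100."))
```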
Strategic Importance
This public stance is a critical trust-building move, positioning OpenAI as a defender of user privacy in the face of legal challenges. The outcome of this dispute could set a significant precedent for the entire AI industry regarding the legal discoverability and protection of user conversation data.