ChatGPT introduces per-conversation opt-out for training data
AI Impact Summary
ChatGPT now lets users disable chat history and selectively allow individual conversations to be used for model training, directly reducing the available data pool and enforcing per-conversation consent. This changes how data is ingested into the training pipeline and requires the system to honor user preferences at the data collection stage, with implications for privacy compliance and auditability. Engineers should evaluate how to handle opt-out signals, adjust labeling and dataset composition, and plan for alternative data sources to maintain model quality.
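Honoring opt-out at the data collection stage amounts to filtering the candidate pool by a per-conversation consent flag before any labeling or training steps run. A minimal sketch, assuming a hypothetical `Conversation` record with a `training_allowed` field (the field name and structure are illustrative, not OpenAI's actual schema):

```python
from dataclasses import dataclass


@dataclass
class Conversation:
    conversation_id: str
    text: str
    training_allowed: bool  # hypothetical per-conversation consent flag


def filter_training_pool(conversations: list[Conversation]) -> list[Conversation]:
    """Drop any conversation whose user opted out of training use.

    Filtering happens before labeling/dataset assembly so that opted-out
    data never enters downstream pipelines, which also simplifies audits.
    """
    return [c for c in conversations if c.training_allowed]


pool = [
    Conversation("c1", "hello", True),
    Conversation("c2", "private topic", False),
    Conversation("c3", "a question", True),
]
allowed = filter_training_pool(pool)
print([c.conversation_id for c in allowed])  # → ['c1', 'c3']
```

Applying the filter at ingestion (rather than at training time) keeps an auditable boundary: everything past this point is consented data by construction.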
Affected Systems
- Training data ingestion pipeline
- User preference and consent handling at data collection
- Dataset labeling and composition
Business Impact
Per-conversation opt-out shrinks the pool of data available for model improvements, which can slow convergence or skew dataset composition unless compensating data strategies are adopted.
- Date: not specified
- Change type: capability
- Severity: medium