How to audit what ChatGPT knows about you – and reclaim your data privacy
The article outlines steps users can take to audit and limit the personal data ChatGPT collects, including opting out of data training, deleting chats, using temporary chats, managing memories, and deleting accounts. It highlights privacy concerns around how AI might use personal information in unforeseen ways. Users can also directly ask ChatGPT what it knows about them to assess their data exposure. These measures aim to increase user control over privacy in AI interactions.
Full article excerpt:
If you're looking to limit the amount of personal information you give ChatGPT, these are the main settings you should know about.

Written by Erin Carson, Managing Editor, News & Features
April 26, 2026 at 3:00 a.m. PT

If you're one of the 900 million people who reportedly use ChatGPT every week, the chatbot might be a staple of your life. Maybe it helps you get work done or come up with meal plans. You might even consult it whenever you have a scuffle with a friend or family member. But as you turn to ChatGPT for more and more in your life, you may want to re-evaluate how much personal information you're disclosing along the way. Ideally, you know not to disclose sensitive financial information, but other details about you could also be worth shielding.

(Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Privacy experts are already sounding the alarm about the potential harms of saying too much to your chatbot. The underlying concern is that no one is entirely sure how your personal information, whether sensitive or seemingly innocuous, could be used in the future. Some fear personal data could end up in a mass surveillance system or be used in other unforeseen ways that ultimately harm or disadvantage you. That ambiguity, they argue, is reason enough for caution.

Here are five ways to better manage the amount of personal information ChatGPT has about you, if you're using a consumer account.

1. Opt out of training data

One step you can take to make your ChatGPT experience more private is to stop OpenAI from using your information to train its models. Security experts worry that if your data ends up in a model, it could one day be used in ways no one can anticipate right now.

Go to Settings > Data controls > Improve the model for everyone. Then toggle the switches off and click "Done."

You can also use OpenAI's privacy portal to "Make a Privacy Request." Select "I have a consumer ChatGPT account," and then "Do not train on my content." From there, you might be prompted to sign in. After that step, you'll see a button to "Submit Request." Note that opting out only applies to your data in ChatGPT going forward.

2. Delete old chats

Another way to clean up the information you've given ChatGPT is to delete old chats. There are two ways to do this. One is to go to Settings > Data controls > Delete all chats. You can also delete individual chats from the left-hand sidebar by clicking the three dots next to the name of the chat.

Although the conversation will disappear from your chat history immediately, it can take up to 30 days to be permanently deleted from OpenAI's systems, according to the company's website. OpenAI also stipulates two exceptions where this rule might not apply: circumstances where the company has to hang on to data for "security or legal obligations" or because the…
This excerpt is published under fair use for community discussion. Read the full article at ZDNET.