I told ChatGPT something that I still regret — here are 7 things you should never share
Steer clear of these common mistakes

Chatbots — from customer service agents to AI assistants like ChatGPT and Gemini — have become part of our daily lives. While they all offer convenience, not all of them handle your data the same way, which is why many users don’t realize how much sensitive information they might accidentally reveal.
Some risks are obvious, like never sharing your credit card number or bank information, but others are more subtle.
Here's why you should think twice before sharing certain info
In a recent feature about Gemini Canvas, I mentioned how the chatbot helped improve the first chapter of my book. Some readers expressed concern that sharing my writing with a chatbot meant it would be used to train the AI.
Thankfully, that’s not the case, at least not with the major players. OpenAI and Google clearly state that they don’t use user inputs to train their chatbots. However, there’s a nuance worth noting: even if your data isn’t being used for training, it can still be remembered within your account. That means anyone with access to your account — or, in rare cases, someone who hacks into it — could theoretically retrieve your input. Highly unlikely, but not impossible, and it's why I might think twice next time.
And then there are the bigger risks. Not all chatbots follow the same data practices. While ChatGPT and Gemini steer clear of training on user inputs, some human review can still occur to flag abuse or harmful content. Your unfinished novel probably won’t raise eyebrows, but threats or dangerous language might.
Other bots, like DeepSeek, do train directly on user data. That means anything you type in could be used to improve future models — and that’s a good reason to be cautious.
So, regardless of which chatbot you’re chatting with, here are seven things you should never share — yes, even with the ones that don’t train on your input.
1. Personally identifiable information (PII)
Never share your full name, address, social security number (or equivalent), passport or driver’s license details. If the chatbot’s data is breached, hackers could steal your identity.
If you use a chatbot for something such as updating your resume or job searching, leave those details out of your prompts and add them back yourself once the chatbot has finished polishing the document.
2. Financial information
It seems obvious that sharing credit card numbers, bank account details, and cryptocurrency private keys is a bad idea, yet users may unknowingly include this information when asking a chatbot to summarize a document from their credit card company or bank.
Perhaps you want tax advice and lean on a chatbot for assistance, but it’s worth knowing that you can get your questions answered without sharing this information. Instead, describe a similar, hypothetical scenario without revealing your personal details.
3. Passwords
When chatting with a bot on a company website, you should never share your email passwords or two-factor authentication (2FA) codes. Legitimate services will never ask for this information via chat.
Because well-known bots like ChatGPT remember information to help you, that data could potentially be accessed by someone who gets into your account. For example, password-recovery details like your mother’s maiden name or the name of your childhood pet should never enter the chat.
4. Highly sensitive or embarrassing secrets
If you're dying to admit your participation in illegal activities or to confess something deeply personal, don't share it with your favorite chatbot.
Private health issues are also something you should never share, as some chatbots log conversations for training or remember details to better tailor future conversations to your needs.
5. Company information
Product prototypes, confidential meeting notes, and the CEO’s travel plans are all types of information that users can paste in without a second thought, and none of it belongs in a chatbot conversation.
However tempting it may be to use a chatbot to work on a secret prototype, those kinds of inputs are never a good idea.
6. Explicit or harmful content
Graphic violence, threats, and hate speech should not be a part of your chatbot experience.
Even a joke (“How do I hide a body?”) can be risky: some AI systems flag such content and may report it to the authorities.
7. Medical information
Although it’s no substitute for a medical professional, some ChatGPT users find the chatbot helpful for making sense of symptoms. If you’re that type of user, be sure to screen your prompts so they don’t include any personal information.
Do not enter prescription details or paste in medical charts. Instead, try a prompt such as “What types of exercises build muscle for an anemic woman aged 25-30?” Be general about yourself within the prompt.
Final thoughts
AI chatbots are incredible tools — but they’re not journals, therapists, or secure vaults. While companies like Google and OpenAI have guardrails, it’s still wise to be selective about what you share.
Understanding how different bots collect and handle your data is the first step in protecting your privacy. When in doubt, keep sensitive, personal, or creative information offline — or at least, out of the chat window.
Remember: if you wouldn’t want it repeated, reviewed, or resurfaced later, it probably doesn’t belong in a chatbot inquiry.