I’ve been using ChatGPT practically since its release, and experience has taught me that there are certain things it’s best never to ask this AI to do, unless you want them to backfire on you. Take note of these things I’ll never ask ChatGPT again, and I recommend you do the same.
Artificial intelligence is everywhere. It assists us at work, answers questions in seconds, and even helps us organize our daily lives. But, like any technological advance, it also has a dark side. ChatGPT and other AI chatbots aren’t harmless tools: they can learn from you, remember details from your conversations, and, in the worst cases, put you in a compromising situation.
ChatGPT learns from what you say, even if you don’t realize it
To understand the risks, you first need to understand how these chatbots work. ChatGPT is trained on millions of texts from the internet, but its learning doesn’t end there: every conversation you have with it is a potential new source of information. Although OpenAI offers privacy controls, companies do use these conversations to train more advanced models. In fact, there have already been incidents in which private user conversations were exposed.
What does this mean for you? Anything you share with the AI can be used to improve its responses, but it can also be stored on external servers or, in the worst case, leaked in a cyberattack.
Artificial intelligence can be a great ally in many tasks, but it still carries risks. Even though OpenAI and other companies say their models are designed to respect privacy, any information you enter into ChatGPT could be stored, used to train future models, or, in the worst case, leaked. That’s why it’s essential to know what data you should never share with this tool if you want to protect your security and privacy.
Personal and financial information
It seems obvious, but many people make the mistake of providing information like their full name, address, phone number, or bank details. Some even ask the AI to remind them of passwords or account details. A fatal mistake.
Why? Even when ChatGPT’s “memory” is disabled, the data you enter can be recorded, used to improve the model, and, in some cases, made accessible to OpenAI. If there’s ever a security breach, your information could be exposed.
Never share:
- Credit card numbers
- Bank accounts or payment details
- Passwords or access keys
- Personal information that can identify you
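If you paste text into a chatbot regularly, a small client-side check can catch a card number before it ever leaves your machine. Here is a minimal Python sketch of the idea (the function names are my own, and the Luhn checksum only flags *plausible* card numbers, it can’t prove one is real):

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum: True for plausible payment-card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def contains_card_number(text: str) -> bool:
    """Flag any 13-19 digit run (spaces/dashes allowed) that passes Luhn."""
    for match in re.finditer(r"(?:\d[ -]?){13,19}", text):
        digits = re.sub(r"\D", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False
```

A check like this could sit in a clipboard hook or a browser extension and warn you before the paste goes through; random digit strings that fail the checksum (order numbers, tracking codes) won’t trigger it.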
Details of your work or confidential projects
In 2023, Samsung banned ChatGPT among its employees after discovering some were uploading confidential source code to the platform. Yes, sharing your company’s information with AI can be that serious.
If you’re working on sensitive projects, avoid using ChatGPT to summarize, brainstorm, or fix code. Everything you input could be used to train future AI models, and that includes business data that could end up in the wrong hands.
Political, ideological, or personal opinions
Chatbots don’t have opinions of their own, but the companies behind them do analyze user conversations. Google, OpenAI, and other large companies can use that data to train their models or even for ad targeting and trend analysis. If you start talking about politics, ideology, or any other personal topic, there’s no guarantee that information won’t be stored or used in some way. It’s best to be cautious.
Think about it: If a hacker or malicious entity were to gain access to OpenAI’s databases, what could they do with all that information about your opinions and beliefs?
Information about your location or daily routines
Never tell ChatGPT things like:
- “I go to the gym every morning at 7.”
- “I live in neighborhood X, on street Y.”
- “I work at company Z, in such a city.”
This data may seem harmless, but if someone accessed it, it could be used to track your location, predict your movements, or even plan a targeted attack. Think about cybercriminals using AI to gather data on potential victims. Don’t give out information that will make it easier for them to track you.
Intellectual property or creative ideas
If you’re a writer, programmer, designer, or work in any creative field, never use ChatGPT to store your ideas or ask it to develop them in full. Why?
- There is no guarantee that your ideas will not be stored or reused.
- You could lose control over your own idea. AI learns from what you share and can be “inspired” by your ideas for future responses to other users.
If you have an important project, a script, a design, or any original idea, keep it private and use secure tools to develop it.
Companies use us to train their models
OpenAI, Google, Microsoft, and other major tech companies continue to train their AI models, and we’re part of that process. Every conversation we have with AI is another piece of data that helps improve its responses. The question is: what happens to all that information in the long term?
To date, regulation of how this data is used, and of the security measures protecting it, is still catching up. Until the laws do, the best protection is to avoid sharing sensitive information.
How to use ChatGPT safely
If you want to take advantage of AI without putting your privacy at risk, follow these recommendations:
- Do not enter personal or financial information.
- Avoid sharing information about your work or projects.
- Use ChatGPT as a support tool, not as a personal journal.
- If you need to share something sensitive, check the platform’s privacy options.
- Be aware that anything you write could be used to train the AI.
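One practical way to follow the checklist above is to scrub obvious personal data out of a prompt before sending it. The Python sketch below is illustrative only: the patterns and placeholder tokens are my own, and simple regexes will never catch every kind of PII.

```python
import re

# Heuristic patterns for common PII. Order matters: card numbers are
# matched before the looser phone pattern so they aren't mislabeled.
PII_PATTERNS = [
    ("[CARD]",  re.compile(r"\b(?:\d[ -]?){13,19}\b")),
    ("[PHONE]", re.compile(r"\+?\d[\d -]{7,}\d\b")),
    ("[EMAIL]", re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")),
]

def scrub(text: str) -> str:
    """Replace likely PII with placeholder tokens before pasting into a chatbot."""
    for token, pattern in PII_PATTERNS:
        text = pattern.sub(token, text)
    return text
```

For example, `scrub("card 4111 1111 1111 1111")` returns `"card [CARD]"`. The chatbot still gets enough context to help you, but the sensitive values never leave your machine.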
Use AI, but wisely
ChatGPT is a powerful tool, but you need to use it carefully. It’s not your friend, your confidant, or a safe place to store sensitive information. It’s an AI designed to learn from you. If you use it intelligently, it can be a great help in your daily life. But if you share too much, you could be unwittingly giving away valuable data.