What should you never tell ChatGPT or any other GenAI chatbot?

"Preventing the problem is far easier than solving it after the fact..."

Image: A British version of the OpenAI logo, generated by ChatGPT

We're using chatbots to write our homework, offer life advice and even pose as simulated lovers.

But a British internet provider has issued an urgent reminder that interacting with ChatGPT and other GenAI bots is "not all AI-generated sunshine and rainbows".

Users should take care to remember that these chatbots are trained on user interactions, and that everything you give them is stored on company servers and tied to the account you use to access them.

The sheer amount of data collected and analyzed by AI firms has raised significant security concerns. The latest moral panic was sparked by DeepSeek, a Chinese AI model which was banned on Australian government devices due to data privacy and security risks.

So what should you avoid telling a chatbot under any circumstances?

“In general, anything that you wouldn’t tell a stranger you shouldn’t tell a chatbot,” said Rupert Bedell, Managing Director at Fasthosts. “There’s the obvious stuff like bank details, card information, and account passwords, but really you should avoid sharing anything personal with them.

"Full names, addresses, telephone numbers, remember that everything you say to a chatbot can and is being recorded and saved. If the servers ever experienced a security breach, any and all information tied to your account including your chat history could be stolen by malicious hackers, so yes, keep it to yourself.

“It’s not just your own information that you shouldn’t share, but also any sensitive data about your workplace. Samsung banned the use of ChatGPT within the company last year after an employee pasted sensitive code into the tool, and the company was concerned that it had no way of deleting the information from the server.

“Avoiding giving chatbots company data isn’t just about protecting the business, though; it’s also about protecting yourself. Revealing or sharing sensitive information about a business, even without any malicious intent, can be grounds for instant dismissal and even legal action, so it should be treated the same way you would treat your own personal information.

“If you’re worried that you may have already revealed too much to a chatbot, you can request that your data not be used for training purposes, but the only way to remove your data entirely is to delete your account.

"However, these chatbots often have privacy policies with vague wording around how long your data is stored for after account deletion, and many are still sceptical of how the data is handled. It’s very much a case of preventing the problem being far easier than solving it after the fact.”

Have you got a story or insights to share? Get in touch and let us know. 

Follow Machine on X, BlueSky and LinkedIn