
What information should you never share with ChatGPT?

ChatGPT and other chatbots are becoming an increasingly common part of users' daily lives, with many people relying on these systems for a wide range of topics.

But as people interact with these systems, they often share sensitive information, which can pose risks to privacy and data security.

Experts in the field of artificial intelligence warn that, while chatbots can provide fast and accurate answers, certain types of information should never be shared with them, both for reasons of privacy and because of the possibility of data misuse.

Companies developing this technology, including OpenAI and Google, also highlight the issue.

OpenAI asks users not to share sensitive information, while Google reminds Gemini users not to enter confidential data that they would not want anyone else to see.

Chatbot conversations can be used to train future artificial intelligence models, which creates a risk that sensitive data could leak.

According to experts, these are five categories of information that should not be shared with chatbots:

Personal identification information: ID card and passport details, tax numbers, date of birth, address and telephone number.

Medical data: Test results or other personal health-related information.
