
Privacy Alert!

Never share these five things with ChatGPT

Rabiul Islam Tushar

Published: 25 Jun 2025

Illustration: Daily Sun Online


Artificial intelligence (AI) chatbots like ChatGPT have revolutionised how we seek information, draft emails, and even brainstorm ideas. However, as these tools grow more advanced, so do the risks of oversharing. Unlike human confidants, AI does not forget—every input may be processed, stored, or potentially exposed.

To safeguard your privacy, never share these five categories of information with ChatGPT or similar AI tools:

1. Personally Identifiable Information (PII)

  • Full name, home address, or phone number
  • National ID (NID) number, passport, or driving licence details
  • Birthplace, family members’ names, or biometric data

⚠️ Why it’s risky

AI platforms may log conversations to improve performance. While companies like OpenAI say such data is anonymised, a breach or misuse could still expose sensitive details to hackers or third parties.

2. Financial Secrets

  • Credit/debit card numbers and CVV codes
  • Bank account numbers and PINs
  • Bank account statements
  • Mobile phone numbers linked to Mobile Financial Services (MFS)
  • Cryptocurrency wallet keys and investment amounts

⚠️ Why it’s risky

Cybercriminals actively target financial data. Even if ChatGPT doesn’t store information permanently, a screenshot or accidental leak could lead to fraud or identity theft.

3. Passwords and Login Credentials

  • Email, social media, or banking passwords
  • Two-factor authentication (2FA) codes
  • Security question answers (e.g., “Mother’s maiden name”)

⚠️ Why it’s risky

AI chats are not protected the way a password manager’s encrypted vault is; your conversation history may sit in readable form on the provider’s servers. If that history is compromised, attackers could take control of your accounts.

4. Confidential Work Data

  • Unreleased product designs or patents
  • Internal company strategies or financial reports
  • Proprietary code or client contracts

⚠️ Why it’s risky

Many employers explicitly ban AI tools for confidential work. Leaks could violate non-disclosure agreements (NDAs) or intellectual property laws, risking legal consequences or job dismissal.

5. Private or Illegal Disclosures

  • Medical records or mental health struggles
  • Relationship conflicts or family disputes
  • Admissions of illegal activity (even jokingly)

⚠️ Why it’s risky

Unlike conversations with a doctor or lawyer, AI chats carry no legal privilege. In extreme cases, authorities could subpoena chat logs as evidence.

Bonus tip: How to use ChatGPT safely

While AI chatbots offer incredible convenience, protecting your privacy is crucial. Follow these simple rules to stay secure:

  • Use generic examples—replace real names, dates, or figures with fictional placeholders (see the sketch after this list).
  • Never paste sensitive documents—avoid sharing contracts, NID scans, or confidential files.
  • Treat chats as public—even deleted conversations may persist in backups or logs.
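
As a rough illustration of the first rule, the Python sketch below swaps obvious identifiers for fictional placeholders before anything is pasted into a chatbot. The PATTERNS table and the scrub helper are hypothetical names invented for this example, and the regular expressions are loose assumptions, not a complete PII detector.

    import re

    # Hypothetical placeholder table: each regex maps to the fictional
    # token that replaces it. The patterns are illustrative assumptions,
    # not an exhaustive PII detector.
    PATTERNS = {
        r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",     # email addresses
        r"\b(?:\+?88)?01[3-9]\d{8}\b": "[PHONE]",      # Bangladeshi mobile numbers
        r"\b(?:\d{17}|\d{13}|\d{10})\b": "[NID]",      # 10-, 13- or 17-digit NID formats
        r"\b\d{13,19}\b": "[CARD]",                    # likely payment-card numbers
    }

    def scrub(text: str) -> str:
        """Replace anything matching a known pattern with its placeholder."""
        for pattern, placeholder in PATTERNS.items():
            text = re.sub(pattern, placeholder, text)
        return text

    if __name__ == "__main__":
        sample = "Reach me at rahim@example.com or 01712345678; card 4111111111111111."
        print(scrub(sample))
        # -> Reach me at [EMAIL] or [PHONE]; card [CARD].

Even with a filter like this, the safest habit is still to read whatever you are about to paste: no pattern list catches every identifier.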

The Bottom Line

ChatGPT is a tool, not a vault. Treat it like a crowded room: if you wouldn’t say it aloud there, don’t type it here.
