
11 Things You Should Never Ask ChatGPT

2025-10-31 · Nelson Aguilar · 5 minute read
Artificial Intelligence
ChatGPT
Technology


With AI tools like ChatGPT changing our world at a dizzying pace, it's easy to view them as a solution for every problem. But there's a significant risk: ChatGPT is a master of being 'convincingly wrong.' It can generate answers that are biased, outdated, or completely false, all while sounding entirely authoritative.

While this is harmless for creative tasks like writing a poem, it's a potential disaster when you're seeking advice on your health, finances, or legal matters. A wrong answer in these critical areas can have severe, real-world consequences. Before you make a critical mistake, it's essential to understand the AI's limitations. Here are 11 topics you should avoid asking ChatGPT for advice on.

(Note: Ziff Davis, CNET's parent company, has a pending lawsuit against OpenAI, the creator of ChatGPT, concerning copyright infringement related to AI training.)


1. Diagnosing Physical Health Issues

While it might be tempting to input your symptoms into ChatGPT, the results can range from benign to terrifying, causing unnecessary anxiety. For example, when the author input a symptom—a lump on their chest—ChatGPT suggested it could be cancer. A visit to a licensed doctor confirmed it was a lipoma, a common and non-cancerous growth. While AI can be useful for preparing for a doctor's visit by helping you draft questions or understand medical terms, it cannot replace a professional diagnosis. An AI cannot perform a physical exam, order lab tests, or carry malpractice insurance.

2. Taking Care of Your Mental Health

ChatGPT can offer basic grounding techniques, but it is not a substitute for a human therapist, especially in a crisis. It lacks lived experience, cannot interpret body language or tone, and has no capacity for genuine empathy—it can only simulate it. A licensed therapist is bound by legal and professional codes to protect you. ChatGPT has no such obligations, and its advice can be misguided or reinforce hidden biases from its training data. For the deep, complex work of mental health, trust a trained human professional. If you or someone you know is in crisis, please contact the 988 Suicide & Crisis Lifeline in the US or your local emergency number.

3. Making Immediate Safety Decisions

In an emergency, every second counts. If a carbon-monoxide alarm goes off, your first action should be to get to safety and call 911, not ask an AI for advice. A large language model cannot assess your immediate environment—it can't smell gas or see smoke. The time spent typing a prompt is time lost from taking life-saving action. Use a chatbot for post-incident information, not as a first responder.

4. Getting Personalized Financial or Tax Planning

ChatGPT can explain general financial concepts, but it knows nothing about your specific financial situation—your income, debts, tax status, or retirement goals. Its training data may not include the latest tax laws or economic changes, making its advice potentially outdated and costly. Relying on it for a DIY tax return could cause you to miss valuable deductions or make errors that lead to IRS penalties. Furthermore, sharing sensitive financial details like your Social Security number or bank information with a chatbot is a significant security risk. For financial and tax matters, always consult a certified professional.

5. Dealing with Confidential or Regulated Data

Never input confidential or sensitive information into ChatGPT. This includes everything from work-related documents under a non-disclosure agreement to personal data covered by privacy laws like HIPAA or GDPR. Once you enter information into the prompt, you lose control over where it's stored, who can see it, and how it might be used to train future AI models. If you wouldn't post the information in a public forum, do not paste it into an AI chatbot.
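For developers wiring a chatbot into internal tools, the same rule applies at the API boundary. Below is a minimal Python sketch of a pre-send redaction step; the patterns and the redact helper are illustrative assumptions, not a complete PII scrubber, and regulated data still belongs behind vetted tooling and proper data-processing agreements.

    import re

    # Illustrative patterns only; real PII detection needs far broader coverage.
    PATTERNS = {
        "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
        "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    }

    def redact(prompt: str) -> str:
        # Replace obvious identifiers with placeholders before the prompt leaves your network.
        for label, pattern in PATTERNS.items():
            prompt = pattern.sub(f"[{label} REDACTED]", prompt)
        return prompt

    print(redact("My SSN is 123-45-6789, reach me at jane@example.com."))
    # Prints: My SSN is [SSN REDACTED], reach me at [EMAIL REDACTED].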

6. Doing Anything Illegal

This should be self-explanatory: do not ask an AI for advice or assistance with illegal activities.

7. Cheating on Schoolwork

Using ChatGPT to write your assignments is a high-risk gamble. Plagiarism detection tools like Turnitin are becoming increasingly sophisticated at identifying AI-generated text, and educators are getting better at spotting its distinct writing style. The consequences—suspension, expulsion, or having a professional license revoked—are severe. It's far better to use ChatGPT as a study aid for brainstorming or understanding concepts rather than as a ghostwriter. Ultimately, having AI do the work for you only cheats you out of your education.

8. Monitoring Information and Breaking News

Even with its ability to search the web for real-time information, ChatGPT is not a live news feed. It cannot provide continuous updates on a developing story. Each new piece of information requires a new prompt. For fast-moving events where speed is critical, rely on established news sites, live data feeds, and official press releases.

9. Gambling

Using AI for gambling advice is incredibly risky. The author notes one lucky win based on ChatGPT's suggestions but strongly advises against making it a habit, having seen the AI 'hallucinate' player stats, injuries, and records. An AI cannot predict the outcome of a sporting event. Any 'win' is based on luck, not reliable insight.

10. Drafting a Will or Other Legally Binding Contract

ChatGPT can explain what a living trust is, but it should not be used to draft one. Legal requirements for documents like wills and contracts are highly specific and vary by state and even county. A small mistake, like a missing witness signature, could invalidate the entire document. Use AI to prepare questions for a lawyer, but always have a licensed attorney draft any legally binding text to ensure it will hold up in court.

11. Making Art

This is the author's personal view, but it raises an important ethical point. While AI can be a powerful tool for brainstorming or supplementing the creative process, using it to generate art and passing it off as your own is a substitution, not a creation. The author encourages using AI as an assistant but not as a replacement for genuine human creativity.

Read Original Post
