
11 Tasks You Should Never Give ChatGPT

2025-07-03 · Nelson Aguilar · 5 minute read
Artificial Intelligence
ChatGPT
Productivity

ChatGPT has proven to be a remarkably useful tool. It can help you create good prompts, brainstorm ideas, and even assist with complex tasks like coding or picking a winning NCAA bracket. As a daily user, I'm a fan, but I'm also keenly aware of its limitations.

While it's great for trying new recipes or learning a language, you shouldn't give ChatGPT free rein over every aspect of your life. It's not a master of all trades and can be dangerously unreliable. The AI is known to 'hallucinate' information and present it as fact, it often works from outdated data, and it projects incredible confidence even when it's completely wrong. This is true for other generative AI tools as well.

The risks increase significantly when dealing with important matters like taxes, medical advice, legal issues, or finances.

If you're unsure when to use AI and when to rely on your own intelligence, here are 11 scenarios where you should absolutely choose another option over ChatGPT.


(Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against ChatGPT maker OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

1. Diagnosing Health Issues

While it might be tempting to enter your symptoms into ChatGPT out of curiosity, the results can be a terrifying spiral of worst-case scenarios, from dehydration to cancer. I once described a lump on my chest and was told it could be cancer. My licensed doctor, however, confirmed it was a lipoma, a non-cancerous growth found in about 1 in 1,000 people. While AI can help you draft questions for your doctor or translate medical jargon, it cannot order labs, perform an examination, or carry malpractice insurance. Know its limits.

2. Handling Your Mental Health

ChatGPT can suggest grounding techniques, but it cannot provide real help in a mental health crisis. Some people use it as a substitute therapist, but it's a pale imitation at best and incredibly risky at worst. A human therapist has lived experience, can read body language, and offers genuine empathy—things an AI can only simulate. A licensed professional is bound by legal and ethical codes to protect you; ChatGPT is not. Its advice can misfire or overlook critical red flags. For deep, messy, human work, trust a trained human.

If you or someone you know is in crisis, please call or text 988 in the US, or contact your local crisis hotline.

3. Making Immediate Safety Decisions

If your carbon monoxide alarm goes off, do not ask ChatGPT if you're in danger. Go outside first, and ask questions later. An AI cannot smell gas, detect smoke, or call an emergency crew. In a crisis, every second spent typing is a second lost. Use the chatbot as a post-incident explainer, never as a first responder.

4. Getting Personalized Financial or Tax Advice

ChatGPT can explain what an ETF is, but it doesn't know your personal financial situation, your state tax laws, or your risk tolerance. Its training data might not be current with the latest tax codes or rate changes, meaning its advice could be outdated. Don't dump your 1099s into the chatbot for a DIY tax return. A CPA can find hidden deductions and flag costly mistakes. When real money and IRS penalties are involved, call a professional.

Remember, anything you share with an AI, including financial details, can become part of its training data.

5. Dealing with Confidential or Regulated Data

Never input confidential information into ChatGPT. This includes embargoed press releases, client contracts, medical charts, or anything covered by privacy laws like CCPA, HIPAA, or GDPR. Once sensitive information like your Social Security number or passport details is in the prompt, you lose control over where it's stored or who might review it. If you wouldn't post it in a public Slack channel, don't paste it into ChatGPT.

6. Doing Anything Illegal

This should be self-explanatory.

7. Cheating on Schoolwork

With modern AI, the scale of academic dishonesty has grown immensely. Plagiarism detectors are constantly improving to spot AI-generated text, and professors are getting good at recognizing the 'ChatGPT voice'. The risks of suspension or expulsion are real. Use ChatGPT as a study partner, not a ghostwriter. Ultimately, you are only cheating yourself out of an education.

8. Monitoring Breaking News and Up-to-Date Information

Even with its ability to browse the web, ChatGPT is not a real-time news source. It requires a new prompt for every update. For fast-moving events, live data feeds, official press releases, and streaming news coverage are still your best and most reliable sources.

9. Gambling

While I once had some luck using ChatGPT for a sports bet, I would never recommend it. The AI is prone to hallucinating player stats, misreporting injuries, and getting win-loss records wrong. I only won because I double-checked every piece of information against real-time odds. Don't rely on it to predict tomorrow's outcomes.

10. Drafting a Will or Legally Binding Contract

ChatGPT is great for explaining basic legal concepts, but asking it to draft a legal document is a huge risk. Estate and contract laws vary by state and even county. A small error, like a missing witness signature, can invalidate the entire document. Use the AI to build a list of questions for your lawyer, then pay the lawyer to create a document that will hold up in court.

11. Making Art

This is a personal opinion, but I believe AI should not be used to create art that you pass off as your own. I use ChatGPT for brainstorming and headline ideas—supplementation, not substitution. It's one thing to use a tool to enhance your creativity, but another to have it replace the creative process entirely.
