Close up of ChatGPT discussing How-To Geek on a Google Pixel 7 Pro
Justin Duino / How-To Geek

While ChatGPT is a powerful AI tool capable of generating coherent and relevant responses, it has limitations. It's not a secure channel for sensitive information, a reliable source for legal or medical advice, a substitute for human decision-making or professional mental health support, a definitive source of truth, or a precise tool for complex mathematics.
ChatGPT is incredibly powerful and has had a transformative effect on how we interact with computers. However, like any tool, it’s important to understand its limitations and to use it responsibly. Here are five things you shouldn’t use ChatGPT for.

The Limits of ChatGPT

Before we delve into the specifics, it's crucial to understand the limitations of ChatGPT. First, it can't access real-time or personal data unless you provide it in the conversation or enable ChatGPT's plugins. Without browsing enabled (which requires ChatGPT Plus), it generates responses based on patterns and information it learned during its training, which includes a diverse range of internet text up until its training cut-off in September 2021. But it doesn't "know" anything in the human sense or understand context the way people do.

While ChatGPT often generates impressively coherent and relevant responses, it’s not infallible. It can produce incorrect or nonsensical answers. Its proficiency largely depends on the quality and clarity of the prompt it’s given.

RELATED: 8 Surprising Things You Can Do With ChatGPT

1. Don’t Use ChatGPT With Sensitive Information

Given its design and how it works, ChatGPT is not a secure channel for sharing or handling sensitive information. This includes financial details, passwords, personal identification information, or confidential data.

Recently, OpenAI has added a new sort of “incognito” mode to prevent your chats from being stored or used for future training, but only you can decide whether you trust that promise. Some companies, such as Samsung, have already banned the use of ChatGPT by their employees for work purposes because of data leaks.

2. Don’t Use It for Legal or Medical Advice

ChatGPT is not certified and cannot provide accurate legal or medical advice. Its responses are based on patterns and information available in the data it was trained on. It can’t understand the nuances and specifics of individual legal or medical cases. While it might provide general information on legal or medical topics, you should always consult a qualified professional for such advice.

RELATED: The 6 Best Uses for ChatGPT 4

GPT is a promising technology with real potential to aid legitimate medical diagnosis, but that will come in the form of specialized, certified medical AI systems down the line, not the general-purpose ChatGPT product available to the public.

3. Don’t Use It To Make Decisions For You

ChatGPT can provide information, suggest options, and even simulate decision-making processes based on prompts. But it’s essential to remember that the AI doesn’t understand the real-world implications of its output. It’s incapable of considering all the human aspects involved in decision-making, such as emotions, ethics, or personal values. Therefore, while it can be a useful tool for brainstorming or exploring ideas, humans should always make the final decisions.

This is particularly true for ChatGPT 3.5, which is the default ChatGPT model and the only one available to free users. GPT 3.5 has significantly weaker reasoning ability than GPT 4!

RELATED: GPT 3.5 vs. GPT 4: What's the Difference?

4. Don’t Use It As a Trusted Source

While ChatGPT is trained on a vast amount of information and often provides accurate responses, it’s not a definitive source of truth. It can’t verify information or check facts in real-time. Therefore, any information received from ChatGPT should be cross-verified with trusted and authoritative sources, especially regarding important matters like news, scientific facts, or historical events.

ChatGPT is prone to “hallucinating” facts that sound true, but are completely made up. Be careful!

5. Don’t Use ChatGPT as a Therapist

While AI technologies like ChatGPT can simulate empathetic responses and offer general advice, they’re not substitutes for professional mental health support. They can’t deeply understand or process human emotions.

AI cannot replace the nuanced understanding, emotional resonance, and ethical guidelines inherent to human therapists. For any serious emotional or psychological issues, always seek help from a licensed mental health professional.

6. Don’t Use ChatGPT For Math!

At first glance, helping with your math homework might seem like a natural application for an AI like ChatGPT. However, ChatGPT’s forte is language, not mathematics. Despite its vast training data, its ability to accurately perform complex math operations or solve intricate problems is limited.
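For arithmetic you actually need to be right, it's safer to compute the answer directly rather than trust a language model. As a small illustrative sketch (the deposit and interest figures here are made up for the example), a few lines of Python will do what ChatGPT might fumble:

```python
# Instead of asking a chatbot to do arithmetic, compute it directly.
# Illustrative example: compound interest on a $1,000 deposit.
principal = 1_000.00
rate = 0.05   # 5% annual interest
years = 10

# Compound interest formula: A = P * (1 + r)^t
amount = principal * (1 + rate) ** years
print(f"Balance after {years} years: ${amount:.2f}")  # Balance after 10 years: $1628.89
```

Unlike a chatbot's answer, this result is deterministic: run it twice and you get the same number both times.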

While ChatGPT is an impressive tool with a wide range of applications, it’s crucial to understand its limitations. Using this tool responsibly will help ensure that it serves as a beneficial aid rather than a misleading or potentially harmful source of information.

RELATED: How to Create ChatGPT Personas for Every Occasion

Sydney Butler
Sydney Butler has over 20 years of experience as a freelance PC technician and system builder. He's worked for more than a decade in user education and spends his time explaining technology to professional, educational, and mainstream audiences. His interests include VR, PC, Mac, gaming, 3D printing, consumer electronics, the web, and privacy. He holds a Master of Arts degree in Research Psychology with a focus on Cyberpsychology in particular.