The 11 Times You Should Close ChatGPT and Use Your Own Brain

AI is not an infallible search engine, but many people take its word as gospel. There are things that ChatGPT and its peers can handle on their own, but there are hundreds of tasks they simply cannot be trusted with. These models cannot "know" things; they can only predict the next most likely word based on patterns. This means they can deliver a somewhat satisfactory answer confidently, or a completely incorrect response, otherwise known as a "hallucination," that can't be trusted.


In some cases, an AI hallucination is relatively harmless. But relying on it for guidance on your finances, health or legal matters is a recipe for disaster. With those questions, a single incorrect answer can lead to serious real-world consequences.

This story is part of 12 Days of Tips, helping you make the most of your tech, home and health during the

...
