AI is not hallucinating
When an “AI” (which is what they call language models these days, just because they are large) can’t pinpoint the exact answer, it grabs something out of its bag that is “good enough”. This process is called hallucinating. But is that really the right word?
What is a lie?
A lie is a statement that is not true, that the one telling it knows is not true, and that is certainly not the correct answer the listener/reader wants to hear or read.
Now, let’s apply that to the process of hallucinating:
- is it true? Most certainly not.
- does the AI know it’s not true? Of course it does, otherwise the process of hallucination would never have started in the first place.
- is it what the listener/reader wants to hear or read? No, otherwise they would simply have asked for it.
What is the problem with this?
“Hallucinated” answers are often taken at face value, even more so than when a human is “hallucinating” (well, lying). And since there are no additional clues like the tell-tale signs of a human telling a lie (blinking, sweating etc.), a human recipient of the output has no way to determine the truth value of the statement being made, other than maybe reading the logs.
And the big problem with this is that we rely more and more on AI to do our work, keep us safe, and so on.
This is what happens when the Tesla “Autopilot” (also one of those overly confident words) suddenly doesn’t see the lines on the road anymore and, rather than waking the driver and aborting the operation, simply assumes where the lane continues, driving your car right into a barrier or oncoming traffic. Or into the bushes, if you’re lucky¹.
And this is what some lazy lawyers in New York discovered when they let ChatGPT write their case filings. When ChatGPT didn’t know the answer, it simply made up legal documents to cite that didn’t exist². The judge decided this was a lie, not a “hallucination” or “technical glitch”. And I’m with the judge on this one.
Conclusion
Maybe the best way to describe these LLMs/AIs is what Kevlin Henney said about them:
[ChatGPT] is a people pleaser, that is a savant and also sociopathic and easily bought. It is not required to tell you the truth, it is just required to tell you things that keep you happy.