We have robots that lie now, but what is a lie? A lie is a known untruth, usually told to make the person, or robot, that is lying sound good.
It’s easier to say than to do, after all.
The anatomy of a ChatGPT or Bing lie is fairly simple. These tools have studied trillions of pieces of text from around the world (with little regard for whether those pieces of text are accurate), and they use mathematical probability to figure out what to say next, given any preceding text.
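That "figure out what to say next" step can be illustrated with a toy sketch. The snippet below is not how ChatGPT actually works (real models use neural networks over billions of parameters); it is a minimal bigram model that counts which word tends to follow which in a tiny made-up corpus, then parrots back the most probable continuation, accurate or not.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus. Note: nothing here checks whether any
# statement in the corpus is true -- the model only counts patterns.
corpus = "the truth is hard . the truth is elusive . the lie is easy .".split()

# Count how often each word follows each other word (bigram counts).
follow_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow_counts[prev][nxt] += 1

def most_likely_next(word):
    """Return the statistically most probable word to follow `word`."""
    return follow_counts[word].most_common(1)[0][0]

print(most_likely_next("truth"))  # "is" -- the most frequent follower
```

The model will continue any prompt with whatever was most common in its training text. Whether that continuation is true never enters the calculation, which is the point.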
It’s a lot like a normal flesh and blood lie. You’re asked by your friend or your lover or your boss for something, and you take what you think they want to hear and you say it.
What is more difficult, much more difficult, is telling the truth.
The truth is elusive. It’s hard to nail down. You could try your hardest but, given the limits of language, the situation, and your emotions, still fail, and still lie.
The truth is unsatisfying. It sits there like a boulder. It’s hard and not pretty and won’t give in to force.
And the truth is often destructive. Most things that are real are destructive. The truth points to the chaos of the universe and highlights it. It says, here is a problem, without a solution. The truth is a matter for the present.
Predicting the future is never true, after all; it can only be well-intentioned, like a lie.