The Mythology Of Conscious AI
Anil Seth:
If we confuse ourselves too readily with our machine creations, we not only overestimate them, we also underestimate ourselves.
I also like his suggestion that we stop using the word “hallucinate”:
The language that we ourselves use matters too. Consider how normal it has become to say that LLMs “hallucinate” when they spew falsehoods. Hallucinations in human beings are mainly conscious experiences that have lost their grip on reality (uncontrolled perceptions, one might say). We hallucinate when we hear voices that aren’t there or see a dead relative standing at the foot of the bed. When we say that AI systems “hallucinate,” we implicitly confer on them a capacity for experience.
In other words, we use language that implies consciousness where none actually exists.