Hallucinations
Today, I came across an interesting term — AI hallucinations. It's strange to even think about, but the term is pretty much self-explanatory: when incorrect, flawed, or biased data is fed into AI training models, the engine perceives things that are not true or don't exist. I instantly had a flashback of all my ChatGPT queries since its inception; it's safe to say that it hallucinates — a lot. I'm sure you all agree. I don't believe this technology can ever overcome this flaw, just as there's no transformative treatment for this neurological disorder when it comes to humans.
This term stopped me in my tracks, but it was the word 'hallucinations' that caught my attention first, because I deal with this formidable disorder every day — a family member suffers from it.
The affected person can suddenly point at an elephant outside the window; you would consider checking it out for a moment, even knowing there are no elephants anywhere near the house, not even hundreds of kilometres away. At another moment, they see big monkeys with elephants and ants on the sheets, or a baby in the form of a cushion. These are only a few examples from a few minutes of their daily life.
After reading about AI hallucinations, my next search was: how do hallucinations work in humans? Wrong signals travelling through neural circuits, that much I knew. But then I read about altered imagery — nobody knows how it forms. For now, scientists and doctors believe it's a complex brain activity, shaped by internal and external experiences, that produces flawed sensory information or signals within the brain.
In the last few months, I have learned that hallucinations are different from illusions and delusions — they are distinct sensations that feel almost like reality. Moreover, hallucinations are not limited to sight; they can also affect hearing, touch, and smell. The affected person's pupils may look dilated too.
It's strange how this flaw, or disorder, connects the digital realm and the real world. Or maybe it's just me who fabricated this bridge because of my own experiences. But there's no denying one huge difference: when a machine hallucinates, it's only a flawed output; when a person hallucinates, their whole life turns upside down. They feel isolated and alone, and can't differentiate between reality and imagination. They are emotionally vulnerable, and this takes a toll on their loved ones too.
In the back of my mind, it was this eerie similarity between AI hallucinations and my personal experiences that made me pause, read, reflect, and write about it. Apologies if it went over your head or if this piece doesn't speak to you. I write for myself.
I had wide-awake nightmares as a child. In that sweet spot between awake and asleep, I would hallucinate creatures in my bedroom that obviously were not there. I can still picture some of them to this day. Most of them were innocent enough, like Kermit the Frog, but for a while there I was afraid to go to sleep at night. I grew out of it around 10 years of age.
That must feel terrible at that young age.
It's not linked to your situation, but I don't allow my kids to watch creepy or horror cartoons. Whenever they do, they get scared in their sleep, or whenever they are alone they fear monsters will come and eat them. 🫠 It's annoying when they get extra clingy for no reason. But yeah, sometimes it is serious and you need to listen and counsel them to help them out of the situation.
I loved your essay, in which you very ingeniously managed to connect the phrase "AI hallucinations" with the hallucinations that people suffer from.
I loved reading you.
I am glad you liked reading it. I wasn't so sure about it but had to get it out of my system.
Venting helps us resume life's journey with more spirit. I loved reading you.