14 Comments

100% agree on this. Right alongside Hallucinating AI is Biased or Racist AI, which suffers from the same problem: error. What we see is an accuracy problem, because it isn't the people coding AI intentionally adding ethical bias. What's funny is that there are three layers of bias in an algorithm:

1. Ethical Bias (typically an artifact of an org)

2. Measurement or Data Bias

3. Mathematical Bias (how we code our algorithm)

But again, it all boils down to error, not ethics, not consciousness, not hallucinations.

https://polymathicbeing.substack.com/p/eliminating-bias-in-aiml


You make a good point about the accuracy of the terms we use, and this from someone who used the word "Hallucinating" in an article sharing my predictions about AI just last week. I think part of it is that because AI is supposedly "the next big thing" (another debate entirely), we feel we have to find special words to describe it. I also think part of it is that "hallucination" is an interesting word to say, read, or write. I love delicious words. Error is distinctly more correct; it's just not as "fun."

But to your main point about anthropomorphism: you point out something that's often misunderstood about AI. It's not actually *intelligent.* It's just using vast amounts of data and clever algorithms to predict things about the dataset it's querying and presenting the result to the user.

It doesn't "think"; it's just an extremely clever parlor trick.


So much richness here about AI and the languaging around it, but I am paused, thinking about your acupressure hallucinations. It immediately made me think of synesthesia. Not having it myself, I have no idea if it is similar, but synesthesia, particularly among artists, is something that fascinates me endlessly.


The hallucinations that I have seen coming out of LLM-based “AIs” are complete bibliographical references that do not exist. I would call it an error if the date or one of the authors of the reference were incorrect, but not when the title, the authors, and the date are all invented.

In neural-network-based computer vision, the hallucinations are the “AIs” saying that a patch of dirt on the floor is a person.
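
For what it's worth, that kind of false positive is also just an ordinary prediction error under the hood. Here is a hedged sketch, with made-up class names and scores rather than any real model's output, of how a confidently wrong label is nothing more than the highest-scoring entry in a probability distribution:

```python
# Hypothetical class probabilities from a detector looking at a dirty floor.
# The labels and numbers are invented; the point is that "seeing a person"
# is just the argmax of an ordinary distribution -- an error, not a perception.
scores = {"floor": 0.31, "person": 0.52, "shadow": 0.17}

predicted = max(scores, key=scores.get)
confidence = scores[predicted]

print(f"predicted '{predicted}' with {confidence:.0%} confidence")
# -> predicted 'person' with 52% confidence
```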
