Realistic Hallucinations

Leor Grebler
1 min read · Mar 22, 2023


Generated by author using Midjourney

I asked GPT-4 for background on a specific question in history. It returned information I found fascinating, but in the end I was left questioning whether any of it was real.

In one instance, I asked for a list of books about the topic and it authoritatively provided five real-sounding titles. It even pointed out that one book was only adjacent to the topic I was researching but might still shed some light on it.

I was thrilled to see one book that appeared to be a dedicated resource on the topic and thought it would be worth acquiring. A Google search and an Amazon search yielded nothing. I then asked GPT-4 to summarize the book so I could understand what it covered.

Poof. "Sorry, it doesn't actually exist." What!? The chatbot then carried on as if nothing had happened. The lack of emotional intelligence showed: it acted like a compulsive liar who makes something up and, when confronted, just moves past it like it's no big deal.

"Trust but verify" would be a good mantra for using generative AI. Verification AIs might be the next frontier.
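As a minimal sketch of that verification step, here's one way to check whether a suggested book title actually turns up in the Google Books index (the titles below are placeholders, and a real verifier would also want to match authors and editions):

```python
import json
import urllib.parse
import urllib.request

def book_exists(title: str) -> bool:
    """Return True if any volume in Google Books matches the title."""
    query = urllib.parse.quote(f'intitle:"{title}"')
    url = f"https://www.googleapis.com/books/v1/volumes?q={query}"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    # totalItems is 0 when nothing in the index matches the title
    return data.get("totalItems", 0) > 0

# Hypothetical titles for illustration only
for title in ["The Design of Everyday Things", "A Book GPT-4 Invented"]:
    status = "found" if book_exists(title) else "no match -- possibly hallucinated"
    print(f"{title}: {status}")
```

A lookup like this only proves a title exists somewhere, not that the model described it accurately, but it would have caught the phantom book above before I went shopping for it.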

