Google rolled out AI Overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.
No, hallucination is a really good term. The output can be super confident and seemingly correct but still completely made up.
I think delusion might be a better word. You can hallucinate and know it's not real.
My experience with certain chemicals suggests this is true.
It’s a really bad term because it’s usually associated with a mind, and LLMs are nothing of the sort.