What happens when you ask AI to describe something it doesn't know? Hallucinations. That's what I found when I asked the old version of ChatGPT about inkubatour.ch. Then I tried the latest version, with augmented Web search. The result? Accurate, timely, and real.

This morning I ran an experiment with ChatGPT's new Web Search functionality. First, I asked the prior version to describe our inkubatour.ch program; the original information is right there on the Web at inkubatour.ch. I kept it simple, typing "Write an extended description of the inkubatour.ch program." into the o1-preview version of ChatGPT. Then I watched as it produced a response that was mostly incorrect. Instead of a straightforward, factual description, the AI dove into an elaborate story about entrepreneurship, mentorship, and workshops, all with colorful but largely fictional details. It was a hallucination: a well-crafted collection of non-existent facts. I couldn't help but laugh.

In the meantime, I h
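If you want to reproduce the comparison yourself, here is a minimal sketch in Python, assuming the official OpenAI SDK and an API key in the environment. The o1-preview call mirrors the prompt above; the search-enabled call is an assumption on my part, since ChatGPT's Web search lives in the app, and the model name and web_search_options parameter shown here may not match what your account exposes.

```python
# A minimal sketch of the experiment, assuming the OpenAI Python SDK
# (pip install openai) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

PROMPT = "Write an extended description of the inkubatour.ch program."

# 1) The prior model, with no Web access: prone to hallucinate
#    facts it never saw during training.
no_search = client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": PROMPT}],
)
print("--- without Web search ---")
print(no_search.choices[0].message.content)

# 2) A search-enabled model. NOTE: the model name and the
#    web_search_options parameter are assumptions; API availability
#    of Web search may differ from what ChatGPT offers in the app.
with_search = client.chat.completions.create(
    model="gpt-4o-search-preview",
    web_search_options={},
    messages=[{"role": "user", "content": PROMPT}],
)
print("--- with Web search ---")
print(with_search.choices[0].message.content)
```

Running both calls side by side with the same prompt makes the difference obvious: the first answer reads plausibly but invents details, while the grounded one can cite what is actually on the page.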