AI Foresights — A New Dawn Is Here
Ethics & Safety
Last updated: April 2026

Hallucination

When AI confidently generates false or made-up information.

In Plain English

A hallucination occurs when an AI generates information that sounds plausible but is actually false, fabricated, or nonsensical. AI doesn't "know" facts; it predicts likely text patterns. This can lead to invented quotes, fake citations, wrong dates, or completely made-up "facts" stated with confidence. Always verify important AI-generated information.
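The "predicts likely text patterns" point can be shown with a toy sketch. The mini-corpus and code below are entirely hypothetical and vastly simpler than a real language model, but the principle carries over: the model learns which words tend to follow which, then emits the statistically likeliest continuation, whether or not it is true.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus. A real model trains on billions of words,
# but the mechanism is the same: learn word-to-word patterns.
corpus = (
    "the study was published in nature "
    "the study was published in science "
    "the paper was published in nature "
).split()

# Count, for each word, which words follow it and how often.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_continuation(start, length=5):
    """Greedily pick the most frequent next word at each step."""
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(most_likely_continuation("the"))
# Emits a fluent claim about a study in Nature, even though no
# actual study exists anywhere in this process: it is simply the
# most probable word pattern. That is a hallucination in miniature.
```

Note that nothing in the code checks whether the output is true; there is no fact store to check against. The same gap is why a full-scale model can produce confident, well-formed statements with no grounding in reality.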

💡Real-World Example

An AI might confidently cite a research paper that doesn't exist, complete with fake authors and publication details.
