Ethics & Safety
Last updated: April 2026
Hallucination
When AI confidently generates false or made-up information.
In Plain English
A hallucination occurs when an AI generates information that sounds plausible but is false, fabricated, or nonsensical. AI doesn't "know" facts; it predicts likely text patterns. This can produce invented quotes, fake citations, wrong dates, or entirely made-up "facts" stated with confidence. Always verify important AI-generated information against a trusted source.
💡 Real-World Example
An AI might confidently cite a research paper that doesn't exist, complete with fake authors and publication details.
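One practical defense against this failure mode is to look up any cited identifier in a public registry before trusting it. Below is a minimal sketch, not part of the original article, that checks whether a DOI is registered using Crossref's public works endpoint (https://api.crossref.org/works/). The doi_exists helper name and the example DOI are hypothetical choices for illustration.

    import requests

    def doi_exists(doi: str) -> bool:
        """Check whether a DOI is registered with Crossref.

        Returns True if Crossref knows the DOI, False otherwise.
        A hallucinated citation will typically fail this lookup.
        """
        # Crossref's public works endpoint returns 200 for a
        # registered DOI and 404 for an unknown one.
        url = f"https://api.crossref.org/works/{doi}"
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            # Network failure: treat as unverified, not as fake.
            return False
        return resp.status_code == 200

    if __name__ == "__main__":
        # A DOI quoted by an AI assistant (hypothetical example).
        claimed_doi = "10.1000/fake.2026.001"
        print("Verified" if doi_exists(claimed_doi) else "Could not verify")

Note that a successful lookup only confirms the paper exists; the AI may still have misattributed its findings, so the claim itself should also be checked against the source.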
