Word List: Tier 3
Term: hallucinate
Definition
To generate information that has no basis in fact; said of an artificial intelligence (AI) application.
Related Terms
- hallucination
Recommendation
Replace when possible.
Recommended Replacements
- inaccurate information (noun)
- *create* or *generate* inaccurate information (verb)
- factual error
- incorrect assertion
- false positive
- distortion
Unsuitable Replacements
- None
Rationale
AI “hallucination” is a misleading analogy that allows the makers and owners of large language models (LLMs) to evade responsibility. There is also an argument against anthropomorphism: the term applies human traits to computer systems.
When we build software and systems, we test them. When we find failures, we call them bugs. When we excuse our LLMs for “hallucinating,” we do not learn why the failure happened, and makers and owners do not take responsibility. Worse, the term stigmatizes people who live with mental illness.
Hallucinations, perceptions that are not based in reality, are also associated with mental illness or drug use. Using the term in a technology context, in either its noun or verb form, can be seen as insensitive to people who experience those conditions.
Term Status
N/A
Supporting Content
N/A