antonvs | 7 days ago
That's a poor definition, then. It claims that a model is "hallucinating" when its output doesn't match a reference point it can't possibly have accurate information about. How is that a "hallucination" in any meaningful sense?