You're saying it as if the poor author just had no choice but to let an LLM write their bibliography. To avoid hallucinations, maybe just don't let an LLM write any part of your paper?

You can only get in this situation if you let a bullshit generator write your paper, and the fraud is that you are generating bullshit and calling it a paper. No buts. It's impossible to trigger this accidentally, or without reckless disregard for the truth.

reply
Calling LLMs "bullshit generators" in the year 2026 just shows a lack of seriousness.
reply
Not as much of a lack of seriousness as excusing hallucinations as no big deal in what's supposed to be a researched, scholarly body of work written by humans.
reply
Not really: much of today's work consists of what David Graeber described as “bullshit jobs”. Now AI and its backers are proposing to automate all that bullshit.
reply
deleted
reply
And yet people are trying to defend LLM-generated made-up bullshit citations in scientific papers.
reply
> You’re right that a single hallucinated line is not evidence of reckless disregard

It absolutely is.

> - because that could have happened on a final follow-up pass after you had performed due diligence.

A "final follow-up pass" that lets the LLM make whatever changes it deems appropriate completely negates all the due diligence you did before, unless you very carefully review the diffs. And a new or substantially changed citation should stand out in that diff so much that there's no possible excuse to missing it.

> It’s happened to me.

Then you were guilty of reckless disregard.

> I know how challenging it can be to keep bad patterns out of LLM generated output

If your research paper contains any LLM-generated output you did not manually vet, you are a hack and should not get published.

reply