Hopefully he's using the LLM as an enhanced search engine that can point him to relevant authoritative sources, which he can then use to fact-check its output. I have done that in the past to some effect.
Maybe he just needs a reminder and he'll have an "oh yeah" moment when he reads the output; maybe he'll ask it for primary sources. There's a lot of bad faith going around.