neither are humans
> They optimize for next-token probability and human approval, not factual verification.
while there are outliers, most humans also tend to tell people what they want to hear and to fit in.
> factuality is emergent and contingent, not enforced by architecture.
like humans; as far as we know, there is no "factuality" gene, and we lie to ourselves, to others, in politics, in scientific papers, to our partners, etc.
> If we’re going to treat them as coworkers or exoskeletons, we should be clear about that distinction.
I don't see the distinction. Humans exhibit many of the same behaviours.
You're just indulging in a sort of idle, cynical judgement of people. Lying well actually requires a careful, truthful evaluation of the likely effects of the lie and of the likelihood and consequences of being caught. And if you yourself claim to have observed a lie, and can verify that it was a lie, then you understand a truth; you're confounding truthfulness with honesty.
So that's the (obvious) distinction. A distributed algorithm that predicts likely strings of words doesn't do any of that, and doesn't have any concerns or face any consequences. It doesn't exist at all (even if calculation is existence - maybe we're all reductively just calculators, right?) after your query has run. You have to save a context and feed it back into an algorithm that hasn't changed an iota since the last time you ran it. There's no capacity to evaluate anything.
You'll know we're getting closer to the fantasy abstract AI of your imagination when a system gets more out of the second time it trains on the same book than it did the first time.
For example, fact-checking a news article and making sure what gets reported lines up with base reality.
I once fact-checked a virology lecture and found that the professor had confused two brothers for one individual.
I am sure the professor has a super solid grasp of how viruses work, but errors like these probably creep in all the time.
This doesn't jibe with reality at all. Language is a relatively recent invention, yet somehow Homo sapiens were able to survive in the world and even use tools before the appearance of language. Are you saying they did this without an internal notion of "fact" or "truth"?
I hate the trend of downplaying human capabilities to make the wild promises of AI more plausible.