> it's almost certainly also powered by an autoregressive transformer model, just like ChatGPT

The objective of that model, however, is quite different to that of an LLM.

reply
I have seen Google Translate hallucinate exactly zero times over thousands of queries over the years. Meanwhile, LLMs emit garbage roughly 1/3 of the time, in my experience. Can you provide an example of Translate hallucinating something?
reply
Agreed, and I use G translate daily to handle living in a country where 95% of the population doesn’t speak any language I do.

It occasionally messes up, but not by hallucinating; it's usually grammar salad because what I put into it was somewhat ambiguous. It's also terrible with genders in Romance languages, but then that is a nightmare for humans too.

Pat pat, bot.

reply
Every single time it mistranslates something, that is a hallucination.
reply
Google Translate hasn't moved to LLM-style translation yet, unfortunately.
reply