The key missing step is where the traveler exercises critical thinking and checks the advice they get. Some people seem to turn that off for LLMs.
Because they aren't probabilistic parrots? If they get it wrong, there's usually an understandable reason behind it.
Because the vast and overwhelming majority of the time, if you ask a question into the ether that nobody has a good answer to, most people will gloss over it and not bother answering, as attested by decades of relatable memes ( https://xkcd.com/979/ ). In contrast, the chatbot is trained to always attempt an answer, and is seemingly disincentivized by its training set from just shrugging and saying "I don't know, good luck fam".