That just implies LLMs are suggestible. The same is true of children. As we get older and build a more complete world model in our heads, it's harder to get us to believe things which go against that model.

Tell a 5-year-old about Santa, and they will believe it sincerely. Do the same with a 30-year-old immigrant who has never heard of Santa, and I suspect you'll have a harder time.

That's not because the 5-year-old is dumber, but because their life experience ("training data") is much more limited.

Even so, trying to convince a modern LLM of something ridiculous is getting harder. I invite you to try telling ChatGPT or Gemini that the president died a week ago and was replaced by a body-double facsimile until January 2027, so that Vance can have a full term. I suspect you'll have significant difficulty.

> Do the same with a 30-year-old immigrant who has never heard of Santa, and I suspect you'll have a harder time.

There are plenty of people who convert to a religion at an older age, and that seems far more far-fetched than Santa.

Sure.

But I bet you'd have a significantly easier time converting a child to a religion than a 30-, 40-, or 50-year-old.

My point is that LLMs are suggestible, perhaps more so than the average adult, but less so than a child, I suspect. I don't think suggestibility really settles the question of whether something is AGI. On the contrary, it seems to me that to be intelligent and adaptable you need to be able to modify your world model. How easily you are fooled is a function of how mature and data-rich your existing world model is.
