> But this isn’t unlike the general populace. At scale humans accept new things slowly.

Right, the model works like humans at scale, not like an individual human who reads the actual paper disproving a fact they thought was correct and is able to adapt. True, not every human manages to do that (science advancing one death at a time), but some can.

But since the model is a statistical one, it works like humans at scale.

reply
> At scale humans accept new things slowly.

I think this is true, but there are big differences. Motivated humans with a reasonable background learn lots of things quickly, even though we also swim in an ocean of half-truths or outdated facts.

We are also resistant to certain controversial ideas.

But neither of those things is really analogous to the limitations on what models can currently learn without a new training run.

reply
In-context learning means picking up facts or rules from the prompt at inference time, without changing the pre-trained weights; pre-training and in-context learning are two distinct phases.
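
As a rough illustration of that split, here is a toy analogy only (the names `FROZEN_WEIGHTS` and `answer` are made up, and a real transformer does not store knowledge as a lookup table): whatever was baked in during the pre-training phase stays fixed, while facts supplied in the context apply to a single request and are never written back.

```python
# Toy analogy, not a real transformer: "weights" are frozen after the
# pre-training phase, while in-context facts live only in the prompt of a
# single request and never change the weights.

FROZEN_WEIGHTS = {
    "capital of France": "Paris",
    "is Pluto a planet": "yes",  # outdated fact baked in at pre-training time
}

def answer(question, context=None):
    """Prefer facts supplied in the request context (in-context learning);
    otherwise fall back to what was frozen in during pre-training."""
    context = context or {}
    if question in context:
        return context[question]  # learned for this request only
    return FROZEN_WEIGHTS.get(question, "I don't know")

# Without context, the stale pre-training knowledge comes back.
print(answer("is Pluto a planet"))                               # -> yes

# With the correction supplied in context, the answer adapts...
print(answer("is Pluto a planet", {"is Pluto a planet": "no"}))  # -> no

# ...but the weights are untouched, so the next request reverts.
print(answer("is Pluto a planet"))                               # -> yes
```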

An interesting question: if pre-trained, specialized models were available for the thousand or ten thousand most common tasks humans do every day, what use would a general model be?

reply
Yes, that's precisely the problem: you want continuous learning, but you also want continuous pruning.
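
One way to picture the combination is a toy external-memory sketch (the `FactMemory` class below is hypothetical and sits outside the model; it is not weight updates): new facts can be added at any time, and the least-recently-used ones are evicted when capacity is exceeded.

```python
from collections import OrderedDict

class FactMemory:
    """Toy sketch of continuous learning plus continuous pruning:
    facts can be added at any time, and the least-recently-used fact
    is evicted once the capacity limit is exceeded."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.facts = OrderedDict()

    def learn(self, key, value):
        # Continuous learning: insert or overwrite, marking the fact as fresh.
        self.facts[key] = value
        self.facts.move_to_end(key)
        # Continuous pruning: drop the stalest facts once over capacity.
        while len(self.facts) > self.capacity:
            self.facts.popitem(last=False)

    def recall(self, key):
        if key in self.facts:
            self.facts.move_to_end(key)  # facts still in use stay fresh
        return self.facts.get(key)

memory = FactMemory(capacity=2)
memory.learn("pluto", "not a planet")
memory.learn("capital_of_france", "Paris")
memory.learn("model_cutoff", "2023")   # exceeds capacity, prunes "pluto"
print(memory.recall("pluto"))          # -> None (pruned)
print(memory.recall("model_cutoff"))   # -> 2023
```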
reply