For neural networks, yeah, continual learning is basically dead.

But for other ML approaches, it works really well. KNN is a particularly good example.

reply
Ehhh KNN doesn’t have a training phase, so it’s really more that the concept of continual learning doesn’t apply. You store the entire dataset and recompute distances from scratch at every query anyway.
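
To make that concrete, here's a rough sketch (toy 2-D points, all data made up) of what "learning" a new example amounts to for KNN - you just append it to the stored dataset, and prediction recomputes distances over whatever is stored at query time:

    # "Continual learning" for KNN: no weights to update, just store the point.
    import numpy as np

    class KNN:
        def __init__(self, k=3):
            self.k = k
            self.X = np.empty((0, 2))           # stored examples (2-D features here)
            self.y = np.empty((0,), dtype=int)  # stored labels

        def add(self, x, label):
            # the entire "learning" step: append to memory
            self.X = np.vstack([self.X, x])
            self.y = np.append(self.y, label)

        def predict(self, x):
            # recompute distances against everything stored so far
            d = np.linalg.norm(self.X - x, axis=1)
            nearest = self.y[np.argsort(d)[:self.k]]
            return np.bincount(nearest).argmax()

    knn = KNN(k=3)
    for x, label in [([0.0, 0.0], 0), ([0.1, 0.2], 0), ([1.0, 1.0], 1), ([0.9, 1.1], 1)]:
        knn.add(np.array(x), label)
    print(knn.predict(np.array([0.2, 0.1])))  # -> 0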
reply
So, surprisingly, that is not completely true - I know of two HFT firms in finance that do CL at scale, and it works - but in a relatively narrow context of predicting profitable actions. It's still very surprising that it works, and the compute required is impressively large - but it does work. I have some hope of it translating to the wider energy landscapes we want AI to work over…
reply
During covid almost every prediction model like that exploded; everything went out of distribution really fast. In the sense you describe, we've been doing "CL" for a decade or more. It can also be cheap if you use smaller models.

But true CL is the ability to learn out-of-distribution information on the fly.

The only true solution I know of for continual learning is to retrain the model completely from scratch with every new example you encounter. That is technically achievable now, but it's also effectively useless.
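
As a toy illustration of that baseline (sklearn logistic regression standing in for "the model", the data stream is fake), the problem is that every step refits on everything seen so far, so the cost keeps growing with the stream:

    # Toy "retrain from scratch on every new example" loop - model, data,
    # and sizes are placeholders for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X_stream = rng.normal(size=(200, 5))          # fake stream of inputs
    y_stream = (X_stream[:, 0] > 0).astype(int)   # fake labels

    X_seen, y_seen, model = [], [], None
    for x, y in zip(X_stream, y_stream):
        X_seen.append(x)
        y_seen.append(y)
        if len(set(y_seen)) > 1:                  # need both classes before fitting
            model = LogisticRegression()          # fresh model every step
            model.fit(np.array(X_seen), np.array(y_seen))  # full retrain on all history

    print(model.score(np.array(X_seen), np.array(y_seen)))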

reply
Bandits?

Spaced repetition algos
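
Bandits are at least genuinely online - every observed reward updates the estimates immediately, no retraining pass needed. A toy epsilon-greedy sketch (arm probabilities and epsilon invented for illustration):

    # Epsilon-greedy bandit: value estimates update incrementally per observation.
    import random

    true_p = [0.2, 0.5, 0.7]   # unknown arm reward probabilities (simulated)
    counts = [0, 0, 0]
    values = [0.0, 0.0, 0.0]   # running mean reward per arm
    eps = 0.1

    for t in range(10_000):
        if random.random() < eps:
            arm = random.randrange(len(true_p))                     # explore
        else:
            arm = max(range(len(values)), key=values.__getitem__)   # exploit
        reward = 1.0 if random.random() < true_p[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]         # incremental mean

    print([round(v, 2) for v in values])  # estimates approach true_p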

reply