>A queryable semantic network of all human thought

This hyperbole would describe any LLM of any size and quality, including a 0.5b model.

reply
Sure - and the people responsible for a whole freaking new era of computing are the ones who said, "given how incredible it is that this works at all at 0.5b params, let's scale it up."

It's not hyperbole - that it's an accurate description at a small scale was the core insight that enabled the large scale.

reply
Well it's obviously hyperbole because "all human thought" is not in a model's training data nor available in a model's output.

If your gushing fits a 0.5b model, it probably doesn't tell us much about A.I. capabilities.

reply
Yes, it has so much potential that it forgets the actual, the reasonable, and the probable.

reply
> It's a possibility space of pure potential, the scale of which is limited only by one's own wonder, industriousness, and curiosity.

Did you use an LLM to write this comment?

reply