> No amount of bloat matches what an LLM needs.

I don't think that's necessarily true. For instance, LinkedIn uses more memory than Gemma E2B inference does.
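For a rough sense of scale (my own back-of-envelope numbers, not the parent's): a model like Gemma 3n E2B has roughly 2B effective parameters, so its weight footprint is a simple parameters-times-bytes calculation, while a single heavy web-app tab can sit at a gigabyte or more. A minimal sketch, assuming ~2B effective parameters and ignoring KV cache and activations:

```python
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough weight-only memory footprint in GB (decimal)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

# Assumed figure: ~2B effective parameters for Gemma 3n E2B.
print(weight_memory_gb(2, 2))    # fp16/bf16 weights -> 4.0 GB
print(weight_memory_gb(2, 0.5))  # 4-bit quantized -> 1.0 GB
```

At 4-bit quantization that's in the same ballpark as a bloated browser tab, which is the point being made; real inference adds KV cache and activation memory on top.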

LinkedIn is an entirely different category, and an extreme case at that. We're not talking about LLMs replacing LinkedIn either; it's an entirely different comparison.

Finally, we've fully documented the Singularity-is-actually-just-bloated-software theory.