I've been watching the drizzle of LLM papers come through, and I think we're going to hit a 1T-parameter MoE on consumer hardware before this year is out. It'll still be behind the bigco models, but it'll be a force multiplier. Ideally, we'd get these models to run on a CPU. Microsoft's BitNet is one way to do this: you can already run ternary LLMs on consumer CPUs at a decent tokens-per-second rate.
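The reason ternary models are CPU-friendly: once weights are constrained to {-1, 0, +1} with a single scale factor, a matmul reduces to additions, subtractions, and skips. Here's a minimal sketch of absmean-style ternary quantization in the spirit of BitNet b1.58 (my own toy version, not Microsoft's implementation; function names are made up):

```python
import numpy as np

def absmean_ternary(w: np.ndarray):
    """Quantize a weight matrix to {-1, 0, +1} plus a per-tensor scale.
    Sketch of the absmean scheme described for BitNet b1.58."""
    scale = np.abs(w).mean() + 1e-8                      # per-tensor absmean scale
    w_q = np.clip(np.round(w / scale), -1, 1).astype(np.int8)
    return w_q, scale

def ternary_matmul(w_q: np.ndarray, scale: float, x: np.ndarray) -> np.ndarray:
    """Matmul against ternary weights. A real CPU kernel would branch on
    {-1, 0, +1} and do only adds/subtracts; here NumPy stands in for that."""
    return scale * (w_q.astype(np.float32) @ x)
```

The multiplications all disappear into the single `scale` at the end, which is why these models run tolerably fast without a GPU.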
reply
You can still continue to master actual software engineering while others spend their time turning their minds into a palimpsest of tricks and lessons on how to coax one model after another after another into giving reasonable output, output you'd still have to vet yourself anyway.
reply
While I think a lot of the AI hype is just hype - everyone making most of these claims has _hitherto untold riches_ levels of financial incentive to make them - I think it's also undeniable that LLMs speed up many aspects of coding.

I also think that AI might be the beginning of the end of copyright. Before, everyone with money had a clear and tremendous incentive to keep copyright strong; now, all of a sudden, trillions of dollars are predicated on the idea that LLM training isn't violating copyright. Copyleft has been a major tool in the FOSS toolbox. If copyright is weakening, I don't want free software to ALSO be locked out of agentic programming.

reply