Ask HN: Are we going to be stuck with less powerful local LLMs?
I think the trend is that top models are meant for companies. Small devs did our job of hyping and training them, and now we can either pay way more, pay more and settle for non-SOTA models, hand our data over to train Chinese models in the hope they stay 6 months behind in the cold war and still need some of our input, or invest around $5-10K in a powerful local personal AI [0].

On the other hand, I think AI can really raise the bar of "average tech", and we devs are wired to think that better tech == more value... but that may not hold in the many, many cases where existing average tech and velocity are already good enough, and the real moat is the handshake, trust, marketing, etc.

[0] https://github.com/antirez/ds4

reply
Funny, I just came across a comment that accurately answers your question: https://news.ycombinator.com/item?id=48047722
reply
This is one take. Not every project is made better by expanding its scope 100x; that often makes things worse.
reply
One of the hardest things to do in software is to make things simple and easy to use. That could mean increasing the scope to get there. More scope doesn't automatically mean more bloat.
reply
Thanks!
reply
The same way most of us use Linux (not Windows) and Postgres/MySQL (not Oracle), many of us are using open-source models rather than proprietary ones.
reply
Yeah, I understand.
reply
I need a professional hiking system
reply