I'm building this in the hope that AI will be cheap one day. For now, I'll add many optimizations.
reply
Have you tested this with a local model? I'm going to try this with GLM 4.7
reply
What would be the best model to try something like this on a 5800XT with 8 GB RAM?
reply
Yes, it certainly makes sense if you have the budget for it.

Could you share what it costs to run this? That could convince people to try it out.

reply
I mean, you can just say "Hi" to it and it will cost next to nothing. It only adds code and features if you ask it to.
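To make that concrete: a quick back-of-the-envelope sketch of per-request API cost. The prices below are hypothetical placeholders (roughly the shape of typical per-million-token pricing, not any provider's actual rates); the point is just that a short greeting is orders of magnitude cheaper than a big coding request.

```python
# Hypothetical per-million-token prices (placeholders, not real rates).
PRICE_PER_MTOK_IN = 3.00    # USD per million input tokens
PRICE_PER_MTOK_OUT = 15.00  # USD per million output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of a single API request at the prices above."""
    return (input_tokens * PRICE_PER_MTOK_IN
            + output_tokens * PRICE_PER_MTOK_OUT) / 1_000_000

# A bare "Hi" is a handful of tokens each way: a tiny fraction of a cent.
print(f"{request_cost(10, 20):.6f}")        # → 0.000330
# A large coding request with lots of context costs real money.
print(f"{request_cost(50_000, 4_000):.2f}")  # → 0.21
```

So at these assumed rates the greeting is about a thirtieth of a cent, while a context-heavy coding turn is ~600x more; the cost scales with how much you ask of it.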
reply
AI is cheap right now. At some point, the AI companies will have to turn a profit.
reply
Anthropic has stated that their inference is cash-flow positive. It would be very surprising if that weren't the case for everyone.

It's certainly an open question whether the providers can recoup their investments through growth alone, but it's not out of the question.

reply
The problem is that the models need constant retraining or they become outdated. It's nice that the less expensive part (inference) generates a profit, but that doesn't help much if you look at the complete picture. Hardware also needs replacement.
reply