> Speed, cost, security, job/task management

All of that will inevitably be solved.

50 years ago, using a personal computer was an extravagant luxury. Until it wasn't.

30 years ago, carrying a powerful computer in your pocket was unthinkable. Until it wasn't.

Right now, it's cheaper to run your accounting math on dedicated adder hardware. But LLMs will only get cheaper. When you can run massive LLMs locally on your phone, it will be hard to justify not using them for everything.

reply
Not until power access/generation is MUCH cheaper. Long, long, long way off.

If I can run 50,000 fixed tasks that cost me $0.834/hr, but OpenAI is costing $37/hr, the automation takes 40x as long, and it can make TERRIBLE errors, why the fuck would I not move to the deterministic system?
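For what it's worth, the numbers in that sentence work out like this (the rates and the 40x slowdown are the figures quoted above; the per-task framing is mine):

```python
# Cost ratio of the LLM pipeline vs. the deterministic one, using the
# figures quoted above: $0.834/hr deterministic, $37/hr for the LLM,
# and the LLM taking 40x as long per task.
det_rate = 0.834   # $/hr, deterministic system
llm_rate = 37.0    # $/hr, LLM-based automation
slowdown = 40      # LLM takes 40x as long per task

# Per unit of work, the LLM costs (rate ratio) x (time ratio) more:
cost_ratio = (llm_rate / det_rate) * slowdown
print(f"LLM is ~{cost_ratio:.0f}x more expensive per task")  # roughly 1775x
```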

Also, battery life of mobile devices.

reply
These exact arguments could have been made 50 years ago about why laptops would be impossible.

But now, we not only have laptops, we run horribly inefficient GUIs in horribly inefficient VMs on them.

The dollar-per-compute trend goes ever downward.

reply
It will never, ever be as cheap as a cron job and a shell script. There is a limit to how efficient using an LLM to do a job can be; using an LLM to create the job is a different matter. The two differ enormously in compute and power requirements. Don't mistake one for the other.
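For concreteness, the deterministic alternative being argued for is just this (the script path and schedule are made-up placeholders):

```
# crontab entry: run the fixed task at the top of every hour
0 * * * * /opt/jobs/reconcile.sh
```

Once the script exists, whether a human or an LLM wrote it, each run costs a rounding error in CPU time.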
reply
> If I can run 50,000 fixed tasks that cost me $0.834/hr but OpenAI is costing $37/hr and the automation takes 40x as long and can make TERRIBLE errors why the fuck would I not move to the deterministic system?

Because you'll be outcompeted by people who make the best of the nondeterministic system.

reply