And forget scripting languages: take a C program that writes a string to disk and reads it back.
How many times longer does it get the moment we have to ensure the string was actually committed to non-volatile NAND and actually read back? 5x? 10x?
Is it even doable if we have to support arbitrary consumer hardware?
First of all, I pick the hardware and operating systems I support. I can make those requirements when they are required.
But boiled down, your argument is that because one thing may introduce non-determinism, any degree of non-determinism is acceptable.
At that point we don't even need LLMs. We can just have the computer do random things.
It's just a rehash of the infinite monkeys with infinite typewriters, which is ridiculous.
> A few years ago we didn't have an imprecise nondeterministic programming language that would allow your mom to achieve SOTA results on a wide range of NLP tasks by asking nicely, or I'm sure people would have taken it.
But that (accurate) point makes your point invalid, so you'd rather focus on the dressing.