I like task vectors and soft prompts because I think they show why prompt engineering is both cool and useful.
https://arxiv.org/pdf/2310.15916
https://huggingface.co/docs/peft/conceptual_guides/prompting
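For anyone curious, here's roughly what the soft-prompt side looks like with the PEFT library from the second link. A minimal sketch, assuming a placeholder base model and made-up hyperparameters:

```python
# Minimal prompt-tuning ("soft prompt") setup with HF PEFT.
# Model choice and hyperparameters are illustrative, not from the links above.
from transformers import AutoModelForCausalLM
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

model_name = "gpt2"  # placeholder base model
model = AutoModelForCausalLM.from_pretrained(model_name)

# Learn 20 "virtual token" embeddings prepended to every input;
# the base model stays frozen and only these embeddings are trained.
config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Classify the sentiment of this review:",
    num_virtual_tokens=20,
    tokenizer_name_or_path=model_name,
)
peft_model = get_peft_model(model, config)
peft_model.print_trainable_parameters()  # a tiny fraction of total params
```

The interesting bit is that the learned prompt lives in embedding space, not token space, i.e. it's a "prompt" no human could actually type.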
Are you not aware that random sampling makes something non-deterministic?
You should follow the HN Guidelines. I'm trying to have a discussion, not a snarkfest.
> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
I think a lot of prompt engineering is voodoo, but it's not all baseless: a more formal way to look at it is aligning your task with the pre-training and post-training of the model.
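Concretely, a lot of the non-voodoo part is just matching the format the model saw in post-training, e.g. its chat template. A sketch, with the model name as a stand-in:

```python
# "Aligning with post-training" often just means using the model's own
# chat template instead of a bare string. Model name is a placeholder.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")
messages = [
    {"role": "system", "content": "You are a terse sentiment classifier."},
    {"role": "user", "content": "Label this review: 'the food was great'"},
]
prompt = tok.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)  # the special-token framing the model was trained to expect
```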
The whole "it's a bad language" refrain feels half-baked when most of us use relatively high level languages on non-realtime OSes that obfuscate so much that they might as well be well worded prompts compared to how deterministic the underlying primitives they were built on are... at least until you zoom in too far.
Prompting is none of those things. It is a ball of math we can throw words into, and it approximates meaning and returns an output with randomness built in. That is incredible, truly, but it is not a programming language.
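To make the "randomness built in" part concrete, here's a toy decoder in plain Python (all numbers made up): the same logits give varying outputs under sampling, and a fixed output only if you force greedy decoding.

```python
# Toy illustration: identical input, different outputs under sampled decoding.
import math
import random

logits = {"yes": 2.0, "no": 1.5, "maybe": 0.5}  # made-up next-token scores

def sample(logits, temperature=1.0):
    if temperature == 0:  # greedy decoding: deterministic
        return max(logits, key=logits.get)
    weights = [math.exp(v / temperature) for v in logits.values()]
    return random.choices(list(logits), weights=weights)[0]

print([sample(logits) for _ in range(5)])     # varies run to run
print([sample(logits, 0) for _ in range(5)])  # always "yes"
```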
Coding languages haven't described even a fraction of the rules and state they encapsulate since, what, punch cards?
It wasn't long before we started relying on an exponentially growing stack of layered abstractions to do anything useful with computers, and very quickly we traded precision and determinism for benefits like concision and being easier to reason about.
-
But also, the context here was someone calling prompting an "imprecise nondeterministic programming language": obviously their bone of contention is the "imprecise nondeterministic" part, not distilling what defines a programming language.
I get that it doesn't feel warm and fuzzy to the average engineer, but realistically we were hand-engineering solutions with "precise deterministic programming languages"; they were similarly probabilistic, and they performed worse. (A quick taste of that style follows the links below.)
- https://labs.oracle.com/pls/apex/f?p=LABS:0:5033606075766:AP...
- https://en.wikipedia.org/wiki/Stan_(software)
- https://en.wikipedia.org/wiki/Probabilistic_programming
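For flavor, here's a bare-bones "probabilistic program" in plain Python rather than Stan (entirely illustrative, not taken from any of the links): infer a coin's bias from observed flips by rejection sampling.

```python
# Rejection sampling: keep only the prior draws that reproduce the data.
import random

observed = [1, 1, 0, 1, 1, 0, 1, 1]  # 6 heads in 8 flips (made-up data)

def simulate():
    p = random.random()  # prior: p ~ Uniform(0, 1)
    flips = [int(random.random() < p) for _ in observed]
    return p, flips

accepted = [p for p, flips in (simulate() for _ in range(100_000))
            if flips == observed]
print(sum(accepted) / len(accepted))  # posterior mean of p, roughly 0.7
```

The program is perfectly "precise and deterministic" to write down, but its entire point is reasoning under uncertainty.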
I explained in the clearest language possible why a fixation on the "programming language" part of the original comment borders on a non-sequitur. But if you're insistent on railroading the conversation regardless... at least try to be good at it, no?
And forget scripting languages: take a C program that writes a string to disk and reads it back.
How many times longer does it get the moment we have to ensure the string was actually committed to non-volatile NAND and actually read back? 5x? 10x?
Is it even doable if we have to support arbitrary consumer hardware?
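To put a rough shape on it, here's the same gap sketched at the Python level rather than in C (POSIX assumed, purely illustrative): the naive version versus one that at least asks the OS for durability.

```python
# Naive write vs. a write that asks the OS to push data to stable storage.
import os

def naive_write(path, data):
    with open(path, "w") as f:
        f.write(data)  # may sit in the page cache; lost on power failure

def durable_write(path, data):
    with open(path, "w") as f:
        f.write(data)
        f.flush()              # drain Python's userspace buffer
        os.fsync(f.fileno())   # ask the kernel to reach the device
    # Crash-safe naming also needs the containing directory synced:
    dir_fd = os.open(os.path.dirname(path) or ".", os.O_RDONLY)
    try:
        os.fsync(dir_fd)
    finally:
        os.close(dir_fd)
```

And even fsync only gets you as far as the drive's firmware is honest about its write cache, which is exactly the consumer-hardware problem.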
First of all, I pick the hardware and the operating systems I support. I can make those things requirements when they are required.
But when you boil your argument down, it's this: because one thing may introduce non-determinism, any degree of non-determinism is acceptable.
At that point we don't even need LLMs. We can just have the computer do random things.
It's just a rehash of the infinite monkeys with infinite typewriters argument, which is ridiculous.
> A few years ago we didn't have an imprecise nondeterministic programming language that would allow your mom to achieve SOTA results on a wide range of NLP tasks by asking nicely, or I'm sure people would have taken it.
But that (accurate) point makes your point invalid, so you'd rather focus on the dressing.
I actually think it's great for giving non-programmers the ability to solve basic problems with code. That's really cool, and it's pretty darn good at it.
I would dispute that you get SOTA results.
That has never been my personal experience. Given that we don't see a large increase in innovative companies spinning up now that this technology is a few years old, I doubt it's the experience of most users.
> The whole "it's a bad language" refrain feels half-baked when most of us use relatively high level languages on non-realtime OSes that obfuscate so much that they might as well be well worded prompts compared to how deterministic the underlying primitives they were built on are... at least until you zoom in too far.
Obfuscation and abstraction are not the same thing. The other core difference is precision and determinism, both of which are lacking with LLMs.