Feels like the truth is somewhere in between. For example, if it were a "smart" hammer and you could tell it "go pound in those nails" and it pounded in the wrong ones, or did it too hard, or something, that feels more equivalent. You would still be blamed for your ambiguous prompt, and fault/liability is ultimately on you, the hammer director, but it still wasn't you who chose the exact nails to hammer.

I also think taking credit for writing an exploit that you didn't write and may not even have the knowledge to do yourself is a bit gray.

reply
Wrong questions.

Could a script kiddie steer an LLM? How much does this reduce the cost of attacks? Can this scale?

What does this mean for the future of cyber security?

reply
You could call the LLM's role "smart grep," and mean it to be derisive. But I would have gladly used a real smart grep.
reply
If I just point to the wall and say "nail," then I would say the hammer drove the nail.
reply
You didn't; you figured out where the nail needed to go, got the nail, and then swung the hammer until the nail was driven.

This is really closer to a drill, in that it automated the grunt work under full guidance.

reply
Do you have a defense of why human-hammer-nail is a good analogy for human-chatgpt5.4-pwndsamsung?
reply
AI without a suitably well crafted prompt is like a firework tube held by a 3 year old.

AI without a prompt is a hammer sitting in a drawer.

reply