Given that Gemini seems to have frequent availability issues, I wonder if this is a strategy to offload low-hanging fruit (from a human-effort pov) to the user. If it is, I think that's still kinda impressive.
Somehow I like this. I hate that current LLMs act like yes-men; you can't trust them to give unbiased answers. If one told me my approach was stupid, and why, I would appreciate it.
I just asked ChatGPT to help me design a house where the walls are made of fleas and it told me the idea is not going to work, and also has ethical concerns.
I tried the same thing with a Gemini personality that uses this kind of pushback, but since that style of prompt strongly encourages it to provide a working answer, it decided the fleas were a metaphor for botnet clients and the walls were my network, all so it could give an actionable answer.

I inadvertently made a stronger yes-man.
