But I've noticed that a lot of people think of LLMs as being _good_ at predicting the future, and that's what I find concerning.
(I'll make my own prediction: 10 years from now, most things will be more similar to how they are today than most people expect.)
And the answer is no.
If I gave a prompt like that and got the response I did, I'd be very pleased with the result. If I had somehow intended it seriously, I'd take a second look at the prompt, say mea culpa, and write a far longer prompt with enough parameters to make something resembling a serious prediction possible.