My only gripe was that single sentence, and we might just mean something slightly different there.
Also, I'm out of my depth here, but I believe these sorts of issues are addressed in a post-training step, which may look more like applying a band-aid than a real fix. I'm not convinced they can ever be fully solved (given how these models work) - but of course this tradeoff doesn't make LLMs useless, and it can be limited or even eliminated through clever application design.