It's an insane perspective I'm taking I know....call me crazy. /s
edit: the fact that humans go out of their way to type or speak emotional content into their prompts is beyond me. Why would I waste time typing out a pronoun to a large language model agent? Why would I take the intellectually lazy route and blur the line between factual communication of concepts and emotional expression directed at a machine? What are we doing, folks?
That said, these are large language models: you are guiding the output through vector space with your input, so you really do have to leverage language to get the results you want. You don't have to believe the model has emotions or feels anything for that to be true.
If anything, it's been fantastic to have an "interlocutor" that is vastly capable of producing possible solutions without emotional bias or superfluous flourishes, and without my having to endure personal proclivities or eccentricities.