> Are you sure “readily willing to concede” is worth absolutely anything as a user or consumer?

The company can't have it both ways. Either they have to admit the ai "support" is bollocks, or they are culpable. Either way they are in the wrong.

reply
Better than actual human customer agents who give an obviously scripted “I’m sorry about that” when you explain a problem. At least the computer isn’t being forced to lie to me.

We need a law that forces management to be regularly exposed to their own customer service.

reply
I knew someone would respond with this. HN is rampant with this sort of contrarian defeatism, and I just responded the other day to a nearly identical comment on a different topic, so:

No, it is not better. I have spent $AGE years of my life developing the ability to determine whether someone is authentically offering me sympathy, and when they are, I actually appreciate it. When they aren’t, I realize that that person is probably being mistreated by some corporate monstrosity or is having a shit day, and I give them the benefit of the doubt.

> At least the computer isn’t being forced to lie to me.

Isn’t it though?

> We need a law that forces management to be regularly exposed to their own customer service.

Yeah, we need something. I joke with my friends about creating an AI concierge service that deals with these chatbots and alerts you when a human is finally somehow involved in the chain of communication. What a beautiful world, where we’ll be burning absurd amounts of carbon in some sort of antisocial AI arms race to try to maximize shareholder profit.

reply
The world would not actually be improved by having thousands of customer service reps genuinely, authentically feel sorry. You're literally demanding that real people experience real negative emotions over some IT problem you have.
reply
People authentically, genuinely, naturally care about other people; empathy - founded at least partly in mirror neurons - is the most fundamental human nature. It's part of being social animals that live, survive, and thrive only in groups. It's even important for conflict - you need to anticipate the other person's moves, which requires instinctively understanding their emotions.

The exceptions are generally when people are scared, and sadly some people are scared all the time.

reply
They don't have to feel sorry, but they can at least try to help. When dealing with automated response units the outcome is the same: much talk, no solution. With a rep you can at least see what's available within their means, and if you are nice to them they might actually be able to help you, or at least make you feel less bad about it.
reply
But it would be improved by having them be honest and not say they’re sorry when they’re not.
reply
It's an Americanism. You might enjoy e.g. a Northern European culture more?
reply
Lying means to make a statement that you believe to be untrue. LLMs don’t believe things, so they can’t lie.

I haven’t had the pleasure of one of these phone systems yet. I think I’d still be more irritated by a human fake apology because the company is abusing two people for that.

At any rate, I didn’t mean for it to be some sort of contest, more of a lament that modern customer service is a garbage fire in many ways and I dream of forcing the sociopaths who design these systems to suffer their own handiwork.

reply