What if this is modeled around the premise that in any situation where reasoning can be used, someone would have access to super-human reasoning?

In what situation does the human in the loop manage to wield super-human reasoning better than another person would?

I'm not suggesting it's impossible, so much as wondering if we can reach a place where the human is truly irrelevant to the process, and can't make a better decision than the superhuman entity.

I'm not sure this is ever possible. It's more of a thought experiment: what lies between here and there? Right now we can use pseudo-intelligence from silicon to our advantage, and being smarter than average is clearly a massively outsized advantage. It's similar to how being able to automate tasks gives you an outsized advantage, only in far more ways. But what if that advantage thins, or even vanishes?

> all else being equal.

"Load-bearing phrase," as they say.

A stupid ass that just keeps pushing on often goes further than a smart ass who gets distracted.

But they might be pushing in the wrong direction.