In what scenario does the human in the loop somehow manage to utilize super-human reasoning better than another person would?
I'm not suggesting it's impossible, so much as wondering if we can reach a place where the human is truly irrelevant to the process, and can't make a better decision than the superhuman entity.
I'm not sure this is ever possible. It's more of a thought experiment. What's between here and there? Right now we can use pseudo-intelligence from silicon to our advantage, and being smarter than average is clearly a massively outsized advantage. It's similar to how being able to automate tasks gives you an outsized advantage, but in many more ways. But what if that advantage thins or even vanishes?
"Load-bearing phrase," as they say.
A stupid ass that just keeps pushing on often goes further than a smart ass who gets distracted.
None of this helps if Jack inherits the farm but wants to live in the big city and be an architect. He'll inherit and keep it because there is no architecture job to be had.
As someone who grew up on a farm, "you may be a farmer, but you could be a productive one" is so intensely depressing. Farming is a shitty job that requires insane amounts of back-breaking labor and never-ending toil, all at a time when climate change is going to utterly fuck over farmland and destroy crop yields.