A) test lots of skills that are common but not universal. I'm thinking of JavaScript trivia here: I don't write any JavaScript in my professional capacity as a software engineer, but there are many people who think Software Engineer == JavaScript Programmer
B) shine too much of a light on the fact that this industry is full of people who demand high salaries but can't program their way out of a paper bag
Some knowledge is likely "cached" in the plumber; maybe he doesn't ask the same question twice. I'm sympathetic to the plumber, but I think your concerns about the erosion of knowledge or skill are worth pushing on further.
I don't think there should be imposed limits, but there may be a point where expertise atrophies from depending on AI too much.
> if the plumber's use of ChatGPT improved outcomes, isn't that preferable?
In the short term, sure, and maybe even in the long term for the customer. I think the risk to the plumber is losing some of their expertise by outsourcing it to AI. But who knows; maybe the plumber has an excellent memory and only accumulates knowledge each time they use AI.
Some of the article's argument is lost in the plumber example. I doubt plumbers spend much time exploring new ways of solving problems, and they might even benefit from a narrower range of outcomes. Other fields that require both expertise and novel solutions will be at a disadvantage if they become more homogenized by depending on AI. Not only is the range of solutions reduced, but getting there is faster, so people end up at a local maximum. Maybe they get stuck there, maybe not, but that's the risk I see.
You don't imagine any long term risks by outsourcing expertise to AI?
Without knowing more about what was going on, it's hard to say why they used ChatGPT.
Yes