> The specific subjects have changed over time, but the production of specialist mathematicians hasn't really changed. It takes hard work, grunt work, struggling, making mistakes, and learning from them, as well as expert supervision. The problem with AI is that it encourages and incentivizes intellectual laziness, the opposite of what is required to produce specialists.

Let's take the example of economics. Economists use ideas from mathematics like integrals, statistics, PDEs, and so on. They know that these concepts exist, and they know how to apply them, but they don't know these concepts deeply enough to make progress in mathematics itself.

Do you think that economists should deeply learn integrals, PDEs, functional analysis, differential geometry, and every other concept they use? Or do you think it's better for them to focus just on their specific domain while learning just enough from other domains?

You keep coming back to AI replacing mathematicians. I'm not making that claim. I'm not saying Linux kernel specialists will be replaced by AI. I'm simply claiming that not everyone needs to be a Linux kernel specialist. This is precisely what AI is allowing: it automates things I don't need to know deeply so that I can focus on things I do need to understand deeply.

> I'm simply claiming that not everyone needs to be a Linux kernel specialist.

This is an uninteresting and indeed silly claim, because nobody has ever asserted the opposite.

The point is that society needs some Linux kernel specialists, and some astrophysicists, but AI is undermining their production.

> This is precisely what AI is allowing: it automates things I don't need to know deeply so that I can focus on things I do need to understand deeply.

The submitted article is about how AI is automating the things that a specialist does need to understand deeply. It's about so-called astrophysicists using AI to produce astrophysics papers, not about non-astrophysicists using AI to produce astrophysics papers so that they can focus on whatever their own non-astrophysics specialty may be, if they have one.

I'm responding to this quote:

> Frank Herbert (yeah, I know I'm a nerd), in God Emperor of Dune, has a character observe: "What do such machines really do? They increase the number of things we can do without thinking. Things we do without thinking; there's the real danger." Herbert was writing science fiction. I'm writing about my office. The distance between those two things has gotten uncomfortably small.

If we both agree that an astrophysicist may not need to understand things (even in their own domain) to make progress, then we are in agreement. Not all the things a researcher works on while writing their paper are useful or necessarily done by them manually. In such cases it becomes necessary to let an LLM take over.

> I'm responding to this quote

> > Frank Herbert (yeah, I know I'm a nerd), in God Emperor of Dune, has a character observe

The article author and I share a love of Frank Herbert, God Emperor of Dune, and the quote in question. Nonetheless, it's a mistake to focus on this quote rather than on the rest of the article. The quote is nothing more than a nice literary reference; it's not central to the argument.

The character who spoke the quote is a magically prescient human-sandworm hybrid, thousands of years old, speaking to his distant relative who was specially bred by him to be invisible to the magical prescience, so let's take the quote with a grain of... sand. ;-)

> If we both agree that an astrophysicist may not need to understand things (even in their own domain) to make progress, then we are in agreement.

Your parenthetical remark is actually the main problem!
