* They are growing up in a climate that is worse than any prior generation faced, and it is still getting worse.
* In the US, they are growing up in a time with less upward mobility and more economic inequality than the previous several generations had.
* Trust in social institutions and government is crumbling before their eyes.
* Blue collar jobs are already gone and white collar jobs have no certainty because of AI. Almost all of the money has already been sucked out of artistic professions and what little is left is quickly evaporating because of AI.
Imagine you're 17 like my daughter and trying to decide what to major in at college. You want to pick something that you think is likely to give you some kind of decent career and sense of stability. What do you pick?
Because, I'll tell you, she asks me and I have no fucking idea what to say.
This isn't true at all. There's never been a better time to be in the trades.
Those days of grinding on some grad school maths homework until insight finally struck.
Figuring out how to configure and recompile the Linux kernel to get a sound card driver working, hitting roadblocks, eventually succeeding.
Without AI on a gnarly problem: grind grind grind, try different thing, some things work, some things don't, step back, try another approach, hit a wall, try again.
This effort is a feature, not a bug: it's how you experientially acquire skills and understanding. E.g. with the Linux kernel: I learnt about Makefiles, learnt about GCC flags, improved my shell skills, etc.
With AI on a gnarly problem: It does this all for you! So no experiential learning.
I would NOT have had the mental strength in college / grad school to resist. Which would have robbed me of all the skill acquisition that now lets me use AI more effectively. The scaffolding of hard skill acquisition means you have more context to be able to ask AI the right questions, and what you learn from the AI can be bound more easily to your existing knowledge.
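For reference, the kind of kernel rebuild being described looks roughly like this. A sketch only: the exact paths, make targets available, and where the driver lives in menuconfig vary by kernel version and distro.

```shell
# Old-school workflow: enable a sound card driver and rebuild the kernel.
# Paths and steps are illustrative, not a definitive guide.
cd /usr/src/linux

# Start from the running kernel's config if the distro ships one
cp /boot/config-"$(uname -r)" .config
make olddefconfig

# Enable the driver interactively (e.g. under Device Drivers -> Sound)
make menuconfig

# Build the kernel and modules, then install
make -j"$(nproc)"
make modules_install
make install
```

Each of those steps is a place to hit a wall (missing headers, wrong GCC flags, a module that won't load) and that grinding is exactly the experiential learning the comment is pointing at.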
The problem is: (almost) nobody does that. You'll just ask Claude Code to fix the build, go grab a coffee and come back with everything working.
It's like the difference between hand-made furniture and IKEA.
Until OpenAI etc need to turn a profit.
Now, part of me thinks "isn't refusing to let students have AI like refusing to let them have a calculator?" On the other hand, if I just let the AI do the exam, well, I don't really need the student at all, do I?
Like years of manually studying, fixing and reviewing code is experience that only pre ~2020 devs will have.
The intuitive/tacit knowledge that lets you look at code and "feel" that something is off with it cannot really be gained while using Claude Code; it only comes from 1000s of hours of tinkering.
It will suck if the job shifts to reviewing and owning whatever an LLM spits out, but I don't really know how effective new juniors are going to be.
True. Pretty soon, pre-AI devs may be the COBOL/Fortran engineers of this era: niche and hard to replace.