I feel this way too. Wish I could fully understand the 'why'. I know all of the usual arguments, but nothing seems to fully capture it for me - maybe it's all of them, maybe it's simply the pace of change and having to adapt quicker than we're comfortable with. Anyway, best of luck from someone who understands this sentiment.
reply
Really? I think it's pretty straightforward, at least for me - fear of AI replacing my profession and also fear that it will become harder to succeed with a side project.
reply
Yeah, I can understand that, and sure, this is part of it, just not all of it. There are also broader societal issues (i.e. inequality), personal questions around meaning and purpose, and a sprinkling of the existential (but not much). I suspect anyone surveyed would have a different formula for what causes this unease - I struggle to define it (yet think about it constantly), hence my comment above.

Ultimately, when I think deeper, none of this would worry me if these changes occurred over 20 years - societies and cultures change and are constantly in flux, and that includes jobs and what people value. It's the rate of change, and the inability to adapt quickly enough, that overwhelms me.

reply
I have some of those too, to a limited extent.

Not worried about inequality, at least not in the sense that AI would increase it - I'm expecting the opposite. Being intelligent will become less valuable than it is today, which will make the world more equal, but it may not be a net positive change for everybody.

Regarding meaning and purpose, I have some worries here too, but can easily imagine a ton of things to do and enjoy in a post-AGI world. Travelling, watching technological progress, playing amazing games.

Maybe the unidentified cause of unease is simply the expectation that the world is going to change and we don't know how and have no control over it. It will just happen and we can only hope that the changes will be positive.

reply
> fear of AI replacing my profession

See, I don't have any of this fear. I have zero concerns that LLMs will replace software engineering, because the bulk of the work we do (which isn't code) is not at risk.

My worries are almost purely personal.

reply
Thank you, thank you - misery loves company lol! I haven't fully pinned down the exact cause either; it's an ongoing journey.
reply
I felt this way from a year ago up until February 2026. Claude Code and Codex becoming the norm cemented for me that a lot of the projects people are working on (including mine) are totally obsolete. As far as I'm concerned, most code is now abstracted away, and people only want better agents - not traditional software products, except as infrastructure or platforms.

It also looks like the final form of the AI roll-out: whatever the model or application, this is the era of agents, and probably, in the near future, of mostly automated agents. We'll see an overflow of bespoke automation and in-house agents doing everything from personal task management to enterprise business processes, so releasing a "Personal Fitness Tracker" or a "CRO Auditor" in 2026 doesn't make any sense.

All of my anxiety around it has evaporated because I can see what it actually is: an ouroboros of AI output generating automation of more AI output. What most software engineers will be working on now is guiding that output, making it easier to inspect/configure it, optimizing it, and improving the consumer and developer experience.

Otherwise, we just have to drop our old concepts for projects and work on something else.

For the consumer the floor is rising, and for the experienced developer the ceiling is rising. I personally hate web dev anyway, and I'm glad I can work on interesting engineering problems (even with the help of an AI) instead of having to manually stitch together yet another REST API, or website, or service pipeline.

reply
Why? Good anxiety or bad?
reply