FWIW it feels like GH Copilot is a cheaper version of OpenRouter but with trade-offs like being locked into VSCode and the Microsoft ecosystem overall. I already use VSCode though and otherwise I don't see much downside to using GH Copilot outside of that.
I also wouldn’t say you’re locked into Microsoft’s ecosystem. At work we just have skills that allow for interaction with Bitbucket and other internal tooling. You’re not forced to use GitHub at all.
https://github.blog/changelog/2026-01-16-github-copilot-now-...
You can use GH Copilot with most JetBrains IDEs.
In general I view the VS Code and Visual Studio Community + SQL Server free universe as the most effective option :) I think these products are actually great.
Cancelled the plan I had with them and happily went back to just coding like normal in VSCode with occasional dips into Copilot when a need arose or for rubber ducking and planning. Feels much better as I'm in full control and not trusting the magic black box to get it right or getting fatigue from reading thousands of lines of generated code.
Anyone who says they're able to effectively review the thousands of lines that Claude might slop out in a day is lying to themselves.
The amount you can review before burning out is now the reasonable limit, for the same reason that a car is supposed to stay at the speed you can handle and not the max speed of the engine.
Of course, many people are secretly skipping reviews and some dare to publicly advocate for getting rid of them entirely.
I realized this is the crux of our moment, because a variant of Amdahl's law applies to AI code gen.
{time gained} = {time saved via gen AI} - {time spent in human review}
There's no way that results in a positive number with 100% human review coverage, which means that human review coverage is headed to < 100% (ideally as low as possible).
The question is whether humans can sensibly judge the break-even point and not generate faster than that. It's very easy to get lost in the woods and suddenly have a bunch of generated stuff you no longer grok.
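The trade-off above can be sketched numerically. This is a minimal illustration of the commenter's formula, not a model anyone proposed; all hours and the `coverage` parameter are made-up assumptions for the sake of the example:

```python
def time_gained(hours_saved_by_gen, review_hours_full, coverage):
    """Net hours gained when only `coverage` (0..1) of the
    generated code actually gets human review."""
    return hours_saved_by_gen - coverage * review_hours_full

def break_even_coverage(hours_saved_by_gen, review_hours_full):
    """Review coverage at which the net gain drops to zero."""
    return min(1.0, hours_saved_by_gen / review_hours_full)

# Illustrative numbers: generation saves 4h of writing, but fully
# reviewing its output would cost 6h of reading.
print(time_gained(4, 6, 1.0))     # -2.0: 100% review is a net loss
print(break_even_coverage(4, 6))  # ~0.667: above that coverage you lose time
```

With those (assumed) numbers the only way to come out ahead is to review less than about two-thirds of the output, which is exactly the pressure toward sub-100% review coverage the comment describes.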
As we know with driving, sensible drivers stick to the speed limit most of the time, but there's a good percentage of knuckle draggers who just love speeding, some people get drunk, and some just drive the wrong way down the highway entirely. Either way it's usually the sensible people who end up suffering.
It's likely you didn't learn how to use the tool properly, and I'd suggest 'trying again' because soon not using AI will be tantamount to digging holes with shovels instead of using construction equipment. Yes, we still need our 'core skills', but we're not going to be able to live without the leverage of AI.
Yes - AI can generate slop, and probably too many engineers do that.
Yes - you can 'feel a loss of control' but that's where you have to find your comfort zone.
It's generally a bad idea to produce 'huge amounts of code' - unless it's perfectly consistent with a design, and the architecture is derived from well-known conventions.
Start by using it as an 'assistant' aka research, fill in all the extra bits, and get your testing going.
You'll probably want to guide the architecture, and at least keep an eye on the test code.
Then it's a matter of how much further 'up' you can go.
There are few situations in which we should be 'accepting' large amounts of code, but some of it can be reviewed quickly.
The AI, already now in 2026, can write better code than you at the algorithmic level - it will be tight, clean, 'by the book', and far less likely to have errors.
It fails at the architectural and modular level still, that will probably change.
The AI 'makes a clean cut' in the wood, tighter to the line than any carpenter could - like a power tool.
A carpenter who does not use power tools is an 'artisanal craftsperson', not really building functional things.
This is the era of motor cars, there is really no option - I don't say that because I'm pro or anti anything, AI is often way over-hyped - that's something else entirely.
It's like the web / cloud etc. it's just 'imminent'.
So try again, experiment, stay open minded.
To use your own analogy, there's plenty of carpenters still around for when someone needs something doing properly and bespoke, even though we can all go to Ikea, or any other flat pack furniture company, to get wobbly furniture cheaply at any time.
I'd rather be the last carpenter charging a liveable wage, working on interesting problems for clients who appreciate a human touch than just pumping out mountains of slop to keep up with the broligarchy. If that makes me ignorant that's fine, but I'll be happily enjoying the craft while you're worrying about your metrics.
Or in other words - 'non existent'.
It is arrogant and Luddite to suggest that 'using AI is not doing it properly' or that anyone will care.
They care that it's done well - that's it.
FYI, the code that AI produces is probably better than what you produce - at least at a functional level.
'Artisanality' is worthless in 'code' - there are no 'winding staircases' for us to custom build, as a master carpenter would.
Where you can continue to 'write code by hand' is for very arcane things, but even then you're still going to have to use AI for a lot of things in support of that.
So if you want to get into compiler design - sure.
But still - without mastery of AI, you'll be left behind.
At least with horses, there's a naturalist component; with 'code', nobody cares at all. There's zero interest in it, there's no 'organic' angle to sell.
If you want to have a race to the bottom and be Sam Altman's lap dog, that's your business.
Your average 50 year old business owner doesn't understand AI at all and doesn't care to know, he's too busy thinking about getting a new order for 5000 widgets that he invented. What he needs is a website with inventory management, some sort of email marketing software, some sort of CRM, maybe a dashboard or something. What he wants to do is pick up the phone to someone and get them to take care of it for a reasonable price.
AI is coming for programmers with no social skills, but it isn't coming for the human relationship side of the business where you need to have a few meetings to work out what they want to achieve, build a plan that works long term, have a call with other third parties or their vendors etc to alleviate pain points and then build a project around the business needs that won't crash every five minutes and leak their internal information because Claude decided security was optional.
Half my job is understanding what they need and then instead of accepting their original scope, building a brand new scope in collaboration with them to fit the business needs long term. If one of these guys just wants to plow the original scope into Claude and let it rip then the customer isn't getting what they need.
In 2005, Tim Bryce wrote that programmers were by and large a lazy, discipline-averse lot who are of average intelligence at best but get very precious about their "craft", not realizing that it's only a small part of a greater whole and it's the business people who drive actual value in a company. AI is proving him 100% correct.
You forget that templates and off the shelf SAAS products have been around forever and yet I'm still here getting work because there's always a catch and it always shits the bed.
You suggest I must have a user/skill issue, but the AI can't be trusted: I had to explain multiple times to Claude during my work that it had left a very obvious security hole in a controller and in a different policy. Stop pretending it's some sort of superintelligence - they can't even do a timer, bro, and OpenAI is laughing at you while taking your money.
Is it some kind of fear or doubt? It's a strange phenomenon.
Like for example I strongly believe Typescript is better than Javascript and needs to be used instead for any serious project. But if someone says they don't like it, I cannot imagine myself writing a post like yours about it. First of all I don't care what they use, but second of all if I really wanted to convince them it would not look anything like this. Your post and many like it reads like anger and condescension and incredulity.
The codebase disconnect is real.
We are like blue-collar workers who need to hit the gym to maintain the body that our caveman ancestors could maintain by doing their daily duties.
Codebase gym sessions might become a thing.
Of course it is. Returns are diminishing, AGI isn't happening with current techniques but it is good enough to sell, so it's time to monetize. I just got an email from OpenAI as well about ads in their free tier (I signed up once out of curiosity).
It's true AGI is 'not happening' but it doesn't matter.
Demand for AI is explosive, sales are skyrocketing.
We have another 5-8 years of this crazy investment stuff.
Altman will step aside before they turn into a 'normal company'.
Like they did at Uber.
Or perhaps it was a scam in the first place for an IPO.
- Plus is still the same $20
- 20x Pro is still the same $200
- There's a new 5x tier at $100
https://help.openai.com/en/articles/9793128-what-is-chatgpt-... is probably a better direct comparison of the 3
You can't just say because they've added more things the old things are over - the old things actually have to go away first. Eventually they may get there (or not). It may be another few years (or not). Nothing is actually now over though any more than it was now over in 2024.