How long before those lines cross? Intuitively, it feels like we have about 2-3 years before Claude is better at writing code than most - or all - humans.
And, pray tell, how are people going to come up with such a design?
The other day I tested an AI by giving it a folder of images, each named to describe its content/use/proportions (e.g., drone-overview-hero-landscape.jpg), told it the site it was redesigning, and it did a very serviceable job that would match at least a cheap designer. On the first run, in a few seconds, and with a very basic prompt. Obviously, with an AI that can see images, it could understand the contents itself and skip the naming step easily enough.
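For anyone wanting to try it, the setup is roughly this sketch (the folder name, URL, and prompt wording below are placeholders, not my exact inputs):

    # Minimal sketch: gather the descriptively named images and build a
    # one-shot redesign prompt. Folder, URL, and wording are illustrative.
    from pathlib import Path

    IMAGE_DIR = Path("site-images")            # e.g. drone-overview-hero-landscape.jpg lives here
    SITE_URL = "https://example-drone-co.com"  # the site being redesigned (placeholder)

    image_names = sorted(p.name for p in IMAGE_DIR.glob("*.jpg"))

    prompt = (
        f"Redesign the landing page for {SITE_URL}.\n"
        "Use these images; the filenames describe content, intended use, and proportions:\n"
        + "\n".join(f"- {name}" for name in image_names)
        + "\nReturn a single self-contained HTML/CSS page."
    )

    print(prompt)  # paste into, or pipe to, whichever LLM/agent you're using

The point is how little scaffolding it takes; the filenames carry most of the design intent.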
I think exceptional work, AI tools or not, still takes exceptional people with experience and skill. But I do feel like a certain level of access to technology has been unlocked for people who are smart enough but lack the time or training to dive into the industry's real tools (Figma, code, data tools, etc.).
I think the idea that LLMs will usher in some new era where everyone and their mom are building software is a fantasy.
I am usually a bit of an AI skeptic, but I can already see that this is within the realm of possibility, even if models stopped improving today. I think we underestimate how technical things like Wix or Squarespace are to a non-technical person, yet many of those people are skilled businesspeople who could probably work with an LLM agent to get a simple product together.
People keep saying that code was never the real skill of an engineer, but rather solving business logic problems and codifying the solutions. Well, people running a business can probably do that too, and it would be interesting to see them work with an LLM to produce a product.
They wouldn’t even know where to begin!
Even if all the sandboxing is done right, programs will still be depended on to store data correctly and to produce correct output.