> That's the distinction I'm talking about, that's the threat to software industry, and it doesn't take "true AI" - the LLMs as we have today are enough already.

They most certainly are not. With the current state of LLMs, anyone who puts them in charge of things is being a fool. They have zero intelligence, zero ability to cope with novel situations, and even on tasks represented in their training data they do worse than a typical skilled practitioner would. Right now they are usable only for tasks where you don't care about the quality of the result.

> and it doesn't take "true AI" - the LLMs as we have today are enough already.

I believe relatively few people would agree with you on that point. LLMs aren't good enough (yet?), and very obviously so IMO, to act as autonomous problem solvers for the vast majority of problems software companies are solving today.

What you lose is control. Even with an actually-intelligent agent, if you task a subordinate with producing a document for you, they will come up with something different from exactly what you had in mind. If they are really good, they might even surprise you and do a better job than you'd have done yourself, but it will still be their vision, not yours.

Your notion of a "mortal wound" to the software industry seems to assume that today's SaaS portals are the only form the industry can take, and that great software reduces to "tool calls for agents". Those human agents who care about getting exactly the result they want will not be keen on giving up Photoshop for Photoshop-but-with-an-AI-in-front-of-it.
