-- we had a terrible time building something so now we're only going to buy things
-- we had a terrible time buying something so now we're only going to build things
-- repeat...
Either way you can have a brilliant success, and either way you can fail abjectly; usually you succeed at most, but not all, of the goals, and it is late and over budget.
If you build you take the risks of building something that doesn't exist and may never exist.
If you buy you have to pay for a lot of structure that pushes risks around in space and time. The vendor needs marketing people, not to figure out what you need, but what customers need in the abstract. Sales people are needed to help you match up your perception of what you need with the reality of the product. All those folks are expensive, not just because of their salaries but because a pretty good chunk of a salesperson's time is burned up on sales that don't go through, sales that take 10x as long as they really should because there are too many people in the room, etc.
When I was envisioning an enterprise product in the early 2010s, for instance, I got all hung up on the deployment model -- we figured some customers would insist on everything being on-premises, some would want to host in their own AWS/Azure/GCP, and others would be happy if we did it all for them. We found the phrase "hybrid cloud" would cause their eyes to glaze over, and maybe they were right, because within five years it became a synonym for Kubernetes. Building our demos we just built things that were easy for us to deploy, and the same would be true for anything people build in house.
To some extent I think AI does push the line towards build.
I’m not opposed to AI, and I’m not bemoaning “vibe coding”. The answer to build vs. buy is still the same: “does it make the beer taste better?” “Do I get a competitive advantage by building instead of buying?”
To a point, but I think this overstates it by quite a bit. At the moment I'm weighing some tradeoffs around this myself. I'm currently making an app for a niche interest of mine. I have a few acquaintances who would find it useful as well but I'm not sure if I want to take that on. If I keep the project for personal use I can make a lot of simplifying decisions like just running it on my own machine and using the CLI for certain steps.
To deploy this for non-tech users I need to figure out a whole deployment approach, make the UI more polished, and worry more about bugs and uptime. It sucks to get invested in some software that then starts constantly breaking or crashing. GenAI will help with this somewhat, but it certainly won't drop the extra coding cost to zero.
Classic Mac OS was designed in 1984 to handle events from the keyboard, mouse, and floppy drive, and adding events from the internet broke it. It was fun using a Mac and being able to get all your work done without touching a command line, but for a while it crashed, crashed, and crashed when you tried to browse the web, until that fateful version where they added locks to stop the crashes -- but then it was beachball... beachball... beachball...
They took investment from Microsoft at their low point, and then they came out with OS X, which is as POSIXy as any modern OS and could handle running a web browser.
In the 1990s you could also run Linux, and at the time I thought Linux was far ahead of Windows in every way. Granted, there were many categories of software, like office suites, that were not available, but installing most software was:
./configure
make
sudo make install
but if your system was unusual (Linux in 1994, Solaris in 2004) you might need to patch the source somewhere.

I started with Windows 98. Didn't experience OS X until 2010. 9 years wasted.
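The "patch the source" step mentioned above usually meant applying a unified diff with the standard `patch` tool before running `./configure`. Here is a minimal, self-contained sketch; the file names and the `HAVE_SELECT`/`HAVE_POLL` macros are invented for illustration, standing in for a real source tree:

```shell
# Create a tiny "source file" standing in for part of a real source tree
cat > config.h <<'EOF'
#define HAVE_SELECT 1
EOF

# A platform-specific fix, distributed as a unified diff
cat > portability.patch <<'EOF'
--- config.h
+++ config.h
@@ -1 +1 @@
-#define HAVE_SELECT 1
+#define HAVE_POLL 1
EOF

# Apply the patch before running ./configure && make && sudo make install
patch -p0 < portability.patch
```

The `-p0` flag tells `patch` to use the file paths in the diff as-is; diffs generated from inside a project directory (e.g. `a/src/config.h`) typically need `-p1` instead.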
I've started tons of scratch-my-own-itch projects. There are adoption, UX, and onboarding costs even if you're the only audience.
TL;DR: I don't even use my own projects. I churn.
Though, the economy does not seem to be in a good spot to try that strategy out right now.