doesn’t that presume no value is being delivered by current models?
I can understand applying this logic to building a startup that solves today’s AI shortcomings… but value delivered today is still valuable even if it becomes more effective tomorrow.
On the other hand, deeply understanding how models work and where they fall short, how to set up, organize, and maintain context, and which tools and workflows support that tends to last much longer. When something like the “Ralph loop” blows up on social media (and dies just as fast), the interesting question is: what problem was it trying to solve, and how did it do it differently from alternatives? Thinking through those problems is like training a muscle, and that muscle stays useful even as the underlying technology evolves.
Now, with models improving, context sizes getting bigger, and commercial offerings maturing, I hardly hear about them.
The problem is that so many of these things are AI instructing AI, and my trust rating for vibe-coded tools is zero. It's become a point of pride for the human to be taken out of the loop, and the one thing that isn't recorded is the transcript that produced the slop.
I mean, you have the creator of openclaw saying he doesn't read code at all, he just generates it. That is not software engineering or development, it's brogrammer trash.
How do you both hold that the technology is so revolutionary because of its productivity gains, but at the same time so esoteric that you'd better be on top of everything all the time?
This stuff is all like a weird toy compared to other things I have taken the time to learn in my career. Any sense of expertise people claim comes off to me like a guy who knows the Taco Bell secret menu, or the best set of coupons to use at Target. It's the opposite of intimidating!
This is just wrong. A) It doesn’t promise improvement B) Even if it does improve, that doesn’t say anything about skill investment. Maybe its improvements amplify human skill just as they have so far.
I kinda regret going through the SELU paper back in the late 2010s lol.