AI will be a tool, no more, no less. Most likely a good one, but there will still need to be people driving it, guiding it, fixing its mistakes, and so on.
All this talk from CEOs is just that: stock-market pumping. Tech is the most profitable sector and software engineers are costly, so having investors dream about scale plus lower costs is good for the stock price.
All I'm saying is: why debate what AI is (exoskeleton, co-worker, new life form) when its owners' intent is to create a SWE replacement?
If your neighbor is building a nuclear reactor in his shed from a pile of smoke detectors, you don't say "think of this as a science experiment" just because it can't possibly work; you call the police/NRC, because the intent and the actions are what matter.
Only if you're a snitch loser
Let's rewind 4 years to this HN article titled "The AI Art Apocalypse": https://news.ycombinator.com/item?id=32486133 and read some of the comments.
> Actually all progress will definitely will have a huge impact on a lot of lives—otherwise it is not progress. By definition it will impact many, by displacing those who were doing it the old way by doing it better and faster. The trouble is when people hold back progress just to prevent the impact. No one should be disagreeing that the impact shouldn't be prevented, but it should not be at the cost of progress.
Now it's the software engineers' turn to not hold back progress.
Or this one: https://news.ycombinator.com/item?id=34541693
> [...] At the same time, a part of me feels art has no place being motivated by money anyway. Perhaps this change will restore the balance. Artists will need to get real jobs again like the rest of us and fund their art as a side project.
Replace "Artists" with "Coders" and imagine a plumber writing that comment.
Maybe this one: https://news.ycombinator.com/item?id=34856326
> [...] Artists will still exist, but most likely as hybrid 3d-modellers, AI modelers (Not full programmers, but able to fine-tune models with online guides and setups, can read basic python), and storytellers (like manga artists). It'll be a higher-pay, higher-prestige, higher-skill-requirement job than before. And all those artists who devoted their lives to draw better, find this to be an incredibly brutal adjustment.
Again, replace "Artists" with "Coders" and fill in the equivalent replacement role.
So, please get in line and adapt. And stop clinging to your "great intellectually challenging job" because you are holding back progress. It can't be that challenging if it can be handled by a machine anyway.
Is it though? I agree the technology evolving is inevitable, but the race to throw as much money as possible at scaling and marketing before these things are profitable and before society is ready is not inevitable at all. It feels extremely forced. And the way it's being shoved into every product to juice usage numbers seems to back me up: it's all premature and rushed, and most people don't really want it. The bubble comes from investing way more money in datacenters and GPUs than anyone can possibly pay for or build, and there's no evidence there's even a market for using that capacity!
It's funny you bring up artists, because I used to work in game development and I've worked with a lot of artists, and they almost universally HATE this stuff. They're not like "oh thank you Mr. Altman", they're more like "if we catch you using AI we'll shun you." And it's not just producers, a lot of gamers are calling out games that are made using AI, so the customers are mad too.
You keep talking about "progress", but "progress" towards what exactly? So far these things aren't making anything new or advancing civilization; they're remixing stuff we already did well before, just sloppily. I'm not saying they don't have a place -- they definitely do, they can be useful. My argument is against the bizarre hype machine and what sometimes seems like sock puppets on social media. If the marketing were just "hey, we have this neat AI, come use it", I think there'd be a lot less backlash than with people saying "Get in line and adapt".
> And stop clinging to your "great intellectually challenging job" because you are holding back progress.
Man, I really wish I had the power you think I have. Also, I use these tools daily, I'm deeply familiar with them, I'm not holding back anyone's progress, not even my own. That doesn't mean I think they're beyond criticism or that the companies behind them are acting responsibly, or that every product is great. I plan to be part of the future, but I'm not just going to pretend like I think every part of it is brilliant.
> It can't be that challenging if it can be handled by a machine anyway.
This will be really funny when it comes for your job.
The only way generative AI has changed the creative arts is that it's made it easier to produce low quality slop.
I would not call that a true transformation. I'd call that saving costs at the expense of quality.
The same is true of software. The difference is, unlike art, quality in software has very clear safety and security implications.
This gen AI hype is just the crypto hype all over again but with a sci-fi twist in the narrative. It's a worse form of work just like crypto was a worse form of money.
And, bizarrely, I've really not bought any since. It's diminished my desire for the brand.
Gen AI is the opposite of crypto. The use is immediate, obvious and needs no explanation or philosophizing.
If you have never learned anything from gen AI, you are basically showing your hand that you have zero intellectual curiosity, or you are delusional about your own abilities.
E.g., try to make any image-generating model take an existing photo of a humanoid and change it so the character does a backflip.
It's also interesting to generate images in a long loop, because doing so usually reveals patterns in the training data.
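If you want to try the loop experiment yourself, here is a minimal sketch, assuming the Hugging Face diffusers library and a Stable Diffusion checkpoint; the model ID, prompt, and loop count are illustrative placeholders, not whatever setup the commenter had in mind.

    # Minimal sketch: generate many images of one prompt in a loop and look for
    # recurring compositions, poses, or watermark-like artifacts, which hint at
    # patterns in the training data. Model ID and prompt are assumptions.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed checkpoint; any text-to-image model works
        torch_dtype=torch.float16,
    ).to("cuda")

    prompt = "a portrait of a scientist in a laboratory"  # hypothetical prompt

    for i in range(100):
        image = pipe(prompt, num_inference_steps=30).images[0]
        image.save(f"sample_{i:03d}.png")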
Outside these distractions, I've never found generative AI useful. And I'm currently working in AI research.
It’s always the people management stuff that’s the hard part, but AI isn’t going to solve that. I don’t know what my previous manager’s deal was, but AI wouldn’t fix it.
Historically, when SWEs became more efficient, we just started making more complicated software (and SWE demand actually increased).
In times of uncertainty, when things are going south, that changes to "we need as few SWEs as possible", hence the current narrative: everyone is looking to cut costs.
Had GPT-3 emerged 10-20 years ago, the narrative would have been "you can now do 100x more thanks to AI".