One solution is not to advance anything, of course. I'm not even joking: is there going to be a successor to React? I suspect not. With the vast amount of training data for React now, it's going to look silly to move to something else with less support. What was the last new popular programming language? Rust? Will there be another one? I suspect not, for the same reason. The irony of all this AI acceleration talk is that it'll work best if we don't accelerate the underlying tech at all.
reply
There probably won't be new stuff so much as trends in how stuff is done, and updates around optimizing those trends.
reply
Will programming languages evolve into less human-oriented written code and more just calls to a trusted AI?

Or will human-readable code become less and less of a thing as AI learns its own, more terse language for talking to other AIs?

reply
Yes. I am seeing a big push to use vanilla JS for single-file HTML apps that are easy to build, deploy, and distribute because they have no build step. I could see component libraries emerging that make it easier to build from chat interfaces with less ceremony.
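For the curious, a minimal sketch of what that no-build pattern can look like (the names and structure here are my own invention, not from any particular library): the whole "app" lives in one HTML file, and "components" are just plain template functions returning HTML strings.

```javascript
// Hypothetical sketch of a no-build, single-file-app style component:
// plain functions returning HTML strings -- no JSX, no bundler.
function todoItem(text, done) {
  // Each item is a list element; the "done" flag toggles a CSS class.
  return `<li class="${done ? "done" : ""}">${text}</li>`;
}

function todoList(items) {
  // Compose child "components" by string concatenation.
  return `<ul>${items.map((i) => todoItem(i.text, i.done)).join("")}</ul>`;
}

// In a browser you'd drop this straight into the page, no build step involved:
// document.body.innerHTML = todoList([{ text: "ship it", done: false }]);
```

The appeal is exactly that there is nothing between the source and the running page: one file, viewable in any browser, trivially pasteable out of a chat interface.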
reply
I'm not sure the tradeoff in code readability is worth it as of now.
reply
Name/post content combo on point
reply
A lot of the language work is scratching the itch of engineers and developers. I think you're correct, and React is the new COBOL.
reply
Humans are notoriously bad at predicting the future. To that point, your prediction is laughable. React is the end-all be-all of UI… lol
reply
Programmers won't be allowed to exist in the future. Vibe coding is the only solution people will be able to apply.
reply
Nobody is unaware of the knowledge cutoff, and sharing the Wikipedia article is not helping anyone. Your point is easily rebutted by taking whatever open weights/source model has an outdated cutoff and training or fine tuning it on more data, which is again always going to be viable given a modicum of compute
reply
You could learn how to code...a whole generation did it before...
reply
I genuinely don't understand how this can possibly be a problem long term.

It feels very obvious that the solution is to have a smaller model that can be trained exclusively on Java information to augment the older model. If the architecture doesn't support it currently, then that's what the architecture will look like in the future.

Otherwise you'd be arguing that, to serve users who want an up-to-date LLM on topic X, you have to train the model on everything all over again.

It's simply ludicrous to have a coding LLM that needs to be retrained on the latest published poems and pastry recipes to generate Java.

reply
>Coding, one of the most popular uses cases today, would not be great if it say only understood java to a version from years ago etc.

This LLM trained only and entirely on pre-1930s texts was able to code Python programs when given only a short example:

https://talkie-lm.com/introducing-talkie

reply
Small models are more useful for "doing stuff" than "knowing stuff" to begin with. Add in an agentic harness and a small model can happily read more current information on demand (including from e.g. a local wikipedia snapshot).
reply
Ha yes I used to think this was not a notable issue, but just today I was getting qwen 3.5 to fix my network drivers and it immediately freaked out like: "kernel 6.17, what the fuck? that doesn't exist yet!". It almost had a mental breakdown over that detail and derailed the conversation towards checking what's wrong with the kernel version reporting lol.
reply