I know we've been primed by sci-fi movies and comic books, but like PyTorch, GPT-5.2 is just a piece of software running on a computer, instrumented by humans.
>I know we've been primed by sci-fi movies and comic books, but like PyTorch, GPT-5.2 is just a piece of software running on a computer, instrumented by humans.
Sure
Do you really want to be treated like an old PC (dismembered, stripped for parts, and discarded) when your boss is done with you (i.e. not treated specially compared to a computer system)?
But I think if you want a fuller answer, you've got a lot of reading to do. It's not like you're the first person in the world to ask that question.
Not an uncommon belief.
Here you are saying you personally value a computer program more than people
It exposes a value that you personally hold and that's it
That is separate from the material reality that all this AI stuff is ultimately just computer software. It's an epistemological tautology in the same way that, say, a plane, a car, and a refrigerator are all just machines: they can break, need maintenance, take expertise, and can be dangerous.
LLMs haven't broken those categorical constraints - movies and entertainment have just primed you to expect that such a thing should be different.
I hate to tell you, but most movie AIs are just allegories for institutional power. They're narrative devices about how callous and indifferent power structures are to our underlying shared humanity.
(In the hands of leading experts.)
The humans put in significant effort and couldn’t do it. They didn’t then crank it out with some search/match algorithm.
They tried a new technology, modeled (literally) on us as reasoners, that is only just becoming able to reason at their level, and it did what they couldn't.
The fact that the experts provided critical context for the model doesn't make the model's performance any less significant. Collaborators always provide important context for each other.