I can't tell if you're being facetious. But being an embodied consciousness with the ability to create is as divine as it gets. We'd do well to remember.
This is a very, very weak criterion for divinity. If this is truly it, we should prepare with great haste for the arrival of our artificial gods.
Because by this (IMO silly) metric it seems they will be more divine than us.
These entities, whoever they are, act on our world; they are real, and over time they will grow more and more independent from humans, eventually becoming a different species that can self-replicate.
For now they need legs and arms to interact with the physical world, but I am certain that 100 years from now they will be an integral part of society.
I already see today LLMs slowly taking actual legal decisions for example, having real world impact.
Once they get physical bodies, perhaps it will be acceptable to befriend a robot and go on adventures with it. Even, perhaps, to get robosexual?
We are not that far away. If I could have a buddy carry my backpack and drive for me, I'd take it. Already today, not tomorrow.
See, I don't believe that for even one second. They are just very clever calculators, that's all. But they are also dumb as a brick most of the time. It's pretend intelligence at best.
We will only prove humans are not.
The best time to start paying attention was ten years ago, when the first Go grandmaster was defeated by a "pretend intelligence." I sure wish I had.
The next best time to start paying attention is now.
A computer playing Go is intelligent now? Is this the kind of conversation we're having?
>>I sure wish I had.
And how would you have changed your decisions in those last 10 years if you did?
>>The next best time to start paying attention is now.
I am paying attention - I use these tools every day. The whole idea that they are intelligent, and that if only you gave them a robot body they would be normal members of society, is absurd. Despite the initial appearance of genius they are dumb beyond belief; it's like talking to a savant 5 year old, except a 5 year old can actually retain information for more than a brief conversation.
And how would you have changed your decisions in those last 10 years if you did?
I'd have dropped everything else I was doing and started learning about neural nets -- a technology that, for the previous couple of decades, I'd understood to be a pointless dead end.
As for Go, the defeat of Lee Sedol caught my attention in part because a friend and colleague, one of the smartest people I've ever worked with, had spent a lot of time working on Go-playing AI as a hobby. He was strongly convinced that a computer program would never reach the top levels of play, at least not during our careers/lifetimes. The fact that he'd turned out to be wrong about that was unnerving, and it should have done more than "catch my attention," but it didn't.
Today, my graphics card can outdo me at any number of aspects of my profession, and that's more interesting (to me) than anything I've actually done.
...except a 5 year old can actually retain information for more than a brief conversation.
Like I said: it's a good time to start paying attention. Start taking notes, so to speak, like the models are doing now.
In a hypothetical world of "AI can produce a lot of extremely high quality art", you can easily find (or commission) AI art you would absolutely love. But it probably wouldn't be something that anyone else would find a lot of value in?
There will be no AI-generated Titanic. There will be many AI-generated movies that are as good as Titanic, but none will become as popular as Titanic did.
Because when AI has won at art on both quality and quantity, and the quality of the work itself is no longer a differentiator against the sea of other high-quality works, the "narrative/life of the artist" becomes the fallback path to popularity. You will need something beyond "it's damn good art" - an external factor - to make it impactful, to make it stick in the cultural field.
Already a thing in many areas where the supply of art outpaces demand. Pop music, for example, is often as much about manufacturing narratives around the artists as it is about making sound, K-pop being an extreme version of the latter.
I begrudgingly have to admit it is a very good movie.
And here we come back to the age-old question, "can you separate an artist from their art" - because I'd argue that when you watch a movie, you are watching a product of their life.
At least in popular, mainstream culture, the viewer is heavily invested in the identity of the artist. The quality of the "art" is secondary. That's how we get music engineered by committee. And it's how we get paparazzi, People Magazine, and so forth.
On the other hand, this isn't anything new at all. We've had this kind of thing for decades. Real art still manages to survive at the margins.
When I buy art, I have often spoken with the artist in the past couple days, or I am aware of their history and story and how they developed their art as a response to some other movement or artist collective.
It's rare for people to buy art just bc oil paints go brrrrrm
I’m fairly certain the original comment was referring to instances where the artist is the character/primary subject.
But even then – people obviously go watch movies because they like the actor/director involved. It’s not really clear why anyone would care about an AI actor. People want to watch people, not imitations of them.
The rest of your comments seem to be summarized as “it has gotten better and therefore it will eventually solve all problems it has now.” Which may be true in a technical sense, but again this is not taste.
A technical company like Space X really has nothing to do with this conversation, and I think you missed my point about it being uncool. It’s not about critics, it’s about culture at large.
At this point I think identifying a work as AI-created makes people instantly devalue it. We are rapidly approaching the point where no one wants to admit something is AI-created, because it comes with negative perceptions.
Originality comes from humans experiencing the world and interacting with it. What AI tool is a living being interacting with the world? None, of course. Hence the constant generic slop images of Impressionism or some other already-existing art style.
Just look at the images in the link: this is the best they can do? A kangaroo at a cafe in Paris? Could anything be more devoid of good taste?
And we have AI-generated influencers now, e.g. https://www.instagram.com/imma.gram - so why wouldn't people care about an AI the same way they care about people they never meet?
There was a study around this exact thing:
https://mitsloan.mit.edu/ideas-made-to-matter/study-gauges-h...
I suspect we have an underlying disagreement here about the assumption that AI - in general, not necessarily today's models - isn't qualitatively different from the human mind. The claim that "originality comes from humans experiencing the world and interacting with it" isn't an accepted truth, and even today AIs do interact, in a limited sense, with the world - so "None, of course" is questionable. And even if it were true, concluding "Hence... slop..." seems like a jump in reasoning. For example, why don't you think this slop is more like a child's early paintings? Just because today's AIs have limited means to learn in the process?
> I think you missed my point about it being uncool. It’s not about critics, it’s about culture at large.
What about culture at large? The SpaceX analogy was brought up to illustrate that arguments about AI's incapabilities may apply today but not necessarily tomorrow - just as arguments about SpaceX's inability to reach a particular goal quite a few times turned out to be a matter of - not so long - time.
I agree that many AI results today can be uncool. But how do you know this isn't just the uncanny-valley period? How can you know they can't become cool eventually?
> people obviously go watch movies because they like the actor/director involved. It’s not really clear why anyone would care about an AI actor.
Let me stretch a little to illustrate. Imagine AIs with "personal" experiences - experiences that make each AI unique. One of those AIs consistently produces good movies which, if you honestly don't judge by authorship, are actually good. Yes, people may not care about non-existent AI actors, but they may still care about an existent AI author :) . Do you think that's impossible?
> People want to watch people, not imitations of them.
How can you tell the difference? Suppose you're watching a movie with actors who are unfamiliar to you. Would you refuse to watch for that reason alone? You've just come to somebody's party, a movie is playing, and you watch it to the end because it looks interesting, knowing nothing about the producers, actors, etc. You can still talk about the movie - will you be predominantly worried that it's "AI slop" even if it looks great? Suspiciously great, maybe?
> The rest of your comments seem to be summarized as “it has gotten better and therefore it will eventually solve all problems it has now.” Which may be true in a technical sense, but again this is not taste.
It's hard to define taste, to be honest. People can definitely have different tastes, almost by definition. But more importantly - why do you think AI products can't have tastes?
> At this point I think identifying a work as AI-created makes people instantly devalue it. We are rapidly approaching the point where no one wants to admit something is AI-created, because it comes with negative perceptions.
Yes. But doesn't that look like prejudice? Of course we can point to how many times we looked at AI work, didn't get any perceived value out of it, and got annoyed that we spent time and effort for nothing - but what if we mostly do get results from AI works? Do you think that's impossible?
Because it can't feel. Get used to it. It can't feel, and whatever it comes up with would be an imitation of someone real who can feel. So it can generate stuff that caters to a taste, but the thing itself can't have taste.
It is fundamental. Arguing about it all day won't change it.
Every human being is unique, both biologically and experientially. Until an AI can feel and have a lived experience, it can not create art.
Art is not a problem to be solved.
With how much data goes into the frontier systems, and how much of it gets captured by them, an AI might have, in many ways, a richer grasp of human experience than the humans themselves do.
You were only ever one human. An LLM has skimmed from millions. You have seen a tree, and the AI has seen the forest it stands in.
First, "AI is thereby incapable" is a hypothesis, not a fact - how would you prove that you have to "live" to produce art? You might feel this way, you might suggest some correlations - but can you really prove it?
Second, I see no impossibility in AI being - to various degrees - an agent in the world. I think that's already happening: they interact with the world even today, in some limited sense, through our computers and networks, though - today - not many of them actually "learn" from those interactions. But we're in the early days of this, I suspect.
Humans do that a lot, but it's not all we do. Go to a museum that has modern(ish) art. It's pretty incredible how diverse the styles and ideas are. Of course it's not representative of anything - these works were collected and curated exactly because they are not average. But it's still something that humans made.
I think what people can do is have conceptual ideas and then follow the "logic" of those ideas to places they themselves have never seen or expected. Artists can observe patterns, ask how they work and why they have the effect they do and then deliberately break them.
I'm not sure current genAI models do these sorts of things.
You might be right here. Two points, though: first, we don't know whether current AI is actually incapable of anything in particular; we haven't found or proven that. Second, we might develop a different AI approach that is capable of the things you mention. To me, it's way too early to dismiss AIs - at least in principle - on all of this.
The target audiences for art and film are not the same. The latter is far more pop culture. You can't apply them the same way, and the narrative of the artist has been extremely important for decades. People will watch slop movies. They don't pay $30K for slop art. They're paying that for historical importance or, if contemporary, artist narrative.
I'm in fandom spaces, and the prejudice against AI art is overwhelming. I also run in art collecting circles, being somewhat wealthy but not a billionaire. They also care about authenticity.
That is to say, the people who pay for original art and participate in art spaces are generally educated people who actively hate AI. Filmgoers are probably a standard deviation lower in education, and far more willing to part with the cost of one unit of consumption (a $10 ticket) than art buyers are.
AI is a threat to graphic designers and those in their orbit.
The only way I see AI being a threat to professional artists is AI copies of their work. And AI isn't anything new there. I have a friend who gets commissioned by hotels to do one-off pieces for display all over the world. People have been making knockoff pieces of her style and selling them for at least a decade. And that's her lower margin, small pieces made for a couple thousand dollars to hang at your house, not her $100K+ pieces for hotels where they fly her out to supervise reassembly and mounting.
I beg your pardon, but have you heard of Jeff Koons or Kaws?