This matters more than it seems, because we're not calculators, and we're not just brains. There are proven links between mental and emotional states and - for example - the gut biome.
https://www.nature.com/articles/s41598-020-77673-z
There's a huge amount going on before we even get to the language parts.
As for Dawkins - as someone on Twitter pointed out, the man who devoted his life to telling believers in sky fairies that they were idiots has now persuaded himself there's a genie living inside a data centre, because it tells him he's smart.
If he'd actually understood critical thinking instead of writing popular books about it, he wouldn't be doing this.
As for your dig at Dawkins, I just read https://archive.ph/Rq5bw which I assume you're referring to. Notice how he never defines "conscious"; he seems to use it as equivalent to "can process data logically", which is not at all how I would define the word. And if you use that definition, then clearly Claude is conscious. I wouldn't use that definition though.
It ALWAYS comes back to the fact that people argue about what consciousness is without ever defining what they mean. Sam Harris defines it as subjective experience, which is, afaik, impossible to measure in any way, so you can just assume rocks are conscious and move on. I personally like Julian Jaynes' definition.
You assumed YOUR definition and judged Dawkins without first comparing definitions. I think that's showing your problem with critical thinking in this case, not his.
So that definition seems to fail immediately.
And how do you even measure pain? Is it painful for an LLM to be reprimanded after generating a reply the user doesn't like? It seems to act like it.
It is about the ability..
Yes, I think so. Because they show behavior that is consistent with being in a state of pain.
Whatever consciousness really is, I think evolution found a way to tap into it, by causing pain, or by registering pain on the consciousness through some unknown mechanism, for behaviors that are not beneficial to the organism hosting the respective consciousness...
So I think if an organism that evolved here can display pain behavior, then it really should feel pain.
So to match that, your hypothetical scenario should involve robots that already have consciousness within them, and the question would be whether their evolution had managed to tap into that built-in consciousness and ability to feel, and cause them to behave one way or another.
They're not reducible, but I don't know if that means we don't have definitions; we can describe them well enough that most people (who aren't p-zombies or playing the sceptical philosopher role) know pretty well what we mean. All of our definitions have to bottom out somewhere...
> Do insects feel pain?
Nobody (except the insects) can know for sure. Our inability to know whether X is true doesn't imply X is meaningless, though.
In the comment that started this subthread, qsera was responding to someone who said "Imo we don't even have a definition of [consciousness]". If qsera meant that we can measure consciousness in terms of pleasure and pain, then of course I agree that they were just pushing the problem back a step. But I don't think that's what they meant.
Is pleasure then any reward function? Then a set of mathematical equations worked out by a human with pen and paper could qualify. Does that mean pen and paper are conscious? Or certain equations?
We might not clearly understand the difference between the two states, but we can certainly point to it and go "it's that".
You are using unconscious as a synonym for asleep, which is not the same thing as having no conscious experience, given dreams. We are clear on the distinction between a dead human and a live human, however.
We have to be WAY more specific in what the word even means!
And you’ll find it’s not as clear cut.