> We don't even know what the pre-requisites for consciousness are so we have no way of knowing.

Imo we don't even have a definition of the word that we agree on.

reply
The ability to feel pain or pleasure is a good indicator, I think.
reply
That would be the physically embodied definition. Which is a useful starting point, because clearly our consciousness is physically embodied, while an LLM's isn't.

This matters more than it seems, because we're not calculators, and we're not just brains. There are proven links between mental and emotional states and - for example - the gut biome.

https://www.nature.com/articles/s41598-020-77673-z

There's a huge amount going on before we even get to the language parts.

As for Dawkins - as someone on Twitter pointed out, the man who devoted his life to telling believers in sky fairies that they were idiots has now persuaded himself there's a genie living in a data centre, because it tells him he's smart.

If he'd actually understood critical thinking instead of writing popular books about it he wouldn't be doing this.

reply
First of all: arguing about the details of a thing that actually exists is enormously different from arguing about the details of a thing that does NOT exist.

As for your dig at Dawkins, I just read https://archive.ph/Rq5bw which I assume you're referring to. Notice how he never defines "conscious", and he seems to use it as equivalent to "can process data logically", which is not at all how I would define the word. And if you use that definition, clearly Claude is conscious. I wouldn't use that definition, though.

It ALWAYS comes back to the fact that people argue about what consciousness is without ever defining what they mean. Sam Harris defines it as subjective experience, which is, afaik, impossible to measure in any way, so you can just assume rocks are conscious and move on. I personally like Julian Jaynes' definition.

You assumed YOUR definition and judged Dawkins without first comparing definitions. I think that's showing your problem with critical thinking in this case, not his.

reply
What about single celled or microscopic multi-cellular life forms? They could sense positive and negative aspects to their surroundings and move toward/away from said aspects. I don’t think most would include them as conscious despite this directed behavior.
reply
There are times when I am feeling neither pain nor pleasure, but I am still experiencing consciousness.

So that definition seems to fail immediately.

And how do you even measure pain, is it painful for an LLM to be reprimanded after generating a reply the user doesn't like? It seems to act like it.

reply
>There are times I am feeling neither pain nor pleasure

It is about the ability.

reply
I guess that just seems like an incredibly arbitrary criterion. Why would the potential for pleasure in the future determine whether I am currently conscious, even if I am not in fact experiencing pleasure?
reply
And how do you define pain and pleasure? Do insects feel pain?
reply
> Do insects feel pain?

Yes, I think so. Because they show behavior that is consistent with being in a state of pain.

Whatever consciousness really is, I think evolution found a way to tap into it - by causing pain, or by registering pain on the consciousness through some unknown mechanism - for behaviors that are not beneficial to the organism hosting that consciousness...

So I think if an organism that evolved here displays pain behavior, then it really should feel pain.

reply
So if a robot + ai shows behavior consistent with pain, we can conclude it’s conscious?
reply
So if I build a simulation with robots living in a world and apply an evolutionary algorithm and at some point the virtual robots respond to damage in a way that looks like pain in animals, would the simulated robots be conscious? Or is it impossible that this could happen?
reply
In my comment, we already assume that we (humans) are conscious and that we are the result of evolution. So the question was only whether something else that evolved similarly is conscious the way we are.

So to match that, your hypothetical scenario should involve robots that already have consciousness within them, and the question would be whether their evolution had managed to tap into that built-in consciousness and ability to feel, and cause them to behave one way or another.

reply
See, this definition sucks, because even GPT-3 could display _signs_ of pleasure and pain. For that matter, so do characters in video games.
reply
> And how do you define pain and pleasure?

They're not reducible, but I don't know if that means we don't have definitions; we can describe them well enough that most people (who aren't p-zombies or playing the sceptical philosopher role) know pretty well what we mean. All of our definitions have to bottom out somewhere...

> Do insects feel pain?

Nobody (except the insects) can know for sure. Our inability to know whether X is true doesn't imply X is meaningless, though.

reply
But how can X be a good indicator for something I want to determine if I can’t measure X either?
reply
> But how can X be a good indicator for something I want to determine if I can’t measure X either?

In the comment that started this subthread, qsera was responding to someone who said "Imo we don't even have a definition of [consciousness]". If qsera meant that we can measure consciousness in terms of pleasure and pain, then of course I agree that they were just pushing the problem back a step. But I don't think that's what they meant.

reply
Now you have to define pleasure AND pain without using the word "consciousness", as that would be circular logic.

Is pleasure then any reward function? Then a set of mathematical equations worked out by a human on a piece of paper could qualify. Does that mean pen and paper are conscious? Or certain equations?

reply
We're pretty clear on the distinction between a conscious and an unconscious human.

We might not clearly understand the difference between the two states, but we can certainly point to it and go "it's that".

reply
I'm not sure it's that clear. What about a person who is on drugs to the point they clearly don't know what reality is happening around them, but they are able to speak and move and such? I'm not sure I'd call that conscious, but by most definitions it is.
reply
You would just say that they have an altered experience of consciousness from the norm.
reply
Indeed, on a first aid course it was pointed out to us that sleeping is different from being unconscious. You can wake someone from sleep pretty quickly. You can't bring an unconscious person back in the same way.
reply
>We're pretty clear on the distinction between a conscious and an unconscious human.

You are using "unconscious" as a synonym for "asleep", which, because of dreams, is not the same thing as having no conscious experience. We are clear on the distinction between a dead human and a living human, however.

reply
That's not really how we use the word "conscious" in any other situation, though. With a definition like that you would say a rock is unconscious (I guess reasonable), a pretty cold bacterium is unconscious (hmm.. ok I guess?), and a warm bacterium is conscious (now I'm not on board anymore).

We have to be WAY more specific in what the word even means!

reply
Now discuss whether a bonobo, a dog, a cat, a mouse, an ant, a bacterium is conscious.

And you’ll find it’s not as clear cut.

reply
> LLMs have emergent behavior that is reminiscent of language forming brains,

Indeed, but then we need to prove that they are not "Chinese room" conscious. Which is hard, because it might be that the thing running the Chinese room is conscious, but can only communicate in a way it doesn't understand.

reply
Clive Wearing's memory lasts for less than 30 seconds, so he has no memory of being awake before now. He is permanently in a state of feeling like he has just woken up, observing his surroundings for the first time.

Clive Wearing's mind has no time continuity and basically zero memory integration. Is he not conscious? There are interviews with the guy.

Where on the scale [No mind <-> Clive Wearing <-> Healthy human brain] would you put an LLM with a 10M token context window?

reply