I’m not sure I believe that consciousness emerges from sensory experience, but if it does, LLMs won’t get it.
Surely "having senses" is predicated more on "being able to sense the world around you" than "having a body."
> Does my installation of StarCraft have consciousness?
Can your installation of StarCraft take in information about the world and then reason about its own place in that world?
(I'm still not sure that makes them conscious, or whether we can even determine that at all, but I don't think that's a fair argument.)
Conflating senses with cognitive awareness of sensory input is a mistake.
Edit: what they don’t have, obviously, is a hard-wired twitch reflex, where the brain is largely bypassed and muscles react to extreme temperature differentials independently of conscious thought. But I don’t think that defines consciousness either. Ants instinctively run away from flames too.
Your best argument is that the weights are fixed, since that means it’s not a system that can self-reflect and alter its experience. But I don’t see why that is necessary to have an experience. It seems that I can sense a light and feel its warmth regardless of whether my neurons change. One experience being identical to another doesn’t mean neither was an experience.
LLMs do not have a self. This is like arguing that the algorithm responsible for converting ripped YouTube music videos to MP3s has a consciousness.
Can such an algorithm reason about itself in relation to others?
No, but an LLM doesn't do that either. An LLM is an algorithm for generating text output that can simulate how humans describe reasoning about themselves in relation to others. Humans do that by using words to describe what they internally experienced. LLMs do it by calculating the statistical weight of linguistic symbols based on a composite of the human-generated text samples in their training data.
LLMs never experienced what their textual output is describing. It's more similar to a pocket calculator calculating symbols in relation to other symbols, except scaled up massively.
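To make that concrete, here is a deliberately crude sketch: a toy bigram model in Python, nothing like a real transformer, but it shows what "calculating the statistical weight of linguistic symbols" means in the simplest case. The program emits plausible-looking word sequences without ever having experienced anything the words refer to.

```python
# Toy illustration (NOT an actual LLM): a bigram model that picks each next
# word purely from co-occurrence counts in its "training data". It has no
# referent for any word; it only relates symbols to other symbols.
import random
from collections import defaultdict

corpus = ("i think therefore i am . "
          "i feel the warmth of the light .").split()

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to its observed frequency."""
    words, weights = zip(*counts[prev].items())
    return random.choices(words, weights=weights)[0]

# Generate ten words of "text" from the statistics alone.
word = "i"
output = [word]
for _ in range(10):
    word = next_word(word)
    output.append(word)
print(" ".join(output))  # e.g. "i feel the warmth of the light . i am ."
```

A real LLM replaces the bigram counts with a neural network over billions of parameters, but the relationship between the system and its symbols is the same in kind: statistics over text, not experience of what the text describes.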
That they do it at all is the point, and is what separates them from MP3 encoding algorithms. The "how" doesn't seem to me to be as important as you're suggesting.
You asked a hypothetical above about a different algorithm, and now we've established why that comparison was reductive.
> LLMs never experienced ...
What is experience beyond taking input from the world around you and holding an understanding of it?
How do you know other humans do?
I merely object to the notion that we know how to tell who or what has a consciousness.
I do not pretend. I asked honest questions that clearly neither you nor the previous person is able to answer.