An LLM is more like the unconscious part of my brain. It’s my gut. It shits out answers using an ungodly amount of parallel processing and it’s often right.

But it also hallucinates thoughts and beliefs, and that’s where the conscious parts have to intervene.

But the conscious parts are expensive to run, and I can’t multitask with them.

The conscious parts also degrade first when I don’t get enough sleep.

reply
>It seems obvious that a humanoid robot system or other truly general-purpose AI will need a stack of model types that work in concert.

I don't think that much of AI today is obvious, so I'm suspicious of anything that is "obvious" about the future.

reply
OK AI user.

Did it truly take someone else externalizing the mechanics of cognition into a machine for you to notice them and become interested in them?

And then to remain focused on the machine that you see, rather than the machine that you are.

Pitiful.

reply