Oh, so you mean it would be in a harness of some sort that lets it connect to sensors telling it things about its position, speed, balance, etc.? Well, yes, but then it isn't an LLM anymore, because it has more than language to model things!
You know transformers can do math, right?
You're asking someone to answer this question in a text forum. This is not quite the gotcha you think it is.
The distinction between "knowing" and "putting into language" is a rich source of epistemological debate going back to Plato, and it is still widely regarded as a particularly difficult philosophical conundrum. I don't see how you can make this claim with so much certainty.
Riding a bike is, broadly, learning to coordinate your muscles in response to visual data from your surroundings and to signals from your vestibular and tactile systems about your movement, orientation, speed, and control. Since LLMs only output tokens that represent text, by definition they can NEVER learn to ride a bike.
Even setting aside that glaring definitional issue, an LLM also can't learn to ride a bike from books written by humans for humans, because an LLM could only operate through a machine using e.g. pistons and gears to manipulate the pedals. That system would be governed by physics and mechanisms different from a human body's, and it wouldn't have the same sensory information, so almost no human-written information about (human) bike-riding would be useful or relevant for this machine to learn how to ride. It would just have to do reinforcement learning, with appropriate rewards and punishments for balance, speed, and falling.
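To make the "rewards and punishments" point concrete, here is a toy sketch of what a shaped reward for such a machine might look like. Everything in it is hypothetical (the BikeState fields, the target speed, the penalty weights are all made up for illustration); it only shows the general shape of rewarding balance and speed while punishing falls, not any real system.

    # Toy reward shaping for a hypothetical simulated bike agent.
    # Assumes the simulator reports tilt (radians from vertical),
    # forward speed (m/s), and whether the bike has fallen over.
    from dataclasses import dataclass

    @dataclass
    class BikeState:
        tilt: float      # lean from vertical, radians (0.0 = upright)
        speed: float     # forward speed, m/s
        fallen: bool     # True once the bike has tipped past recovery

    def reward(state: BikeState, target_speed: float = 4.0) -> float:
        if state.fallen:
            return -100.0                      # large punishment for falling
        balance_bonus = 1.0 - abs(state.tilt)  # more upright -> more reward
        speed_penalty = 0.1 * abs(state.speed - target_speed)
        return balance_bonus - speed_penalty

    # An upright bike near the target speed earns close to +1 per step;
    # a fallen bike takes a large one-off penalty.
    print(reward(BikeState(tilt=0.05, speed=3.8, fallen=False)))
    print(reward(BikeState(tilt=0.6, speed=1.0, fallen=True)))

Note that nothing in a function like this references human-written prose about bike-riding at all, which is the point: the learning signal comes from the machine's own dynamics.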
And if we could embody AI in a sensory system so similar to the human one that text about bike-riding might plausibly be useful to it, it might also be that, for exactly the same reasons, the AI learns to ride just as well by simply hopping on the thing, and the textual content is as useless to it as it is to us.
Thinking this is an obvious gotcha (or the later comment that anyone thinking otherwise is going to have egg on their face) is just embarrassing.