It's debatable whether it truly understands what it's doing, but the argument usually assumes that it does, at least in the sense that it can imagine outcomes and form plans to reach its singular goal, making it a simple toy example of a misaligned system.