Right, there's probably something more subtle going on, like "semantic density within tokens is how models think."
So it's probably true that "Great question!"-style preambles aren't helpful, but there's likely a lower bound on just how primitive a caveman language we can push toward.