"The MLP (multilayer perceptron) is a two-layer feed-forward network: project up to 64 dimensions, apply ReLU (zero out negatives), project back to 16"
Which starts to feel like "draw the rest of the owl" territory indeed.
I think the whole thing could be expanded to walk through each step in greater depth.
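As a starting point, the quoted description maps almost line-for-line onto code. This is a minimal sketch in pure Python (no framework), assuming a 16-dimensional input as implied by "project back to 16"; the weight values are random placeholders, not anything from the original.

```python
import random

def mlp_forward(x, W1, b1, W2, b2):
    # Project up: 16 -> 64 (matrix-vector product plus bias)
    h = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(W1, b1)]
    # ReLU: zero out negatives
    h = [max(0.0, v) for v in h]
    # Project back down: 64 -> 16
    return [sum(w * hi for w, hi in zip(row, h)) + b for row, b in zip(W2, b2)]

# Placeholder parameters, randomly initialized for illustration
random.seed(0)
d_in, d_hidden = 16, 64
W1 = [[random.gauss(0, 0.1) for _ in range(d_in)] for _ in range(d_hidden)]
b1 = [0.0] * d_hidden
W2 = [[random.gauss(0, 0.1) for _ in range(d_hidden)] for _ in range(d_in)]
b2 = [0.0] * d_in

x = [random.gauss(0, 1.0) for _ in range(d_in)]
y = mlp_forward(x, W1, b1, W2, b2)
print(len(y))  # 16: same dimension as the input
```

The two projections are just matrix multiplies; the only nonlinearity is the elementwise `max(0, ·)` in between, which is what keeps the whole thing from collapsing into a single linear map.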