“Although the use of multiple GPUs introduces some randomness (Nvidia, 2024), it can be eliminated by setting random seeds, so that AI models are deterministic given the same input. […] In order to support this line of reasoning, we ran Llama3-8b on our local GPUs without any optimizations, yielding deterministic results. This indicates that the models and GPUs themselves are not the only source of non-determinism.”
Secondly, as the quote shows, the paper is explicitly making the point that there is a source of nondeterminism outside the models and GPUs themselves, so ensuring that the floating-point arithmetic is deterministic doesn't address it.
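To make the distinction concrete, here is a minimal sketch of what "eliminating local floating-point nondeterminism" looks like in practice: fixing seeds and forcing deterministic kernels in PyTorch, then running the same greedy decode twice on one GPU. The model id, prompt, and specific flags are illustrative assumptions, not the paper's exact setup. Passing this check only rules out the model and the local GPU as sources; it cannot detect nondeterminism introduced elsewhere in the serving stack, which is exactly the paper's point.

```python
# Minimal sketch (assumes PyTorch + Hugging Face transformers; the model id,
# prompt, and flag choices are illustrative). It pins the *local* sources of
# floating-point nondeterminism, then checks that two identical greedy
# decodes agree. A pass here says nothing about nondeterminism introduced
# outside this process.
import os

# cuBLAS needs this set before CUDA initialization for deterministic reductions.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.manual_seed(0)                           # fix RNG state
torch.use_deterministic_algorithms(True)       # error on nondeterministic kernels
torch.backends.cudnn.deterministic = True
torch.backends.cuda.matmul.allow_tf32 = False  # avoid TF32 reduced-precision paths

model_id = "meta-llama/Meta-Llama-3-8B"        # assumed local checkpoint
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float32
).to("cuda").eval()

inputs = tok("The capital of France is", return_tensors="pt").to("cuda")

def greedy_decode():
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=32, do_sample=False)
    return tok.decode(out[0])

# Same GPU, same weights, same input: expect byte-identical output text.
assert greedy_decode() == greedy_decode()
```

This mirrors the experiment quoted above (a single local run of Llama3-8b without serving optimizations coming out deterministic), which is why determinism at this level doesn't settle the question.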