Well, it surely did not help that the government has been drip-feeding us computational resources. First we had about 16 GPU nodes to share across the whole country for over a decade. Then, just before Isambard-AI came online, there was a single open call, lasting about a week, where you could get nearly enough compute to train a sizeable model, but it was poorly advertised and fell in the middle of peak vacation season. After that, the only big call explicitly excluded training large language models from its scope, and the general calls have been peanuts, less than what the UK-LLM team and I had access to during the Isambard-AI beta phase! When I gave a talk at Whitehall recently my message was clear: we have the team, the knowledge, the data, etc. We just need an open call for enough compute to train the darn thing! Here's hoping they listened.