Britain lacks computing power for AI, CMA warns

Cloud providers in the UK do not have the latest chips, a shortfall that could hamper those trying to develop “foundation model” AI such as ChatGPT and Bard, the Competition and Markets Authority has said.

The regulator also warned there was a danger that big technology companies would consolidate their power in foundation models. It is investigating the foundation model market and has released its initial findings.

The most capable foundation models, such as ChatGPT (OpenAI), Bard (Google) and Claude (Anthropic), have been developed using huge computing and data resources. OpenAI reportedly spent more than $100 million developing GPT-4, the latest model underpinning ChatGPT. However, access to the most sophisticated chips, or GPUs (graphics processing units), made by Nvidia is costly and currently constrained by huge demand.

In a section on barriers to entry, the regulator states that none of the three biggest cloud service providers has the latest Nvidia chips available in Britain. It says this could be a problem for British developers working on foundation models that need to be trained on sensitive or personal data, as there can be restrictions on storing such data overseas.

In March, a review commissioned by the government concluded that Britain had fallen behind Russia, Italy and Finland in the world league table for computing power.

As of November last year, the UK had only a 1.3 per cent share of global compute capacity and did not have a system among the world’s 25 most powerful. Its most powerful machine, ARCHER2, the national supercomputing service, ranks 28th.

The review said there were fewer than 1,000 high-end Nvidia chips available to researchers and recommended that at least 3,000 “top-spec” GPUs be made available as soon as possible.

The government is working to address this by spending £900 million on a supercomputer that will be based in Bristol. It is in talks to buy £100 million worth of Nvidia chips for AI training.

Saudi Arabia has reportedly bought at least 3,000 of Nvidia’s latest AI chips, the H100, which cost $40,000 each. By comparison, the American start-up Inflection AI, which has developed the chatbot Pi, is building the largest artificial intelligence cluster in the world, comprising 22,000 H100s.

The watchdog also warned that big tech could squeeze out smaller companies in the sector because of greater access to data and compute.

“Large technology companies’ access to vast amounts of data and resources may allow them to leverage economies of scale, economies of scope, and feedback effects to gain an insurmountable advantage over smaller organisations, making it hard for them to compete,” the regulator said. It concluded: “Given the likely importance of foundation models across the economy, we would be concerned if access to the key inputs required to develop foundation models were unduly restricted, in particular restrictions on data or computing power.”

Sarah Cardell, chief executive of the competition regulator, said: “The speed at which AI is becoming part of everyday life for people and businesses is dramatic. There is real potential for this technology to turbo-charge productivity and make millions of everyday tasks easier — but we can’t take a positive future for granted.

“There remains a real risk that the use of AI develops in a way that undermines consumer trust or is dominated by a few players who exert market power that prevents the full benefits being felt across the economy.”

Fake reviews may become easier to write because of AI, the watchdog said. “The increased use of foundation model tools may in future make it easier and cheaper for bad actors to create fake reviews. Moreover, it can be difficult to tell the difference between a genuine and a fake review. Foundation models may make that problem worse because they could be used to generate content that may be even more convincing.”