The immense power needs of AI computing were flagged early as a bottleneck, prompting Alphabet’s Google Cloud to plan how to procure power and use it, according to Thomas Kurian, CEO of Google Cloud.
Speaking at the Fortune Brainstorm AI event in San Francisco on Monday, he emphasized that the company, a key player in the AI infrastructure landscape, was working on AI long before the arrival of large language models and had a long-term vision.
“We also knew that the most problematic thing that was going to happen would be energy, because energy and data centers were going to become a bottleneck alongside chips,” Kurian said. “So we designed our machines to be extremely efficient.”
The International Energy Agency estimated that some AI-driven data centers consume as much electricity as 100,000 homes, and that some of the largest facilities under construction could even use 20 times that amount.
At the same time, global data center capacity is expected to increase by 46% over the next two years, a jump of almost 21,000 megawatts, according to the real estate consultancy Knight Frank.
At the Brainstorm event, Kurian outlined Google Cloud’s three-pronged approach to ensuring there will be enough power to meet all of this demand.
First, the company seeks to be as diverse as possible in the types of energy that power AI computing. While many people say any form of energy can be used, that’s actually not true, he said.
“If you’re running a cluster for training and you launch it and you start running a training task, the peak that you get with that compute consumes so much power that you can’t handle that from some forms of power generation,” Kurian explained.
The second part of Google Cloud’s strategy is to be as efficient as possible, particularly in how it reuses energy in data centers, he added.
In fact, the company uses AI in its control systems to monitor the thermodynamic exchanges needed to recover energy already brought into its data centers.
And third, Google Cloud is working on “fundamental new technologies to actually create energy in new forms,” Kurian said without elaborating.
Earlier Monday, utility company NextEra Energy and Google Cloud announced they were expanding their partnership and developing new data center campuses in the United States that would also include new power plants.
Technology leaders have warned that energy supply is critical to the development of AI, alongside innovations in chips and improved language models.
The ability to build data centers is also another potential choke point. Nvidia CEO Jensen Huang recently highlighted China’s advantage on this front compared to the United States.
“If you want to build a data center here in the United States, it will probably take about three years from the start of construction to the implementation of an AI supercomputer,” he told the Center for Strategic and International Studies in late November. “They can build a hospital in a weekend.”