Nasty, brutish and short, more like. Talk to utility and data-center operators, and while many share Mr. Altman’s excitement about artificial intelligence (AI), they are grappling with an energy conundrum that will, in part, determine the future of three massive economic shifts: the AI revolution; efforts to electrify large swathes of the economy; and the fight against climate change. In short, “generative” AI, the kind behind OpenAI’s ChatGPT, has a voracious appetite for electricity, and it has landed almost without warning in a global energy system that is already straining to cope with other new sources of demand. For now, it is not clear whether there will be enough clean energy to meet everyone’s needs.
At first glance, the problem looks manageable. Data centers, such as those used by companies like Alphabet, Amazon and Microsoft to provide cloud-computing services, have accounted for only 1-2% of global electricity demand over the past decade. For years, the big tech “hyperscalers” have wrung ever greater energy efficiency from their server farms, even as the world’s computing load has soared. Moreover, they are investing heavily in clean energy to offset their carbon footprints. In America, electricity suppliers are all too willing to oblige them. Having endured two decades of anemic electricity demand, utilities are desperate for new sources of growth. In recent earnings calls, their executives promised tens of billions of dollars of investment over the next five years to pump more power into data centers. Last month one such company, Talen Energy, sold a nuclear-powered data center to Amazon for $650 million. So far, so promising.
Generative AI, however, changes the game. Graphics processing units (GPUs), the chips on which models such as ChatGPT are trained and run, have been energy-hungry since the days when they powered the cryptocurrency boom. According to Christopher Wellise of Equinix, a data-center rental company, a pre-AI hyperscale server rack consumes 10-15 kilowatts (kW); an AI rack consumes 40-60 kW. Nor is it just computation that draws electricity: keeping GPU racks cool requires just as much power. Moreover, much of AI’s energy demand over the past year has come from training “foundation” models such as GPT-4, OpenAI’s latest offering. Widespread use of them as tools – for research, for making films, for dressing the pope in Balenciaga – could put still more strain on the grid. A query to ChatGPT can consume ten times more energy than a Google search.
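A rough back-of-envelope, using only the rack figures quoted above plus one labeled assumption (that cooling doubles the draw, per the “just as much power” claim), gives a sense of the gap. The midpoints and the cooling multiplier are illustrative, not Equinix’s numbers:

```python
# Back-of-envelope annual energy use per server rack, based on the
# figures quoted above: 10-15 kW for a pre-AI rack, 40-60 kW for an
# AI rack, with cooling assumed (here) to double the total draw.

HOURS_PER_YEAR = 24 * 365

def annual_mwh(rack_kw: float, cooling_multiplier: float = 2.0) -> float:
    """Annual energy (in MWh) for one rack running flat out, cooling included."""
    return rack_kw * cooling_multiplier * HOURS_PER_YEAR / 1_000

pre_ai = annual_mwh(12.5)  # midpoint of the 10-15 kW range
ai = annual_mwh(50.0)      # midpoint of the 40-60 kW range

print(f"Pre-AI rack: ~{pre_ai:,.0f} MWh/year")
print(f"AI rack:     ~{ai:,.0f} MWh/year ({ai / pre_ai:.1f}x the pre-AI figure)")
```

On those assumptions a single AI rack burns through roughly 900 MWh a year, about four times its predecessor, before any growth in the number of racks is counted.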
This is just the beginning of the generative-AI boom, so it is too early to make hard-and-fast predictions. But educated guesses about the associated increase in energy demand are striking. The International Energy Agency, a global forecaster, says that by 2026 data centers could use twice as much energy as they did two years ago – as much as Japan consumes today. It expects data centers to account for a third of America’s new electricity demand over the next two years. Rene Haas, chief executive of Arm, a chip-design company, told the Wall Street Journal this week that by the end of the decade AI data centers could consume as much as a quarter of all American electricity, up from 4% or less today.
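To put Mr. Haas’s figures in perspective, here is a simple illustrative calculation (my own, not Arm’s or the IEA’s), assuming the share rises from 4% today to 25% in roughly six years:

```python
# Implied compound annual growth in data centers' share of American
# electricity, using only the figures quoted above: ~4% today rising
# to ~25% by the end of the decade (assumed here to be ~6 years away).

share_now, share_2030, years = 0.04, 0.25, 6
cagr = (share_2030 / share_now) ** (1 / years) - 1
print(f"Implied growth in share: ~{cagr:.0%} per year")  # ~36% per year
```

Since total electricity demand is itself expected to grow, the implied growth in data centers’ absolute consumption would be higher still.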
In America, two things add to the challenge. The first is timing. The rise of generative AI coincides with a buoyant economy whose energy consumption is growing in step. Many big energy consumers want their power to be zero-carbon, creating competition for limited clean-energy resources. So do buyers of electric vehicles (EVs), whose growth may have slowed but has not stopped. The second is the difficulty of expanding the grid. Despite White House support, it is not easy for utilities to build new renewable capacity quickly. They suffer from supply-chain problems: by some accounts a transformer now takes three years to deliver, compared with less than a year previously. Higher interest rates have raised the cost of wind and solar projects, making them harder to finance. And building new transmission lines is harder still.
There will, no doubt, be plenty of imaginative thinking. The obvious fix is to make GPUs more energy-efficient. Nvidia, their largest supplier, says it has already achieved this with its latest generation of AI servers. More efficient chips, however, may simply encourage greater use. Another option, says Aaron Denman of Bain, a consultancy, is for hyperscalers to use their deep pockets to help utilities overcome some of the grid’s constraints. The real crunch, he says, will come at certain times of the year, such as unusually hot summer days when Americans switch on their air-conditioning. That means keeping small power plants on standby. These, however, are likely to be gas-fired, which would undermine the cloud providers’ climate commitments.
The nuclear option
If renewable energy is in short supply, it will come at a cost. No one yet knows how generative AI will make money. What is known is that the cost of buying GPUs is soaring. If the cost of powering them soars too, it could hamper the industry’s expansion. Moreover, electrifying the rest of the economy is highly cost-sensitive: a battle between AI and EVs for clean energy would drive up prices, to the benefit of neither industry. By all means keep your fingers crossed that Mr. Altman’s rosy dream of nuclear fusion becomes a reality. Just don’t count on it.
© 2024, The Economist Newspaper Limited. All rights reserved.
From The Economist, published under license. Original content can be found at www.economist.com
Posted: Jun 13, 2024 18:16 EST