An executive at Arm says generative AI's huge demand for electricity could hamper its rapid development.
Ami Badani, chief marketing officer of the British technology company, told the Fortune Brainstorm AI conference in London that data centers supporting AI chatbots already account for 2% of global electricity consumption.
“We won’t be able to continue developing artificial intelligence without addressing the issue of power,” Badani said. “ChatGPT requires 15 times more energy than a conventional web search.”
This growth will not only slow the development of artificial intelligence; it will also hamper technology and utility companies trying to meet their net-zero goals. Former Energy Secretary Ernest Moniz said as much last month at S&P Global’s annual global energy conference: utilities will have to rely more on fossil fuels as they try to meet demand.
“We’re not going to build 100 gigawatts of new renewables in a few years. You kind of get stuck,” Moniz told The Wall Street Journal last month.
Utilities are already seeing a significant increase in demand. Nine of the 10 largest U.S. utilities said in their first-quarter earnings reports this year that data centers are their main source of customer growth, Reuters reports. According to Reuters, McKinsey estimates that data centers’ power demand will more than double in the coming years, from 21 gigawatts in 2023 to more than 50 gigawatts in 2030.
“If you think about artificial intelligence, there is a cost,” Badani said at the Fortune conference, “and that cost, unfortunately, is power.”
By the numbers
100 million: Number of active ChatGPT users.
9,000-11,000: Number of cloud data centers around the world.
2026: The year by which electricity demand from data centers, artificial intelligence, and cryptocurrencies could double.
46 terawatt-hours: Estimated global energy consumption of all data centers for calendar year 2024, triple the 2023 figure.