Intel unveiled its latest AI hardware on Tuesday, claiming better performance and efficiency in training generative AI models than industry alternatives, including Nvidia's highly sought-after chips.
The semiconductor pioneer presented its Gaudi 3 AI accelerator at its Intel Vision 2024 conference in Phoenix. Compared with its predecessor, the Gaudi 2 AI accelerator, Intel said Gaudi 3 offers more AI processing power, network bandwidth and memory bandwidth to scale up training and inference for the large language models (LLMs) that power AI chatbots such as ChatGPT. Intel says the accelerator will also improve training and inference for multimodal AI models, which can process and understand multiple types of information, including images and audio.
The Gaudi 3 AI accelerator is built to speed up parallel AI operations and can handle multiple data types. Intel projects it will train leading LLMs 50% faster than Nvidia's popular $40,000 H100 chip and deliver 30% faster inference on leading LLMs than Nvidia's H200 chip. Inference is the process by which a trained AI model makes predictions based on the data it was trained on. The company said it is waiting for Nvidia to release performance results for its newly announced Blackwell chip before it can compare that processor with Gaudi 3.
“Innovation is advancing at an unprecedented pace, made possible by silicon, and every company is quickly becoming an artificial intelligence company,” Intel CEO Pat Gelsinger said in a statement. “Intel is bringing AI to the entire enterprise, from PCs to data centers to the edge.”
Intel says that with Gaudi 3's increased memory capacity, fewer accelerators are needed to process data sets for larger AI models, improving the cost efficiency of data centers.
Intel last month received $8.5 billion in direct government funding under the CHIPS and Science Act as part of a five-year, $100 billion plan to expand U.S. chip production. Gelsinger said the company wants to build "the world's largest AI chip facility" on a site near Columbus, Ohio, one of four states where the company is investing in expanded manufacturing. The company also qualifies for up to $11 billion in federal loans.
During his Tuesday speech at Intel Vision, Gelsinger said the company aims to become the world’s No. 2 AI systems maker by the end of the decade.
Intel says its Gaudi 3 accelerator fills a gap in the AI hardware market, where customers are asking for more choice, especially amid shortages of Nvidia chips.
“If we look at Gaudi 3 customers today, we see growing momentum in the industry from our partners, our solutions and our customers who are now saying, ‘Man, this is great,’” Gelsinger said.
Bosch, LandingAI and Seekr are some of Intel’s Gaudi accelerator customers, the company said.
Gaudi 3 will be available to hardware makers including Dell, HPE, Lenovo and Supermicro in the second quarter of this year, and is expected to be more widely available later this year, Intel says.
Gelsinger said Gaudi 3 and Intel’s other recent products and services “provide a consistent set of versatile solutions to meet the evolving needs of our customers and take advantage of the enormous opportunities that lie ahead.”
Gelsinger later said on a call with reporters that Intel has an "innovation roadmap." It includes the rollout of the Gaudi 3 accelerator to customers and business partners this quarter, as well as the next-generation Falcon Shores chip, which will compete with Nvidia products.
“Without a doubt, the Gaudi 3 significantly outperforms the H100 today,” Gelsinger said, adding that TCO, or total cost of ownership, “is a huge difference” between the two chipmakers.
While Gelsinger declined to reveal pricing details for Gaudi 3, he said Intel is confident it "will be well below the price thresholds given for the H100 and Blackwell."
Christoph Schell, Intel’s chief commercial officer, told reporters that Intel’s competition with Nvidia is not just about performance, but also about cost, availability and access to data.
“We are checking the box for each of them,” Schell said, “well ahead of Nvidia.”