Intel’s Gaudi 3 Accelerators Offer Significant Savings Over Nvidia GPUs in AI Competition

Intel Throws Down the Gauntlet Against Nvidia in the Heated Battle for AI Hardware Supremacy

At Computex this week, Intel made a bold move in the competitive AI hardware landscape. CEO Pat Gelsinger announced pricing for the current Gaudi 2 and the upcoming Gaudi 3 AI accelerator chips, and the figures are set to disrupt the market.

Intel's Gaudi 3 and Nvidia

Typically, pricing for high-performance AI accelerators is kept under wraps, but Intel has bucked that trend by publishing official numbers. The flagship Gaudi 3 accelerator is priced at approximately $15,000 per unit when purchased individually, which Intel says is roughly half the price of Nvidia’s H100 data center GPU.

The Gaudi 2, although less powerful, also presents a significant cost advantage over Nvidia’s offerings. An 8-chip Gaudi 2 accelerator kit will be available for $65,000 to system vendors, which Intel claims is only a third of the price of comparable setups from Nvidia and other competitors.

For the Gaudi 3, an 8-accelerator kit will cost $125,000. Intel asserts that this is roughly two-thirds the cost of alternative solutions in the same performance tier. For context, Nvidia’s newly announced Blackwell B100 GPU is expected to sell for around $30,000 per unit, while the GB200, a high-performance combination of a Grace CPU and Blackwell GPUs, is estimated at approximately $70,000.
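To make the kit numbers concrete, here is the per-accelerator arithmetic behind the prices quoted above. The Gaudi figures are Intel's announced prices; the H100 figure is the commonly reported street price of roughly $30,000, an assumption, since Nvidia does not publish list prices.

```python
# Intel's announced prices (USD)
GAUDI3_UNIT = 15_000   # single Gaudi 3 accelerator
GAUDI3_KIT = 125_000   # 8-accelerator Gaudi 3 kit
GAUDI2_KIT = 65_000    # 8-chip Gaudi 2 kit

# Assumption: widely reported H100 street price, not an official figure
H100_EST = 30_000

# Per-chip cost when buying the kits
gaudi3_per_chip_in_kit = GAUDI3_KIT / 8   # 15,625
gaudi2_per_chip_in_kit = GAUDI2_KIT / 8   # 8,125

# Single-unit Gaudi 3 vs. the assumed H100 price: 15,000 / 30,000 = 0.5,
# which lines up with Intel's "roughly half the price" claim.
gaudi3_vs_h100 = GAUDI3_UNIT / H100_EST
```

Note that the kit works out to slightly more per chip than the $15,000 single-unit price, so the kit pricing presumably reflects the bundled networking and baseboard hardware rather than a volume discount on the accelerators alone.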

Pricing, however, is just one aspect of the competition. Performance and the software ecosystem are equally critical. Intel maintains that the Gaudi 3 can match or exceed Nvidia’s H100 in various AI training and inference workloads.

Benchmarks published by Intel indicate that the Gaudi 3 delivers up to 40 percent faster training times than the H100 in large 8,192-chip clusters. Even a smaller 64-chip Gaudi 3 setup is claimed to offer 15 percent higher throughput than the H100 on the popular Llama 2 language model. For AI inference, Intel claims a twofold speed advantage over the H100 on models such as Llama and Mistral.

Hardware and pricing are not the whole story, however. The Gaudi chips rely on open standards such as Ethernet for networking, which simplifies deployment, but they cannot run software written for Nvidia’s ubiquitous CUDA platform, which dominates AI development today. Convincing enterprises to refactor their code for Gaudi could prove challenging.
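Intel's counterargument is that, for PyTorch workloads, much of the port is a device-string change rather than a full CUDA rewrite: Gaudi's PyTorch bridge exposes an "hpu" device once the habana_frameworks package is loaded. A minimal sketch of that idea is below; the package probing is illustrative only, and real Gaudi code would import habana_frameworks.torch.core and move models and tensors to the "hpu" device.

```python
import importlib.util

def pick_device() -> str:
    """Pick a PyTorch device string based on which stack is installed.

    Simplified for illustration: it only checks whether the relevant
    packages are present, not whether an accelerator is actually usable.
    """
    if importlib.util.find_spec("habana_frameworks") is not None:
        return "hpu"   # Intel Gaudi via its PyTorch bridge
    if importlib.util.find_spec("torch") is not None:
        return "cuda"  # assumes a CUDA-enabled PyTorch build
    return "cpu"       # fallback when no accelerator stack is found

print(pick_device())
```

The catch, as the paragraph above notes, is that this only covers code written against framework-level APIs; anything that calls CUDA kernels or CUDA-specific libraries directly still has to be rewritten.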

To drive adoption, Intel has partnered with at least 10 major server vendors, including new Gaudi 3 partners like Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron. Well-known names such as Dell, HPE, Lenovo, and Supermicro are also on board.

Nvidia remains a formidable competitor in the data center space. In the final quarter of 2023, Nvidia captured a 73 percent share of the data center processor market, and that number continues to rise, posing a significant challenge to both Intel and AMD. The consumer GPU market mirrors this dominance, with Nvidia holding an 88 percent share.

Intel faces an uphill battle, but these significant price differences could help it gain ground. Additionally, Intel is working on enhancing its software ecosystem to make it more attractive to developers and enterprises. The company is investing heavily in AI software tools and frameworks to support Gaudi accelerators, aiming to reduce the friction associated with transitioning from Nvidia’s CUDA.

Moreover, Intel’s commitment to open standards and its collaborative efforts with the AI community might pay off in the long run. By fostering an inclusive and flexible development environment, Intel hopes to build a robust and versatile AI ecosystem that can rival Nvidia’s well-established CUDA platform.

In conclusion, while Nvidia currently leads the AI hardware market, Intel’s aggressive pricing strategy and strategic partnerships could potentially shift the competitive landscape. The AI race is intensifying, and only time will tell how these developments will influence the market dynamics.
