AMD Takes Aim at Nvidia with New AI Chip, Instinct MI325X, in a Bid to Capture Growing AI Market

On Thursday, AMD unveiled its latest AI chip, the Instinct MI325X, with the ambitious goal of challenging Nvidia’s dominance in the data center GPU market.

As the AI revolution accelerates, demand for high-performance chips is growing exponentially, with generative AI models like OpenAI’s ChatGPT requiring massive amounts of processing power from data centers filled with GPUs. AMD is positioning its new chip as a serious competitor to Nvidia’s, hoping to grab a larger slice of an industry projected to be worth $500 billion by 2028.

AMD announced that the Instinct MI325X will enter production before the end of 2024, marking a significant step in its effort to take on Nvidia, which currently holds more than 90% of the data center AI GPU market after capitalizing on the surge in demand for AI processing power. Nvidia has enjoyed gross margins of around 75%, fueled by the critical role its GPUs play in training and deploying AI models.

During the launch event, AMD’s CEO Lisa Su noted the growing global demand for AI processing power.

“AI demand has actually continued to take off and actually exceed expectations. It’s clear that the rate of investment is continuing to grow everywhere,” she said.

This growing demand is not just a trend, Su explained, but a permanent shift that AMD hopes to capitalize on with its Instinct MI325X chip.

Su further underscored AMD’s ambition to capture a larger share of this lucrative market, saying: “We’re accelerating our product schedule to release new chips on an annual basis, ensuring that we stay competitive and meet the growing needs of AI developers and cloud companies.”

AMD also revealed its product roadmap, confirming the release of future chips like the MI350 in 2025 and the MI400 in 2026.

Targeting Nvidia’s Dominance Through Performance and Pricing

The Instinct MI325X is a direct successor to the MI300X, which began shipping last year. AMD hopes the MI325X will position it as a viable alternative to Nvidia’s highly coveted GPUs. With Nvidia’s Blackwell chips expected to start shipping early next year, the competition for AI dominance is heating up.

Su highlighted the performance advantages of AMD’s new chip during the event, explaining that it is particularly well-suited for inference tasks, in which already-trained AI models generate outputs from new inputs. She pointed out that the MI325X’s advanced memory architecture enables it to serve certain AI models faster than Nvidia’s offerings.

For example, Su said, “What you see is that the MI325 platform delivers up to 40% more inference performance than the H200 on Llama 3.1,” referring to Meta’s Llama AI model.

However, AMD did not announce any new major cloud or internet customers for the Instinct MI325X during the event. The company has previously disclosed that Meta and Microsoft already buy its AI GPUs and that OpenAI uses them for some applications.

The company also did not reveal pricing for the MI325X, but the launch could have an indirect impact on Nvidia’s pricing strategy. If developers and cloud giants begin to see AMD’s products as a close substitute for Nvidia’s, this could put pricing pressure on Nvidia, which has enjoyed premium margins due to its dominant position.

Overcoming the CUDA Challenge

One of AMD’s biggest obstacles in the AI chip market is Nvidia’s CUDA programming platform, which has become the industry standard for AI developers and effectively locks them into Nvidia’s ecosystem, making it difficult for competitors like AMD to capture market share. In response, AMD has been improving its own software stack, ROCm, so that AI developers can move their models to AMD’s GPUs more easily.

Lisa Su addressed this challenge head-on during the event: “We recognize that developers have been deeply invested in the CUDA ecosystem, but with ROCm, we’re making it easier for them to run their models on AMD chips without sacrificing performance,” she said.
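To illustrate what that switch can look like in practice, here is a minimal sketch, assuming a ROCm build of PyTorch: on such builds the familiar torch.cuda API is backed by AMD GPUs through HIP, so scripts written for Nvidia hardware typically run without source changes. The toy model and tensor shapes below are illustrative only, not AMD’s benchmark code.

```python
# Minimal sketch: a ROCm build of PyTorch exposes AMD GPUs through the
# familiar torch.cuda API (via HIP), so CUDA-targeted scripts often run as-is.
import torch

# On a ROCm build with an AMD Instinct GPU present, this reports True;
# otherwise the script falls back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy stand-in for a real model; actual workloads would load e.g. Llama weights.
model = torch.nn.Linear(4096, 4096).to(device)

with torch.no_grad():
    x = torch.randn(8, 4096, device=device)  # a batch of input activations
    y = model(x)                              # the inference step runs on the selected device

print(f"ran on: {device}, output shape: {tuple(y.shape)}")
```

One practical note: running such a script on an MI325X requires a PyTorch build compiled against ROCm rather than CUDA; the source code itself stays the same.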

Su explained that AMD is positioning its GPUs, which it calls accelerators, as particularly competitive for tasks where AI models create content or make predictions, rather than processing vast amounts of training data.

“Our advanced memory architecture allows for faster performance in certain use cases, and we’re confident that developers will see the benefits,” she added.

Beyond Nvidia, Taking on Intel in CPUs

While AI accelerators and GPUs have become the most intensely watched part of the semiconductor industry, AMD’s core business remains in central processing units (CPUs), which are the brains behind nearly every server in the world. During the event, AMD also announced a new line of CPUs, the EPYC 5th Gen, which range from 8-core chips for low-cost, low-power applications to 192-core processors intended for supercomputers.

Su explained the critical role CPUs play in AI data centers, emphasizing how they work with GPUs to power complex AI workloads.

“Today’s AI is really about CPU capability, and you see that in data analytics and a lot of those types of applications,” she said.

AMD’s data center sales in the June quarter more than doubled year-over-year to $2.8 billion, with AI chips accounting for $1 billion of that total. However, AMD still has some catching up to do. It currently captures 34% of total spending on data center CPUs, while Intel remains the market leader with its Xeon line of processors. AMD hopes that its new EPYC 5th Gen chips will help it further chip away at Intel’s lead.

Nvidia Unbothered

Despite AMD’s aggressive push into the AI market, Nvidia remains the undisputed leader. Nvidia’s stock has skyrocketed by more than 175% in 2024, while AMD’s stock has risen by only 20%. While the launch of the Instinct MI325X could attract more interest from investors looking for companies poised to benefit from the AI boom, Nvidia’s entrenched position, especially with its CUDA ecosystem, remains a significant barrier for AMD.

Some analysts believe that a successful launch of the Instinct MI325X could help shift investor sentiment, particularly if AMD manages to attract more AI developers and cloud providers to its platform. They note that by offering a competitive alternative to Nvidia’s GPUs, AMD could also pressure Nvidia to lower its prices, which could have broader implications for the AI chip market.
