
Nvidia Era Begins As It Touches $1 trillion Market Cap


In 2020, I wrote: “Intel faces a tough future as Nvidia becomes the absolute category-king in the modern microprocessor business… With Nvidia’s AI technology being armed by these CPUs, you have a new basis of competition in the market, with industry-shaping implications for players like Intel.”

This is my industry. Yes, as far back as 2020, I anticipated what has just happened, based on the promise of AI in the domain of microelectronics. Today, Nvidia topped $1 trillion in market cap for the first time in its history because of those chips: “Nvidia’s market capitalization surged past $1 trillion for the first time in the chipmaker’s history Tuesday, as the Silicon Valley behemoth rides Wall Street’s artificial intelligence obsession to new heights.”

Nvidia is now the sixth public company in the world valued at over $1 trillion, joining Apple, Saudi Aramco, Microsoft, Alphabet and Amazon; the only other companies to ever cross the threshold are Tesla and Meta, which are each valued at less than $700 billion today, and Chinese oil giant PetroChina.

Joining the $1 trillion club follows a dramatic rise thus far in 2023, with Nvidia’s stock up more than 175% as Wall Street analysts say the Palo Alto-based company could be the firm best-positioned to profit off of the AI boom.

We will be using Nvidia as a case study in our program, drawing contrasts with Intel, just as we did with Microsoft and IBM. Nvidia is just under $1 trillion now while Intel hovers below $125 billion in market cap. Microsoft is worth $2.47 trillion while IBM (including the split) is below $140 billion, yet less than 15 years ago they were largely in the same valuation bucket.


What happened? Business model. There is much we can learn from the Nvidia case: good lessons on strategy, products, and more.

This piece from the Fortune newsletter explains why Nvidia is great:

Computex, the massive personal computing trade show, is taking place in Taipei this week, its first time as an in-person event since before the pandemic. And when it comes to A.I., the show provided further evidence of just how far ahead of the game Nvidia is as the leading producer of the computer chips that are powering the current A.I. revolution.

Jensen Huang, the company’s Taiwan-born CEO, bantered with the crowd in Taiwanese during parts of his keynote address to the conference and was clearly reveling in his status as a homegrown hero as his company closed in on a $1 trillion market valuation—a milestone it hit today, a day after Huang’s keynote, becoming the first chipmaker ever to reach that lofty height. Thanks to investor enthusiasm for the generative A.I. boom that is being built atop Nvidia’s graphics processing units, the company’s stock is up more than 180% year to date.

At the show, Huang announced that Nvidia’s Grace Hopper GH200 “superchips”—as the company terms them—are now in full production. These chips combine Nvidia’s highest-performing Hopper H100 graphics processing units, now the top-of-the-line chip for generative A.I. workloads, with its Grace CPU, or central processing unit, that can handle a more diverse set of computing tasks.

Huang revealed that the company had linked 256 of these GH200 chips together using the company’s own NVLink networking technology to create a supercomputer that can power applications requiring up to 144 terabytes of memory. The new supercomputer is designed for training ultra-large language models, complex recommendation algorithms, and graph neural networks that are used for some fraud detection and data analytics applications. Nvidia said the first customers for this supercomputer will be Microsoft, Google, and Meta.

Of course, Nvidia has recently branched out from hardware and begun offering its own fully-trained A.I. foundation models. At Computex, the company tried to demonstrate some of these A.I. capabilities in a demo geared towards Computex’s PC and video gaming crowd. It showed a video depicting how its A.I.-powered graphics rendering chips and large language models can be coupled to create non-player characters for a computer game that are more realistic and less scripted than those that currently exist. The visuals in the scene, which was set in a ramen shop in a kind of Tokyo underworld, were arrestingly cinematic. But the LLM-generated dialogue, as many commentators noted, seemed no less stilted than the canned dialogue that humans script for non-player characters in existing games. Clearly, Nvidia’s Nemo LLM may need some further fine-tuning.

As any student of power politics or game theory knows, hegemons tend to beget alliances aimed at countering their overwhelming strength. In the past month, news accounts reported that Microsoft was collaborating with AMD, Nvidia’s primary rival in the graphics rendering sphere, on a possible A.I.-specific chip that could make Microsoft less reliant on purchasing Nvidia’s GPUs. (Microsoft later said aspects of the report were wrong, but that it has long had efforts to see if it could develop its own computer chips.) George Hotz, a Silicon Valley hacker and merry prankster best known for jailbreaking iPhones and Playstations and who went on to build a self-driving car in his own garage, also announced he was starting a software company called Tiny Corp. dedicated to creating software that will make AMD’s GPUs competitive with Nvidia’s. If that effort is successful, Hotz declared he would turn to building his own silicon. “If we even have a 3% chance of dethroning NVIDIA and eating in to their 80% margins, we will be very very rich,” Hotz wrote on his blog. “If we succeed at this project, we will be on the cutting edge of non-NVIDIA AI compute.”

In his blog, Hotz notes that most A.I. chip startups that hoped to dethrone Nvidia have failed. Some, such as Cerebras and Graphcore, are still trying, but both have struggled to gain as much traction as they had hoped. Microsoft tried using Graphcore in its data centers but then pivoted away from the U.K.-based startup’s chips. And Hotz is right about identifying one of Nvidia’s biggest advantages: it’s not Nvidia’s hardware, it’s its software. Cuda, the middleware layer that is used to implement A.I. applications on Nvidia’s chips, is not only effective, it is hugely popular and well-supported. An estimated 3 million developers use Cuda. That makes Nvidia’s chips, despite their expense, extremely sticky. Where many rivals have gone wrong is in trying to attack Nvidia on silicon alone, without investing in building a software architecture and developer ecosystem that could rival Cuda. Hotz is going after Cuda.

But there is more to Nvidia’s market dominance than just powerful silicon and Cuda. There’s also the way it can link GPUs together inside data centers. One of Huang’s greatest acquisitions was Israeli networking company Mellanox, which Nvidia bought for $6.9 billion in 2019. Mellanox has given Nvidia a serious leg up on competitors like AMD. Michael Kagan, Nvidia’s chief technology officer, who had also been CTO at Mellanox before the acquisition, recently told me that one of the ways Nvidia had wrung more efficiency out of its data center GPUs was to move some of the computing into the network equipment itself. (He likened it to a pizza shop that, in order to get more efficient, equipped its delivery drivers with portable ovens so the pizza would finish cooking as the delivery driver drove it to a customer’s house.) And Nvidia isn’t sitting still when it comes to networking either. At Computex, the company announced a new ethernet network called Spectrum X that it says can deliver 1.7 times better performance and energy efficiency for generative A.I. workloads.

Improvements like this will make Nvidia very hard to catch. Of course, Nvidia isn’t perfect. And we’ll have more on one area of the generative A.I. race where it may have stumbled in the Brainfood section below.


---


1 THOUGHT ON Nvidia Era Begins As It Touches $1 trillion Market Cap

  1. We have entered Nvidia’s era. Its case is quite unique: not really something you touch or use as a final product, yet it powers the very things almost everyone is currently talking about.

    A conflation of vision and engineering.
