What's all the fuss about Nvidia and the AI bubble?

Assif Shameen • 9 min read
Nvidia's market value has hit US$1 trillion / Photo: Bloomberg

Nvidia, AI, ChatGPT, a new tech paradigm, the next iPhone moment, the greatest thing since the Internet, a market bubble, and the inevitable dotcom-like crash are all part of the chatter you hear these days. This past week, the chatter reached a crescendo when chipmaker Nvidia, the sixth-largest listed company on earth, announced its quarterly earnings. It reported blowout adjusted earnings per share of US$1.09 ($1.48) on revenues of US$7.19 billion, handily beating consensus estimates of 92 US cents on forecast revenues of US$6.52 billion.

Stellar earnings aside, Nvidia shocked the market by telegraphing huge growth in the current quarter on the back of insatiable demand for its pioneering AI chips. The Santa Clara, California-based chipmaker said its revenues would exceed US$11 billion in the June-August fiscal quarter, way above the US$7.5 billion to US$8 billion that even the most bullish analysts were forecasting. No company of its size has ever beaten analyst estimates and then gone on to guide so far above consensus forecasts as Nvidia did last week. The market had underestimated the chipmaker's ability to leverage its leadership in AI chips. Nvidia shares skyrocketed 24% the following day after being up as much as 30% at one point.

On May 30, Nvidia's market valuation briefly topped US$1 trillion before dipping slightly. It is now valued at eight times its arch-rival Intel, which has a market value of just US$125 billion. Only five companies currently have a market capitalisation of over US$1 trillion: iPhone maker Apple, software behemoth Microsoft, search giant Google's parent Alphabet, e-commerce supremo Amazon and Middle Eastern oil powerhouse Saudi Aramco. Facebook's owner Meta Platforms was briefly a member of the trillion-dollar club in 2021, as was electric vehicle pioneer Tesla.

Understanding Nvidia

Nvidia's graphics processing unit, or GPU, chips are used in PCs, cloud servers in data centres, game consoles, cars and robots, as well as in bitcoin mining despite the recent "crypto winter". The original first-generation Microsoft Xbox used Nvidia's gaming chip, as did Sony's PlayStation 3. These days, the only game console that uses Nvidia chips is Nintendo's Switch. Nvidia's traditional gaming chip business continues to grow robustly thanks to the rise of e-sports, cloud gaming and more graphically intense video games.

Nvidia has been one of the world's best-performing tech stocks and a darling of retail investors over the past decade. The stock rose 45% in May alone. It is up 188% this year, 524% over the last five years, a whopping 11,000% over the last 10 years and 602% since I touted the high-flying stock in my February 2018 column, "Can Nvidia remain a chip powerhouse?".


Why are chips, specifically those made by Nvidia, taking all the oxygen in the room? Nvidia has moved on from making cheap game console chips to sophisticated high-end chips used in data centres for cloud computing and artificial intelligence, or AI. Investors' newfound fascination with AI, particularly generative AI models such as ChatGPT from OpenAI, in which Microsoft has made a significant investment, has catapulted Nvidia into the ranks of the world's largest companies.

To see why, you need to understand the difference between the microprocessor, or CPU, in your desktop PC or laptop and the GPUs Nvidia makes. If you have been using PCs for years, you have probably heard of Intel, which was the world's dominant chipmaker until Nvidia overtook it in market value three years ago. Intel owned the computing market for over 40 years. It was one half of the Wintel combo with Microsoft, whose Windows operating system ran on Intel chips. Intel made microprocessors, or CPUs, the central processing units that powered PCs and servers. The software we use daily, like word processors, spreadsheets and Internet browsers, runs fine on an Intel CPU. However, accelerated computing and artificial intelligence demand far more computing power.

Little wonder, then, that GPUs are ascendant. GPUs are built for parallel processing: instead of doing one thing at a time, a GPU can work on many different things simultaneously. That saves time and boosts throughput, and it means chips like the GPU, which Nvidia originally made for game consoles and for playing interactive games on PCs, can also be used for sophisticated AI applications, as the sketch below illustrates.
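To make the distinction concrete, here is a loose, illustrative sketch in Python. It uses NumPy's whole-array operations as a stand-in for GPU-style data parallelism; it is not Nvidia's actual software stack (real GPU workloads go through CUDA or similar libraries), but it shows why expressing work over many elements at once beats doing one thing at a time.

```python
# Illustrative only: NumPy's vectorised arithmetic stands in for the
# data-parallel style of work a GPU does. Real GPU workloads would use
# CUDA or a framework such as PyTorch; the point here is simply the
# sequential-versus-parallel idea.
import time
import numpy as np

data = np.random.rand(5_000_000)

# CPU-style, one element at a time: a plain Python loop
start = time.perf_counter()
slow = [x * 2.0 + 1.0 for x in data]
loop_time = time.perf_counter() - start

# Parallel-style: the same arithmetic expressed as one operation over the
# whole array, so the work can be spread across many execution units at once
start = time.perf_counter()
fast = data * 2.0 + 1.0
vector_time = time.perf_counter() - start

print(f"element-by-element: {loop_time:.2f}s, whole-array: {vector_time:.4f}s")
```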


As for Intel, it famously missed the mobile computing era as UK-based Arm Holdings, whose chip designs power most smartphones, became the dominant player in mobile computing. Now Intel has missed the GPU wave, and it looks like it will also miss the huge new AI market, which, for now, Nvidia seems to have all to itself. Nvidia abandoned its attempt to take over Arm late last year after it became apparent that regulators in the UK and the US were unwilling to let the merger go ahead. Arm, a subsidiary of Japanese tech investment firm SoftBank, is seeking a New York Stock Exchange listing as early as July.

Companies these days store all or most of their data on cloud servers operated by Amazon Web Services, Microsoft Azure or Google Cloud.

Increasingly, cloud infrastructure is moving from CPUs to GPUs. The advent of ChatGPT and generative AI is forcing a bigger shift to GPUs from the old CPU workhorses. At the moment, only 4% of cloud infrastructure runs on GPU-based servers; CPUs power the rest. Nvidia's GPU data centre accelerators can potentially serve as the backbone for generative AI infrastructure, which means the remaining 96% of cloud infrastructure could migrate to GPUs over time. "Generative AI is driving exponential growth in compute requirements and a fast transition to Nvidia's accelerated computing," Nvidia's CFO Colette Kress told analysts during last week's earnings call.

So, the opportunity for Nvidia is huge. How huge? ChatGPT was reportedly trained on some 10,000 Nvidia GPUs, mostly A100 chips. Because Nvidia is the dominant global supplier of GPUs, each of its A100 chips can command US$10,000. If Microsoft needs 25,000 additional Nvidia GPUs this year, as has been reported, it would need to fork out US$250 million for them. Nvidia is now rolling out its next-generation H100 chips for generative AI, which could sell for up to US$40,000 apiece; that works out to US$1 billion for 25,000 H100 chips. Nvidia controls over 80% of the server GPU market, while AMD controls the rest.
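The back-of-the-envelope arithmetic behind those figures is simple enough to lay out; the sketch below just multiplies the quoted chip prices by the reported quantities, which are estimates rather than confirmed pricing.

```python
# Back-of-the-envelope sums using the figures quoted above; chip prices and
# quantities are the estimates cited in the article, not confirmed pricing.
A100_PRICE = 10_000    # US$ per A100 chip
H100_PRICE = 40_000    # US$ per H100 chip, upper-end estimate
EXTRA_GPUS = 25_000    # additional GPUs Microsoft reportedly needs this year

a100_outlay = EXTRA_GPUS * A100_PRICE   # 250,000,000
h100_outlay = EXTRA_GPUS * H100_PRICE   # 1,000,000,000

print(f"A100 outlay: US${a100_outlay / 1e6:,.0f} million")   # US$250 million
print(f"H100 outlay: US${h100_outlay / 1e9:,.1f} billion")   # US$1.0 billion
```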

The battle for resources

Because of the US ban on advanced technology exports to China, Nvidia can only supply cheaper, lower-powered A800 GPUs to China's search giant Baidu, which has been working on its own ChatGPT equivalent called ERNIE Bot. As Nvidia migrates to even higher-end GPUs, the gap between what it supplies to Microsoft for ChatGPT and what Baidu gets for its ERNIE Bot will widen. Eventually, China will make its own AI chips, but by then, America's lead in AI chips will have grown much wider.

Throughout much of modern history, wars have been fought over resources. Today, the biggest resource is not oil, water, minerals, metals or even land; it is the semiconductor chip. The battle between the US and China is not, at its heart, about who should dominate the world; the real fight is over a single resource, the chip. Chips are ubiquitous, inside everything from your PC, tablet, smartphone, refrigerator, washing machine, microwave oven and car to your air conditioner. My gourmet coffee machine recently broke down because of a malfunctioning chip. A coffee machine is not some complex piece of electronic circuitry powering a nuclear plant; it just makes coffee, yet it still relies on chips and sensors. These days, you just can't live without chips. Everyone wants them, and whoever has the most advanced chips will dominate the world. That is what the battle is all about. Artificial intelligence is the new frontier in the global chip war, and Nvidia, for now, is by far the dominant maker of AI chips, with a 90%-plus market share.


The new growth cycle

The race to convert cloud servers running on CPUs into sophisticated AI-enabled servers running on GPUs is part of America's post-Covid-19 revival. In the aftermath of the pandemic, supply chain disruptions helped trigger inflation, forcing the US Federal Reserve to raise interest rates from near zero to over 5%. Central banks around the world have raised interest rates as well. That has dramatically slowed the global economy, although inflation remains sticky. In the aftermath of previous slumps, technology and innovation helped spark a new growth cycle.

Twenty years ago, it was the Internet. In the aftermath of the global financial crisis, smartphones ushered in the connected era and the gig economy. This time, all bets are on AI as the growth driver that helps the US and global economies out of their current rut. AI promises to be a huge boon not just for Nvidia and firms like Microsoft and Google, which are at the forefront of generative AI services such as ChatGPT, but for the broader economy.

Yet so far, the focus has been on a handful of "winners", producing a narrow stock market rally built around seven or eight large-cap stocks led by Nvidia. There is a huge shortage of AI chips right now: companies are begging Nvidia's founder and CEO Jensen Huang for supply, and some of the firm's high-end chips are selling for three times the recommended price in the grey market. Still, competition is coming. Google makes its own Tensor Processing Unit, or TPU, chips. They may not be as good as comparable Nvidia chips but will improve over time. Microsoft is helping Nvidia's rival Advanced Micro Devices, or AMD, expand its AI chip efforts. Apple makes its own M2 chips, which have a built-in GPU, and is reportedly building a more powerful chip to be rolled out next year. Amazon has ambitious chip plans of its own. All that may dampen enthusiasm for Nvidia's stock, but it is unlikely to derail the AI boom. The more players join the AI party, the better.

The AI boom is only just getting started. Though chipmakers like Nvidia, Broadcom, Marvell Technology and Micron Technology are seen as the initial picks-and-shovels winners, alongside Microsoft, Google's owner Alphabet and Facebook's parent Meta Platforms, software and services companies with compelling AI offerings will emerge over the next few years.

Among the less obvious winners is the world's largest chip foundry, Taiwan Semiconductor Manufacturing Co, or TSMC, which manufactures customised chips designed by fabless design houses like Nvidia that have no plants of their own and rely on outsourced manufacturing. Another is TSMC's main equipment supplier, Dutch chip gear maker ASML. Nvidia needs TSMC to make its most sophisticated AI chips, and TSMC relies on ASML for the machines and tools it needs to make them.

Assif Shameen is a technology and business writer based in North America
