Google’s TPU Chips Threaten Nvidia’s Dominance in AI Computing

Here is a three-year chart of stock prices for Nvidia (NVDA), Alphabet/Google (GOOG), and the tech-heavy QQQ Nasdaq-100 index fund:

NVDA has been spectacular. If you had put $20k in NVDA three years ago, it would have turned into nearly $200k. Sweet. Meanwhile, GOOG poked along at the general pace of QQQ. Then, around Sept 1 (yellow line), GOOG started to pull away from QQQ, and it has not looked back.
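As a sanity check on that tenfold figure, here is the implied annualized return (illustrative arithmetic only, using round numbers rather than exact share prices):

```python
# Illustrative arithmetic: a $20k stake growing to ~$200k over 3 years.
start, end, years = 20_000, 200_000, 3
total_return = end / start               # 10x overall
cagr = total_return ** (1 / years) - 1   # compound annual growth rate
print(f"{total_return:.0f}x overall, {cagr:.0%} per year")
# -> 10x overall, 115% per year
```

In other words, the stock more than doubled each year on average.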

And in the past two months, GOOG stock has stomped all over NVDA, as shown in the six-month chart below. The two stocks were neck and neck in early October, but since then GOOG has surged way ahead. In the past month, GOOG is up sharply (red arrow), while NVDA is down significantly:

What is going on? It seems that the market is buying the narrative that Google’s Tensor Processing Unit (TPU) chips are a competitive threat to Nvidia’s GPUs. Last week, we published a tutorial on the technical details here. Briefly, Google’s TPUs are hardwired to perform key AI calculations, whereas Nvidia’s GPUs are more general-purpose. For a range of AI processing, the TPUs are faster and much more energy-efficient than the GPUs.
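The key AI calculation in question is dense matrix multiplication, which TPUs execute in fixed-function hardware (a systolic array) while GPUs run it on more general-purpose cores. A minimal pure-Python sketch of the workload itself, just to show the multiply-accumulate pattern being discussed (not actual TPU or GPU code):

```python
# A neural-network layer boils down to: output = activations @ weights.
# This multiply-accumulate (MAC) loop is what TPUs hardwire.

def matmul(a, b):
    """Multiply an (m x k) matrix by a (k x n) matrix, as lists of lists."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

acts = [[1.0, 2.0],
        [3.0, 4.0]]          # a batch of 2 inputs, 2 features each
weights = [[0.5, -1.0],
           [0.25, 1.0]]      # a hypothetical 2-in, 2-out layer
print(matmul(acts, weights))
# -> [[1.0, 1.0], [2.5, 1.0]]
```

A production model repeats this pattern billions of times, which is why a chip specialized for exactly this operation can win on speed and energy per calculation.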

The greater flexibility of Nvidia's GPUs, and the programming community's familiarity with Nvidia's CUDA programming platform, still give Nvidia a bit of an edge in the AI training phase. But much of that edge fades for inference (application) usage of AI. For the past few years, the big AI wannabes have focused madly on model training. But a shift to inference must come soon, for AI models to actually make money.

All this is a big potential headache for Nvidia. Because of its quasi-monopoly on AI compute, Nvidia has been able to charge a gross profit margin of roughly 75% on its chips. Its customers are naturally not thrilled with this, and have been making efforts to devise alternatives. And it seems that Google, thanks to a big head start in this area and very deep pockets, has actually equaled or even beaten Nvidia at its own game.
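To see what a 75% gross margin means in dollar terms, here is a quick illustration. The $30,000 list price is a hypothetical round number for the sake of arithmetic, not an actual Nvidia quote:

```python
# Illustrative: what a 75% gross margin implies about cost vs. price.
# The $30,000 price here is a hypothetical round number, not a real quote.
price = 30_000
gross_margin = 0.75
cost = price * (1 - gross_margin)   # cost of goods sold per chip
profit = price - cost               # gross profit per chip
print(f"cost ${cost:,.0f}, gross profit ${profit:,.0f} per chip")
# -> cost $7,500, gross profit $22,500 per chip
```

Put another way: at that margin, customers pay four dollars for every one dollar it costs Nvidia to build the chip, which is exactly the kind of pricing that motivates them to fund alternatives.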

This explains much of the recent disparity in stock movements. It should be noted, however, that for a quirky business reason, Google is unlikely in the near term to displace Nvidia as the main go-to for AI compute power. The reason is this: most AI compute power is implemented in huge data/cloud centers. And Google is one of the three main cloud vendors, along with Microsoft and Amazon, with IBM and Oracle trailing behind. So, for Google to supply Microsoft and Amazon with its chips and accompanying know-how would be to enable its competitors to compete more strongly.

Also, AI users like, say, OpenAI would be reluctant to commit to running in a Google-owned facility on Google chips: the user would then be somewhat locked in and held hostage, since it would be expensive to switch to a different data center if Google tried to raise prices. In contrast, if all the centers are using Nvidia chips, a user can readily move to a different data center for a better deal.

For the present, then, Google is using its TPU technology primarily in-house. The company has a huge suite of AI-adjacent business lines, so its TPU capability does give it genuine advantages there. Reportedly, soul-searching continues in the Google C-suite about how to more broadly monetize its TPUs. It seems likely that they will find a way. 

As usual, nothing here constitutes advice to buy or sell any security.

How Many Semiconductor Chips Are There in a Car?

I recently read a statement that there are something like 1,400 individual semiconductor chips in a typical modern car. I wondered, “Can that be correct?” 1,400 is a lot of anything. I have torn apart whole PCs and found only a few dozen chips.

Chips in cars have big economic significance. As called out in a post back in March, COVID shutdowns of semiconductor plants and other factors led to a shortage of critical chips for cars. This led to extensive shutdowns of car and truck assembly lines in 2021, affecting employment and automaker profits. It is estimated that the world lost 11.3 million units of vehicle production in 2021 due to the chip shortage, and may lose another 7 million units in 2022.

But back to 1,400 chips… I did not find the One True Pronouncement on chips in cars (a promising New York Times article lay tantalizingly behind a paywall). But I found a number of statements that corroborated that order of magnitude, and that also fleshed out the many uses for such chips.

This picture is worth maybe 1,400 words:

Source

Here is an even more detailed diagram (sorry, hard to read):

Source

Cars and trucks have something like 100 distinct electronics modules, and each module contains multiple chips; an average of 14 chips per module would account for the 1,400 total. Wiring in cars is expensive and vulnerable, so it is better to distribute the information processing rather than run a bunch of wires back to one central processor.

The chip supply situation should sort itself out by 2024, if all goes well. Meanwhile, electronics has become the tail that wags the automotive dog: electronics went from just 18% of a car’s cost in 2000 to 40% of its cost in 2020, and is projected to reach 45% by 2030: