The GPU market is buoyed by gaming, AI, and crypto. I am bearish on crypto. However, I am neutral/mildly bullish on gaming (VR/AR is the next wave of gaming), and I am extremely bullish on AI (computer vision, autonomous vehicles). Assuming level 5 vehicular autonomy is reached first by big/unicorn tech (Waymo, Uber, Lyft, etc.), it seems these companies have the leverage to poach talent and vertically integrate. Recent attempts to subvert Nvidia’s dominance in the GPU AI market include Google’s Tensor Processing Unit (TPU). My point is that the majority of deep learning efforts take place in the cloud (AWS, Azure, GCP). It seems Amazon, Microsoft, and Google are highly incentivized to develop their own processing units for AI. Upon doing so, they could potentially push Nvidia out of the cloud by undercutting compute costs with their own hardware. Thoughts? * Disclaimer: I own Nvidia shares.
Good analysis. It depends on adoption. If AI/deep learning sees wide adoption (that is, your average company is investing in deep learning), then Nvidia will benefit. If adoption is limited to a few tech companies and a few verticals, Nvidia won’t benefit much, because of vertical integration by tech companies, as you mentioned.
True, smaller shops could still benefit from cheap options, especially if their use cases are much smaller in scope than, say, Google's.
Can you think of any sample use cases? Why would smaller shops invest in deep learning?
Even among average companies, a lot are starting to use cloud solutions from Google Cloud, Amazon AWS, IBM, and Microsoft Azure rather than building their own infra. These few powerful companies have massive influence over what runs in their cloud solutions for ML/DL/AI, and they will definitely try to minimize their dependence on NVDA GPUs by building customized NPUs/TPUs.
But who would be best positioned to create that specialized hardware? I mean, GPUs are specialized for visual computation. I’m from the semiconductor industry, and it’s not trivial to just decide to become a large-scale, high-tech hardware manufacturer. As a technology, there’s tons of tribal knowledge that came from years and years of smart people optimizing lines with unique problems. Just ask a semiconductor company that tries to spin up a new plant to make the same stuff. Almost impossible, hence Intel’s super extreme culture around process and detail. When you’re dealing with flatness specs that have a standard deviation of 2 atoms across a surface a foot in diameter, everything matters.

To think, e.g., Google could just manufacture at scale is to underestimate the challenges. Also, semiconductors are a cost-driven business with razor-thin margins, and these mega tech companies have no experience in that environment. The entire culture would need to be different, and in that case, what’s the advantage of vertical integration? I think it’s more likely that the cloud businesses will just have enormous buying power and strongly influence a variety of competitors, since having all your eggs in one basket is too risky. But to answer your question, I think it’s inevitable that ML-specific hardware replaces GPUs, and that Nvidia is the best positioned to develop that hardware.
Also, what you have said about big tech leveraging its position to negotiate prices makes a lot of sense. We have already seen examples of this with medical insurance companies and medical providers/pharmaceutical companies.
Brilliant points! The strategy is to differentiate their cloud from others. No one cares about infrastructure per se. More companies will be interested in switching to your cloud if you help them compete in their field using the ML you provide.
You’ve raised some pretty good points. I suspected something about the profit margins while posting. One thing to point out is that I’m not necessarily saying in-house solutions need to beat Nvidia on performance, just that a good-enough in-house solution could be competitive with Nvidia. Two weeks of model training instead of one week seems acceptable.
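The "good enough" tradeoff above can be made concrete with a rough back-of-envelope calculation. All figures below are hypothetical assumptions for illustration, not real cloud pricing: even if an in-house accelerator takes twice as long to train a model, it can still win on total cost if its hourly rate is low enough.

```python
# Hypothetical back-of-envelope: total training cost = hours * hourly rate.
# All numbers below are illustrative assumptions, not real cloud prices.

def training_cost(hours: float, rate_per_hour: float) -> float:
    """Total cost of one training run."""
    return hours * rate_per_hour

# Assumed scenario: top-end GPU trains in 1 week, in-house chip in 2 weeks.
gpu_hours, gpu_rate = 7 * 24, 3.00            # 1 week at a premium hourly rate
inhouse_hours, inhouse_rate = 14 * 24, 1.00   # 2 weeks at a cheaper hourly rate

gpu_cost = training_cost(gpu_hours, gpu_rate)              # 504.0
inhouse_cost = training_cost(inhouse_hours, inhouse_rate)  # 336.0

# Despite doubling wall-clock time, the in-house option comes out cheaper here.
print(f"GPU: ${gpu_cost:.2f}, in-house: ${inhouse_cost:.2f}")
```

The break-even point is simply where the hourly-rate discount matches the slowdown factor; any deeper discount makes the slower chip the cheaper run.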
Isn’t your concern analogous to the one about smartphone CPUs? Millions of devices sold to date, and Snapdragon is still the leading chipset. Macs use Intel’s chipsets, etc.
Depends on the market. Making ASICs costs serious money, and you need huge demand to justify it. The good news is that it would be easy for Nvidia and AMD to do it if the market is there.
Short answer: yes. A GPU is much more general than a TPU, which I think is just an ASIC. A specialized solution will always trump a more general one. Didn't AlphaGo Zero use something like 2 TPUs instead of the 500 GPUs previously? Maybe there will be some kind of Moore's law for AI chips now. Nvidia kinda lucked out that GPUs were efficient for this stuff.
I agree, but not from the specialization perspective; rather, from the fact that a cloud provider could develop its own chips and offer a cheaper service.