I am not able to understand the hype around NVIDIA. My understanding was that Google is using its own TPUs, Meta is also using its own chips, and Tesla is using AMD. I am not sure about Apple. Given that, with NVIDIA's A100 and its other businesses, I cannot follow how much revenue they will be able to generate. Can someone please help me understand what I am not seeing that Wall Street is able to? TC: 370k
Wall Street can be wrong, and in this case it is. These are the same people who panic-sold GOOG when Bard blundered one response. NVDA will eventually come back down to earth when the AI hype fizzles out.
Wall Street is wrong most of the time in the short term.
Fizzles out? The AI hype train has just started. You step in and make money on the latest craze, then exit once your grandma starts talking about AI. The grandma signal is the best indicator for knowing when you are at the top of the bubble.
Intel already has Gaudi2, which is on par with the A100 …
Hello H100
Hello DGX machines with Mellanox networking
Hello CUDA
Hello accelerated libraries
Intel is not on par at all...
It's easier to make sense of the stock market if you just accept that it's disconnected from reality.
Lots of demand and nobody else to fulfill it. Google's A3 is 26k H100s; Meta's project is in its infancy. I think people underestimate what it takes to build the entire chip ecosystem: software, performance, volume. There are many axes to compete on, and NVIDIA happens, through some luck and some vision, to be the only one faring well across many of them.
The problem with Wall Street is that it does unlimited extrapolation. E-commerce did well during COVID, so it will be an eternal reality from now on. NVIDIA chips got more demand because mega-corps want to run AI models, so it's an eternal trend. It's still not clear whether these AI models increase productivity or are just a fad. Secondly, the moat isn't nearly as strong as NVIDIA bulls believe.
To the first point: I think Google cares because MS wants to make it dance. To the second: only time will tell.
Google is using NVIDIA A100s.
Tesla is using NVIDIA A100s.
Meta is using NVIDIA A100s.
CUDA, cuDNN, Magnum IO, GPU Operator, TensorRT, Triton, distributed multi-node training, mixed precision training. We are talking about 10 years of effort in building an ecosystem. And for those who question scale: ChatGPT requires 5 x A100s for inference and thousands of A100s for training. Companies obviously cannot upload their IP to ChatGPT => train LLMs on-prem => buy A100s/H100s for training. Trust me, the more you buy the more you save. This is here to stay, unlike crypto mining, which for the right reasons nobody took seriously.
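For scale, the "5 A100s for inference, thousands for training" claim can be sanity-checked with back-of-envelope memory math. A rough sketch in Python, assuming an 80 GB A100, 2 bytes/param for FP16 inference weights, and ~16 bytes/param of model state for Adam-style training (FP16 weights and gradients plus FP32 master weights and two moment buffers); these byte counts are common rules of thumb, not vendor figures:

```python
import math

A100_MEMORY_GB = 80  # assuming the 80 GB A100 variant

def gpus_needed(params_billions: float, bytes_per_param: float) -> int:
    """Minimum GPUs just to hold model state; ignores activations and KV cache."""
    total_gb = params_billions * bytes_per_param
    return math.ceil(total_gb / A100_MEMORY_GB)

# Inference: FP16 weights only, 2 bytes/param.
print(gpus_needed(175, 2))   # 175B params -> 350 GB -> 5 GPUs
# Training with Adam-style state: ~16 bytes/param.
print(gpus_needed(175, 16))  # -> 2800 GB -> 35 GPUs, before activations
```

The jump from 35 GPUs to "thousands" for training comes from activation memory and from replicating the model across data-parallel groups to get usable throughput, which this sketch deliberately ignores.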
I can tell you have the technical depth, but that second-to-last line made me laugh so much. Clearly the PR training has worked. You can reduce the number of GPUs you need by using things like ZeRO-Infinity, etc. You are stating the current state of the art, but no one likes to pay for more computing power than they need, and all the hyperscalers are investing heavily in model parallelism techniques that can do better with fewer accelerators.
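The intuition behind ZeRO-style techniques can be sketched in a few lines: fully partitioning model states across data-parallel ranks (as in ZeRO stage 3) shrinks per-GPU state memory linearly with the number of GPUs. The numbers below are illustrative assumptions (16 bytes/param of Adam-style state), not measurements, and communication overhead is ignored:

```python
def zero3_state_gb_per_gpu(params_billions: float, n_gpus: int,
                           bytes_per_param: float = 16) -> float:
    """Per-GPU model-state memory under full (ZeRO-3 style) partitioning.

    Assumes ~16 bytes/param of total training state and ignores
    activations, fragmentation, and communication buffers.
    """
    total_gb = params_billions * bytes_per_param
    return total_gb / n_gpus

# A 175B-param model's ~2800 GB of state, spread over 64 GPUs:
print(zero3_state_gb_per_gpu(175, 64))  # 43.75 GB/GPU -> fits an 80 GB card
```

The point of the comment above is exactly this: partitioning (and offloading, in ZeRO-Infinity's case) trades bandwidth for memory, so you can train the same model on fewer or smaller accelerators.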
Good point. There are also approaches where people train smaller models on larger datasets so that they fit in one GPU, but again, IP data is limited compared to public data, so models will have to be bigger for enterprises. You can also operate in lower precision like INT8, but from my own experience it's not that great; BF16/FP16 is the optimum. There are also frameworks that abstract the underlying HW, and custom-chip approaches to LLMs. But the question here is: would you buy an H100 that gives you great performance across a variety of workloads and architectures for both training and inference (via MIG), or would you buy custom silicon that offers limited programmability? From my experience, it's the ecosystem: today you can run Spark and ETL, do video decoding and preprocessing, model training, model serving, and simulations or metaverse applications (LLMs in the metaverse 🙏) on the same GPU. Unless you are able to deliver a HW architecture that can do LLM inference at 1/10th or 1/5th the price of an H100, it's not worth the fight, as the value it delivers is much more. So, going back to the PR statement: the more you buy, the more you save 😊
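The INT8 tradeoff mentioned above can be illustrated with a toy symmetric per-tensor quantizer in pure Python (a sketch of the general technique, not any framework's actual implementation): every value gets rounded to one of 255 integer levels, so the worst-case rounding error is about half the scale, which is one reason aggressive quantization can cost accuracy relative to BF16/FP16:

```python
def quantize_int8(values):
    """Symmetric per-tensor quantization to signed 8-bit integers."""
    scale = max(abs(v) for v in values) / 127 or 1.0  # guard all-zero input
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    return [x * scale for x in q]

weights = [0.013, -0.8, 0.402, 1.27, -0.05]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
# max_err is bounded by scale / 2; here scale = 1.27 / 127 = 0.01
```

Real INT8 pipelines use per-channel scales, calibration, or quantization-aware training to claw back most of that error, which is why the practical answer is workload-dependent, as the commenter says.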
It's a bubble, duh. Even if the upside for NVIDIA is real, the market hasn't priced in the decline of the many companies that AI is supposed to make obsolete.
It’s kind of a forced equivalency: “AI is trending, therefore more companies are using AI, therefore more companies will be training their own models, therefore more companies will need more graphics cards,” coupled with “large companies such as Amazon are moving away from serverless microservices, therefore more companies will build physical server racks for their AI initiatives.” It’s an attempt at a proof by induction that isn’t really logical if you dig any deeper than the very surface of the hypothesis, but we must also remember that illogical, hype-based decisions factor just as much into investing theses as actual research.
I wouldn’t say NVDA has 10 years of experience building this ecosystem; they have 30 years of expertise, because all of their GPU experience translates into AI-accelerator experience. For the past 20 years, Google has been hiring all the best PhDs in search: if you want to build a traditional search engine, you aren’t catching up. NVDA has been specializing in graphics cards, which are now the best way to do AI. They have all the people. They have 30 years of experience in design, methods, manufacturing, hiring, mentoring, testing, and bug fixing. Anyone else building a factory now will be building a graphics card factory for the first time. No one is catching up, just like no one is building a better traditional search engine than Google. People still don’t realize how big AI is going to be. I believe it is going to be the biggest tech boom ever: for every $5,000 AI workstation, NVDA is going to get $2,500; for every $50 million AI data center, NVDA is going to get $25 million of that; and when corporations and governments realize how important AI is, they will be scrambling to spend hundreds of billions of dollars, if not trillions, on AI. Don’t believe me? Have a different opinion? Fine, go short sell all of your net worth into NVDA; let’s see what happens.
The problem is not with NVDA's tech (which is best in class), but with its stock price and valuation. Would you go long with all your net worth in NVDA at the current price? Let's see what happens.
I am not going to talk about my asset allocation but many people have quite a bit of their NW in NVDA and they are doing just fine. And yes, I believe they are going to be doing even better.
The last line got me. Tesla is using AMD? For… what?
Playing Cyberpunk in your Cybertruck
Simps for stock prices think it will go up like this forever.