As more companies bring AI chips in-house, including Meta/Google/Amazon/Microsoft and now reportedly OpenAI, is $NVDA approaching its peak? https://www.reuters.com/technology/chatgpt-owner-openai-is-exploring-making-its-own-ai-chips-sources-2023-10-06/
- OpenAI makes really good AI progress.
- OpenAI sells AI as a solution.
- Companies don't trust OpenAI security.
- The open-source community builds similar AI.
- Companies adopt open-source models and build in-house solutions.
- OpenAI doesn't use NVDA hardware.
- Companies don't have their own hardware.
- NVDA offers low-cost hardware as a service.
- NVDA 🚀 🌕
There won't be a similar AI solution like GPT, BERT, etc. because it takes a fuck ton of money for the data and resources to train these gigantic models. We will be at the mercy of big corporations like Google, Microsoft, etc. to open-source their AI solutions for a long time.
2022 called, it wants jailchimp back.
There have been literally fifty startups, plus Microsoft, Amazon, and Google, all trying to do the same thing over the last 7 years or so. And the big CSPs and everyone else are still using Nvidia GPUs. Maybe OpenAI AGI could design an accelerator that beats Nvidia GPUs. Too bad OpenAI still needs Nvidia to reach AGI 🤣
😱😱😱😰😰😰
What does everyone think of the chips/systems being offered by smaller outfits like Cerebras and Groq?
Literally the entire country of China, backed by infinite money, or places like fucking INTEL cannot make silicon that comes close to 1% of what Nvidia does. Why is this? I dunno, but I wouldn't be worried.
Lol they are gonna fail like everyone else
Taping out new silicon that is competitive with whatever succeeds the H100 is a 5-7 year project that requires a huge amount of capital and people. New architecture will require new compilers, toolchains, etc. It's not impossible, but it's a big lift.
Making AI chips ain't a walk in the park. Meta and Google tried: Meta is still with Qualcomm for VR, and Google's TPU still needs a big update. Microsoft is also making its own AI chips, but all of them still buy from Nvidia, with maybe a small percentage later from AMD, d-Matrix, Cerebras, Tenstorrent, etc.

SW people don't realise this because they are paid better than hardware engineers. Hardware engineering is very difficult and very much experience- and specific-skill dependent. It's not like software, where similar DSA and system design concepts carry over between most roles (I'm not undermining software skills, just pointing out that hardware skills aren't general like that). For example, an avid ASIC engineer can't be considered at the same level for an FPGA role, and GPU and CPU engineering need very different skillsets. All the tools, languages, and debug methodologies are different. Hence it will take all these software companies a long, long time to establish themselves as premier hardware companies.
https://techround.co.uk/news/openai-considers-own-ai-chips-amidst-shortages/ Maybe SambaNova 🤔🤔🤔???
No. All is good. Don’t worry 👍
Hey Amazon, how do you learn these things? Any references?
@Badbarbi3 what org are you in? This sounds like exactly what I’m trying to switch into (distributed dl). Is it Annapurna?