What is the future of hardware development? Are we reaching the end of innovation in hardware? The new devices coming out don't seem to be that big an innovation compared to their immediate predecessors. What is the future of ASICs, SoCs, FPGAs, and CPUs?
Seems like a feeling more than a fact; can you pinpoint which big innovations you saw previously in that space? Given the ever-increasing cost and complexity (and thus risk) of developing and fabricating large chips, it makes sense that big, complex ones evolve incrementally, while new players on the block build smaller, specialized chips that are simpler (but highly specialized, so more energy-efficient). Both of these situations may contribute to the impression that not much big-bang tech is happening.
^ this
Moore's Law isn't going to just end like someone turning off gravity. Just because we no longer get a doubling of transistors on a chip every 18 months doesn't mean that progress will come to a complete stop; it just means improvements will arrive a bit more slowly.
The law also isn't about compute power per dollar; it's about transistor count per chip.
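A back-of-the-envelope sketch of the point above, using the 18-month doubling figure from the thread (the exact period varies by formulation; the numbers here are illustrative, not from the posts):

```python
def transistors(start_count, years, doubling_period=1.5):
    """Projected transistor count per chip after `years`,
    doubling every `doubling_period` years (Moore's Law)."""
    return start_count * 2 ** (years / doubling_period)

# Starting from a hypothetical 1-billion-transistor chip:
print(transistors(1e9, 1.5))   # one doubling period -> 2 billion
print(transistors(1e9, 10))    # ~100 billion after a decade
```

Note the projection says nothing about clock speed, power, or price per transistor, which is why "Moore's Law is slowing" and "chips keep getting faster/cheaper" can both be argued at once.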
No. That's like asking whether car innovation was over ten years ago, before Tesla.
As we move toward cloud computing, autonomous driving, and IoT, the demand for more power-efficient and scalable solutions in the ASIC and CPU world will only increase. Design challenges will of course increase as well. This is an interesting time to be in the hardware space.
New clouds
No, an "end of innovation" is impossible; there are too many kinds of innovation. The end of hardware innovation some companies are experiencing is self-inflicted, because their executive teams have absolutely no desire to innovate, mostly because it is "risky" or NUD (new, unique, difficult) and they have no idea what to do with it. The end of innovation in hardware can currently be predicted in PCs, and, believe it or not, that includes most desktop, notebook, and portable form factors, and even mobile phones/smartphones. But even in these already-perfected formats, a more tactical level of innovation can still be injected: battery life improvements, much better cameras, screens that are brighter but use less energy, etc. The form factor itself is either mature or EOL, meaning that if a new way to do desktop or handheld computing suddenly emerges, and if it is much better than the current classic way of doing it, the companies that failed to keep up will literally disappear.
Cheers mate 🍻
Analog can never be out of fashion
If cavemen had asked, "how can we innovate more on our arrows?", we would never have had missiles and bombs. The thing about innovation is that we should not tie it down to one thing, like hardware.
(Biased from an ML perspective.) The short-term future of hardware is shifting from compute power to data movement; innovation in hardware will be driven by innovation in data movement. For single nodes (one CPU/server plus GPU), compute power is excellent right now. The trouble is that a lot of operations are memory-bound, or require CPU-GPU communication. Photonics addresses some of those problems, along with direct-to-GPU memory; that's why Nvidia bought Mellanox. The same problem exists for communication between nodes. The "edge" marketing craze is driven by the fact that you can send data to a cloud server, make a prediction, and send it back; it's expensive to make sure that every phone can run its own ML models, so it's much better to have them use a cloud server. The main obstacle is data movement and latency.
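The "memory-bound" claim above can be made concrete with arithmetic intensity (FLOPs per byte moved between memory and the chip). A minimal sketch, with my own illustrative numbers rather than anything from the post:

```python
# Arithmetic intensity = FLOPs performed per byte of memory traffic.
# Low intensity -> the memory system is the bottleneck (memory-bound);
# high intensity -> the compute units are the bottleneck (compute-bound).
n = 1024  # square matrices of float64 (8 bytes per element)

# Elementwise add c = a + b: n*n FLOPs over 3*n*n*8 bytes
# (read a, read b, write c)
add_intensity = (n * n) / (3 * n * n * 8)

# Matmul c = a @ b: 2*n^3 FLOPs over the same 3*n*n*8 bytes of traffic
# (idealized: assumes perfect cache reuse of a and b)
mm_intensity = (2 * n ** 3) / (3 * n * n * 8)

print(f"elementwise add: {add_intensity:.3f} FLOPs/byte (memory-bound)")
print(f"matmul:          {mm_intensity:.1f} FLOPs/byte (compute-bound)")
```

The add does about 0.04 FLOPs per byte while the matmul does about 85, which is why elementwise ops saturate memory bandwidth long before the ALUs are busy, and why faster interconnects (the Mellanox angle) matter more than raw FLOPs for many ML workloads.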
Is quantum computing fully mature yet? Also, for classical computing, we haven't gotten past the power wall, last I checked.