I have been thinking about getting into quantum computing for a while now but can't figure out whether the field will boom in the next 10-15 years. Any opinions on this?
Isn’t it still very early? Like 20 years out.
I imagine that, like GPGPU, only a few people will write software for it while the rest of us use APIs and frameworks.
Yes it is time. Q# is the future.
Too early. Read papers from architecture conferences and you'll find it will still have a layered approach, like assembly plus a bunch of high-level languages.
Not sure I understand your comment. Yes, Q# runs on a simulator in Azure, but there are actual quantum computers in labs that are running quantum programs. And yes, it is like a GPU: you have a driver that sends a program across to the quantum computer for execution. Maybe IBM or someone will come up with a better language than Microsoft's Q#, but for now I feel it is good enough to start thinking in quantum computing terms.
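For anyone who wants a feel for "thinking in quantum computing terms" without installing an SDK, here's a toy sketch (not Q#, just plain Python, and only one qubit): a simulator is ultimately matrices applied to a vector of complex amplitudes, and the "program" the driver ships off is a sequence of such gates. Here a Hadamard gate puts |0> into an equal superposition:

```python
import math

# Toy state-vector simulator: a 1-qubit state is a pair of amplitudes.
# Start in |0>, apply a Hadamard gate, read off outcome probabilities.
state = [1.0, 0.0]                    # |0>
h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]                 # Hadamard matrix

state = [sum(H[i][j] * state[j] for j in range(2)) for i in range(2)]
probs = [abs(a) ** 2 for a in state]  # Born rule: |amplitude|^2
print(probs)                          # ~[0.5, 0.5]: equal superposition
```

Real hardware, like a GPU, only gives you the measurement results back; the full amplitude vector is something only a simulator can show you.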
It will only boom if you make it boom
Been thinking about this myself. Near term there are NISQ devices, and there are also skeptics who argue there will always be too much noise to measure states reliably enough to get a quantum advantage. I think it's a gamble, but it could be one of those things you read about in a Walter Isaacson book later and think, damn, I wish I was part of that. Regardless, the skills you gain could still be leveraged in other areas. I'm thinking a PhD is definitely the best way to get into the field, though.
Wait. That’s illegal.
In a Microsoft video on YouTube, a guy talks about a recent paper arguing that the number of qubits necessary to error-correct one qubit increases faster than the number of qubits, and that if that proves to be true, it would mean quantum computing is a dead end. The guy also says he prays every day that it doesn't end like that, because he has put all his eggs in the QC field.
Is that because of the thermal design of the qubits? I'd be interested in reading that paper if you can find it!
Was it this guy? https://www.google.com/amp/s/spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing.amp.html But how many qubits do you actually need to do useful things? You get the 2^n-type capacity, right?
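On the 2^n point: an n-qubit state is described by 2^n complex amplitudes, which is exactly why classical simulation blows up so fast. A quick back-of-the-envelope in Python (assuming 16 bytes per amplitude, i.e. a 128-bit complex number, just for illustration):

```python
# Memory needed to store the full state vector of an n-qubit system:
# 2**n complex amplitudes, assumed 16 bytes each. Doubles per qubit.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {2 ** n:,} amplitudes, ~{gib:,.0f} GiB")
```

So around 50 qubits you're already past the memory of any classical machine, which is roughly where the "quantum supremacy" claims come in.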
I heard in an RI video that something like 50 qubits was enough to achieve "quantum supremacy", and Google seems to already have something like a 76-qubit chip. But then again, "they say" you could need as many as 200 or 300 qubits to error-correct one qubit. It is important at this stage for me to state that I don't know shit about this and that I just report things I see in YouTube videos.
What are you doing at AAPL? HW or SW?
HW. I'm a nanotechnology engineer/physicist by training.