
335 points ingve | 3 comments
raverbashing ◴[] No.45083550[source]
My belief in achieving actual quantum computing goes down as the noise in qubits goes up.

Digital computers were much easier than that. Make the transistors smaller, make more of them, and you're set.

A quantum computer's complexity grows as ~n^2 (or possibly ~e^n), where n is the number of qubits.

At the same time, things like D-Wave may be the most "quantum" we get in any practical sense.

replies(1): >>45083788 #
analog31 ◴[] No.45083788[source]
It turns out that error correction was easy on digital computers, and was essentially a solved problem early in their development. In fact, "noise immunity" is arguably the defining feature of a digital system. And error correction can happen at each gate, since there's no reason to propagate an indeterminate number.
replies(1): >>45084808 #
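The restoration-at-each-gate idea can be sketched with a toy simulation (not from the thread; the Gaussian noise model and thresholds are illustrative assumptions). Without restoration, per-stage noise accumulates; with a threshold at every stage, the signal snaps back to a clean logic level and errors don't propagate:

```python
import random

random.seed(42)

def analog_chain(stages, sigma=0.05):
    # Pure analog: each stage adds independent Gaussian noise,
    # so the error grows like sqrt(stages).
    level = 1.0
    for _ in range(stages):
        level += random.gauss(0.0, sigma)
    return level

def digital_chain(stages, sigma=0.05):
    # Digital: every gate restores the signal to a clean 0 or 1
    # before passing it on -- the "noise immunity" of a logic family.
    level = 1.0
    for _ in range(stages):
        level += random.gauss(0.0, sigma)
        level = 1.0 if level >= 0.5 else 0.0  # restoration step
    return level

print(analog_chain(1000))   # drifts far from 1.0
print(digital_chain(1000))  # stays exactly 1.0
```

With sigma well below the threshold margin, the per-gate flip probability is astronomically small, which is why the digital chain survives a thousand stages unscathed.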
Der_Einzige ◴[] No.45084808[source]
Except that good quantum error correction algorithms don't exist and probably never can, even in theory: https://spectrum.ieee.org/the-case-against-quantum-computing
replies(2): >>45085621 #>>45086364 #
1. analog31 ◴[] No.45086364[source]
I took a QC course and have done some reading, but I'm hardly an expert. Still, my impression has been: "This is analog computation." To reinforce the similarity, the error level of analog computers can be improved by running many of them in parallel.
replies(1): >>45090486 #
2. vrighter ◴[] No.45090486[source]
That gets you about 1 bit of extra precision every time you quadruple the number of parallel machines (or rerun the computation 4x).
replies(1): >>45092934 #
3. analog31 ◴[] No.45092934[source]
Yep. O(sqrt(n)) is a tough slog.
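The scaling the last two comments describe can be checked with a quick Monte Carlo sketch (illustrative only; the unit-Gaussian noise model and trial counts are assumptions). Averaging n noisy machines shrinks the error by sqrt(n), so quadrupling n roughly halves the error, i.e. one extra bit:

```python
import random
import statistics

random.seed(0)

def noisy_measurement(true_value=0.0, sigma=1.0):
    # One machine's output: the true value plus Gaussian noise.
    return true_value + random.gauss(0.0, sigma)

def averaged_error(n_machines, trials=2000):
    # Standard deviation of the mean of n parallel machines,
    # estimated over many trials.
    means = [
        statistics.fmean(noisy_measurement() for _ in range(n_machines))
        for _ in range(trials)
    ]
    return statistics.pstdev(means)

e1 = averaged_error(1)
e4 = averaged_error(4)
e16 = averaged_error(16)
# Each quadrupling cuts the error roughly in half: O(sqrt(n)) improvement.
print(e1 / e4, e4 / e16)
```

Both printed ratios land near 2, which is exactly the "one bit per 4x machines" slog: getting k extra bits of precision costs a factor of 4^k in hardware or reruns.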