
335 points ingve | 1 comments
raverbashing ◴[] No.45083550[source]
My belief in achieving actual quantum computing is going down as noise in qubits goes up.

Digital computers were much easier than that. Make them smaller, make more of them, and you're set.

A quantum computer's complexity goes up as ~n^2 (or possibly ~e^n), where n is the number of qubits.
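To put a rough number on why scaling qubits is so unlike scaling transistors (my own back-of-envelope sketch, not the commenter's): even just *describing* the state of n qubits classically takes 2^n complex amplitudes, so the resources involved grow exponentially rather than polynomially.

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store a full n-qubit state vector classically.

    A state of n qubits has 2**n complex amplitudes; 16 bytes is one
    double-precision complex number (complex128).
    """
    return (2 ** n_qubits) * bytes_per_amplitude

# 10 qubits: 16 KiB. 30 qubits: ~17 GB. 50 qubits: ~18 PB.
for n in (10, 30, 50):
    print(n, statevector_bytes(n))
```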

At the same time, things like D-Wave may be the most "quantum" we get in a practical sense.

replies(1): >>45083788 #
analog31 ◴[] No.45083788[source]
It turns out that error correction was easy on digital computers, and was essentially a solved problem early in their development. In fact, "noise immunity" is arguably the defining feature of a digital system. And error correction can happen at each gate, since there's no reason to propagate an indeterminate number.
replies(1): >>45084808 #
Der_Einzige ◴[] No.45084808[source]
Except that good quantum error correction algorithms don't exist and probably never can, even in theory: https://spectrum.ieee.org/the-case-against-quantum-computing
replies(2): >>45085621 #>>45086364 #
i7l ◴[] No.45085621[source]
The current best one- and two-qubit gate errors are well below 0.01% and going down with every generation of chips. See: https://ianreppel.org/quantum.html

There are no theoretical reasons QEC can't exist. In fact, it already does. Is it already good enough for universal fault tolerance? No. But then again, no one said it would be yet. We are slowly getting closer every year.
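A quick sketch of why "good raw gates" alone don't buy fault tolerance, and why QEC is needed at all (my own illustrative numbers, assuming independent gate errors): an uncorrected circuit's success probability decays exponentially in the gate count, even at error rates well below 0.01%.

```python
def success_probability(p_gate: float, n_gates: int) -> float:
    """Probability that no gate errs, assuming independent errors."""
    return (1.0 - p_gate) ** n_gates

# With p = 1e-4 per gate: a 1,000-gate circuit mostly works,
# but a 100,000-gate circuit almost never does without correction.
for gates in (1_000, 100_000, 10_000_000):
    print(gates, success_probability(1e-4, gates))
```

That exponential decay is the whole motivation for fault-tolerant QEC: below the threshold error rate, adding redundancy suppresses logical errors faster than it adds opportunities for new ones.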

In his book, Dyakonov offers no solid reason other than "it's hard, and thus likely not possible." That's just an opinion.