
113 points by roboboffin | 8 comments
1. dogma1138 ◴[] No.42197261[source]
How can a classical system detect/correct errors in a quantum one? I thought all the error correction algorithms for quantum computers also relied on qubits, e.g. the Shor code.
replies(6): >>42197361 #>>42197383 #>>42197389 #>>42197393 #>>42197624 #>>42197772 #
2. anon291 ◴[] No.42197361[source]
It's not a perfect detector. If you give up perfection and settle for X% accuracy, then you can use a classical system (FWIU).
3. summerlight ◴[] No.42197383[source]
Quantum computing is not intractable and can still be simulated with a sufficient amount of time. This work used a quantum simulator to generate data points, then used them to train a transformer, which doesn't seem that different from other cases where neural networks are used to speed up computation-heavy problems.

The question is whether this approach still works when it is scaled to thousands or even millions of qubits. The team is optimistic that it does, but we will see.
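
As a rough illustration of what that data-generation step could look like, here is a minimal sketch using the open-source Stim simulator (presumably the simulator a commenter below says they wrote); the circuit parameters are illustrative guesses, not the paper's actual setup:

    # Minimal sketch (illustrative parameters, not the paper's setup):
    # sample surface-code syndrome data with Stim, to be used as training
    # examples for a learned decoder.
    import stim

    # A distance-5 rotated surface code memory experiment with simple
    # depolarizing noise.
    circuit = stim.Circuit.generated(
        "surface_code:rotated_memory_z",
        distance=5,
        rounds=5,
        after_clifford_depolarization=0.005,
    )

    sampler = circuit.compile_detector_sampler()
    # detection_events: the syndrome history the decoder sees (model inputs).
    # observable_flips: whether the logical observable flipped (training labels).
    detection_events, observable_flips = sampler.sample(
        shots=100_000, separate_observables=True
    )
    print(detection_events.shape, observable_flips.shape)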

4. drdeca ◴[] No.42197389[source]
The model could choose which measurement operations to make on the qubits, and which operations to apply to repair them?

In some quantum error-correcting codes, there is a large set of operators whose measurement, when no error has occurred, does not change the state (assuming the measurement itself is made without error), but whose outcomes yield some information about the kind of error when one has occurred, and this information can be used to choose what operations to take to correct the error.

For a number of such schemes, there's a choice of strategy: on what schedule to perform which of the measurements, and how to correct the errors.
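
As a purely classical toy version of that idea (my own sketch, not from the article): in the 3-bit repetition code, the two neighbouring-parity checks play the role of the stabilizer measurements - they leave an error-free codeword untouched, and their combined outcome (the syndrome) tells you exactly which single flip to undo.

    # Toy classical analogue of syndrome measurement + correction:
    # the 3-bit repetition code protecting against single bit flips.
    SYNDROME_TO_CORRECTION = {
        (0, 0): None,  # no error detected
        (1, 0): 0,     # flip bit 0 back
        (1, 1): 1,     # flip bit 1 back
        (0, 1): 2,     # flip bit 2 back
    }

    def measure_syndrome(bits):
        # Parity checks between neighbouring bits (stand-ins for ZZ stabilizers).
        return (bits[0] ^ bits[1], bits[1] ^ bits[2])

    def correct(bits):
        fix = SYNDROME_TO_CORRECTION[measure_syndrome(bits)]
        if fix is not None:
            bits[fix] ^= 1
        return bits

    # A single flip on the codeword 000 is detected and undone.
    assert correct([0, 1, 0]) == [0, 0, 0]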

replies(1): >>42199437 #
5. fizx ◴[] No.42197393[source]
The error correction itself requires qubits, but reading out the final answer apparently becomes probabilistic and complex enough that a neural net is a reasonable tool for interpretation and denoising.
6. Strilanc ◴[] No.42197624[source]
The full error correction system involves qubits. This paper is mainly about the decoder, which is responsible for taking the syndrome data produced by the quantum circuit and determining the most likely errors that caused it. The blog post doesn't state which code is being run, but the illustration makes clear it's a surface code [1], and the paper's abstract confirms this [2].

Disclaimer: I am one of the authors, but not a main contributor. I wrote the simulator they used and made some useful suggestions on how to use it to extract the information they wanted for training the models more efficiently, but I know nothing of transformers.

[1]: https://errorcorrectionzoo.org/list/quantum_surface

[2]: https://www.nature.com/articles/s41586-024-08148-8.pdf

7. abdullahkhalids ◴[] No.42197772[source]
The world of quantum has all these interesting gotchas.

In a quantum computer, your logical quantum state is encoded across many physical qubits (called data qubits) in a special way. The errors that occur on these qubits are indeed arbitrary, and with enough physical qubits they are indeed not practical to simulate classically.

To tackle these errors, we do "syndrome measurement": we interact the data qubits with another set of physical qubits (called syndrome qubits) in a special way, and then measure the syndrome qubits. The quantum magic that happens is that the arbitrary errors get projected down to a finite, discrete set of classical errors on the data and syndrome qubits! Without this result we would have no hope for quantum computers.
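
In textbook form (my paraphrase, not from the article): an arbitrary error E acting on a single qubit can be expanded in the Pauli basis,

    E|\psi\rangle = \alpha I|\psi\rangle + \beta X|\psi\rangle + \gamma Y|\psi\rangle + \delta Z|\psi\rangle

and measuring the stabilizers projects this superposition onto one of the Pauli terms, which is why the decoder only ever has to reason about a discrete set of X/Y/Z errors rather than a continuum.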

Anyway, this is where a decoder - a classical algorithm running on a classical computer - comes in. The OP is a decoder. It takes the syndrome qubit measurements and tries to figure out which classical errors occurred and what sort of correction, if any, is needed on the data qubits.
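
For contrast with the learned decoder in the OP, here is roughly what calling a conventional classical decoder looks like - a minimal sketch using Stim plus the PyMatching minimum-weight-perfect-matching decoder (a standard baseline of my choosing, not the paper's transformer, reusing the illustrative circuit from the sketch further up):

    # Minimal sketch: a conventional matching decoder as a baseline,
    # not the OP's learned decoder.
    import stim
    import pymatching

    circuit = stim.Circuit.generated(
        "surface_code:rotated_memory_z",
        distance=5,
        rounds=5,
        after_clifford_depolarization=0.005,
    )

    # Build the decoder from the circuit's detector error model.
    dem = circuit.detector_error_model(decompose_errors=True)
    matcher = pymatching.Matching.from_detector_error_model(dem)

    # Sample syndromes, decode them, and count logical errors.
    shots = 10_000
    detection_events, observable_flips = (
        circuit.compile_detector_sampler().sample(shots, separate_observables=True)
    )
    predictions = matcher.decode_batch(detection_events)
    num_mistakes = (predictions != observable_flips).any(axis=1).sum()
    print("logical error rate ~", num_mistakes / shots)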

8. abecedarius ◴[] No.42199437[source]
The way you describe this reminds me of the quantum bomb tester (Elitzur & Vaidman). Uhhh so this is treating a potential environmental interaction the same way as E&V's "bomb"? With at least the new wrinkle that there are multiple potential bombs, each with low probability?