
122 points | roboboffin | 1 comment
outworlder No.42198355
So, an inherently error-prone computation is being corrected by another very error prone computation?
replies(3): >>42198861 #>>42198868 #>>42198986 #
leptons No.42198861
I've never seen so much money spent on such fundamentally flawed tech since maybe Theranos. I'm really starting to doubt the viability of the current crop of quantum computing attempts. I think there probably is some way to harness quantum effects, but I'm not sure computing with an inherently high margin of error is the right way to do it.
replies(3): >>42199092 #>>42199157 #>>42200990 #
sesm No.42199092
I'm optimistic about current quantum computers, because they are a tool to study wave function collapse. I hope they will help us understand the relation between the number of particles in a system and how long it can stay in an entangled state, which would point toward a physical interpretation of quantum mechanics (different from the "we don't talk about wave function collapse" Copenhagen interpretation).
replies(1): >>42200429 #
nickpsecurity No.42200429
The non-experts here might be interested in why you’d want to do that. Do you have explanations or links about it?
replies(1): >>42202216 #
tsimionescu No.42202216
In short, quantum mechanics has a major issue at its core: quantum states evolve by purely deterministic, fully time-reversible evolutions of the wave function. But once a classical apparatus measures a quantum system, the wave function collapses to a single point corresponding to the measurement result. This collapse is non-deterministic and not time reversible.
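This contrast can be sketched numerically for a single qubit (a minimal illustration, assuming NumPy; the unitary here is a Hadamard gate, and the collapse step samples per the Born rule):

```python
import numpy as np

# A qubit in state |0>, as a 2-component complex vector.
state = np.array([1.0, 0.0], dtype=complex)

# Unitary evolution: deterministic and reversible. Hadamard gate as an example.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
evolved = H @ state                    # superposition (|0> + |1>)/sqrt(2)
recovered = H.conj().T @ evolved       # applying U-dagger undoes it exactly
assert np.allclose(recovered, state)   # time reversible

# Measurement: non-deterministic and irreversible.
probs = np.abs(evolved) ** 2           # Born rule: |amplitude|^2
outcome = np.random.choice([0, 1], p=probs)
collapsed = np.zeros(2, dtype=complex)
collapsed[outcome] = 1.0               # superposition gone; cannot be undone
```

Nothing in the formalism itself says when to stop applying unitaries and apply the sampling step instead; that boundary is exactly the undefined part.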

It is also completely undefined in the theory: the theory says nothing at all about which interactions count as "quantum interactions" that keep you in the deterministic time-evolution regime, and which count as "measurements" that collapse the wave function.

So, this is a major gap in the core of quantum mechanics. Quantum computers are all about keeping the qubits in the deterministic evolution regime while running the program, and performing a measurement only at the end to get a classical result out of it (and then repeating that run a bunch of times, because this is a statistical computation). So, the hope is that they might shed some light on how to precisely separate quantum interactions from measurements.
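The repeat-the-run part can be sketched the same way (again assuming NumPy; a hypothetical 1000-shot run of a one-gate circuit, with a seeded generator for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(0)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def run_circuit():
    # Prepare |0>, evolve coherently, and measure only once, at the very end.
    state = H @ np.array([1.0, 0.0])
    probs = np.abs(state) ** 2
    return rng.choice([0, 1], p=probs)

shots = [run_circuit() for _ in range(1000)]
freq = sum(shots) / len(shots)   # approaches 0.5 as the shot count grows
```

Each shot is one full prepare-evolve-measure cycle; the answer is read off from the outcome frequencies, which is why a single measurement is never enough.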