
81 points by teddyh | 1 comment
tomgag No.44609103
I guess I'll post it here as well. This is my personal take on the whole story: https://gagliardoni.net/#20250714_ludd_grandpas

A relevant quote: "this is your daily reminder that 'How large is the biggest number it can factorize' is NOT a good measure of progress in quantum computing. If you're still stuck in this mindset, you'll be up for a rude awakening."

Related: this is from Dan Bernstein: https://blog.cr.yp.to/20250118-flight.html#moon

A relevant quote: "Humans faced with disaster tend to optimistically imagine ways that the disaster will be avoided. Given the reality of more and more user data being encrypted with RSA and ECC, the world will be a better place if every effort to build a quantum computer runs into some insurmountable physical obstacle"

replies(5): >>44609195 #>>44609761 #>>44611286 #>>44611423 #>>44612270 #
jgeada No.44609195
Except that factorization is exactly what is needed to break encryption, so knowing what QC can do in that realm of mathematics and computing is precisely the critical question to ask.
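
To make that concrete, here's a toy Python sketch (the numbers are tiny and made up; this is only an illustration, not anything from the linked posts) of why recovering the factors of an RSA modulus is the same as recovering the private key:

    # Toy illustration: once you can factor n, you can derive the RSA private key.
    p, q = 61, 53             # the secret primes, recovered by factoring n
    n = p * q                 # the public modulus
    e = 17                    # the public exponent
    phi = (p - 1) * (q - 1)   # computable only if you know p and q
    d = pow(e, -1, phi)       # the private exponent, e^-1 mod phi (Python 3.8+)

    m = 42                    # some message
    c = pow(m, e, n)          # encrypt with the public key
    assert pow(c, d, n) == m  # decrypt using the key we derived by factoring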

And a reminder that in the world of non-QC computing, right from its very roots, the ability of computers improved in mind-bogglingly large steps every year.

QC records, other than the odd statistic about how many qubits they can make, have largely shown no strides toward solving real-world-sized problems (with the exception of those that use QCs purely as analog computers to model quantum behavior).

replies(1): >>44609450 #
tomgag No.44609450
I beg you to read the full story and to not extrapolate from the quote.

Also, in the world of QC, right from its very roots, the ability of QC has improved in mind-bogglingly large steps every year. You just can't see it if you look only at the wrong metric, i.e., factorization records.

It's a bit like saying "classical computing technology has not improved for 50 years; it's only recently that we've finally started to have programs that are able to write other programs."

replies(2): >>44609521 #>>44609609 #
jgeada No.44609609
There is a reason QC factorization records haven't shifted much over the past few years. The number of qubits by itself isn't enough. You need to be able to do computation on them, and for long enough to run Shor's algorithm until it produces a solution. How the qubits are connected, how reliable the logic gates are, and how long you can maintain quantum coherence with enough fidelity to get results are equally important.
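
To give a sense of what "running Shor's algorithm" actually involves, here's a rough Python sketch (my own illustration, not from the thread) where the quantum order-finding step is replaced by classical brute force; that step is the only part that needs a quantum computer, which is why this toy version only works for numbers like 15:

    # Classical skeleton of Shor's algorithm. order() is the part a quantum
    # computer is supposed to do efficiently; here it is brute-forced, so this
    # only works for toy moduli.
    from math import gcd
    from random import randrange

    def order(a, n):
        # Quantum period finding would go here.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical(n):
        while True:
            a = randrange(2, n)
            g = gcd(a, n)
            if g > 1:
                return g, n // g          # lucky guess already shares a factor
            r = order(a, n)
            if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                f = gcd(pow(a, r // 2) - 1, n)
                if 1 < f < n:
                    return f, n // f      # nontrivial factor found

    print(shor_classical(15))             # e.g. (3, 5) or (5, 3)

Doing that order() step on real hardware is exactly where qubit connectivity, gate fidelity and coherence time all bite at once.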

That no significant factorization milestones have moved is a huge black eye for this field. Even worse, that no one has ever been able to truly run Shor's algorithm on even trivial numbers is a shocking indictment of the whole field.

replies(3): >>44609682 #>>44610230 #>>44611259 #
bawolff No.44611259
> That no significant factorization milestones have moved is a huge black eye for this field.

But everyone knew going in that it wasn't going to move. It would have been shocking if it had. It was never a reasonable medium-term goal.