Do you have a citation?
I strongly believe that new mathematical tools and methods can be developed that make it "easy" to break many modern cryptographic primitives without ever using a quantum computer.
The practical problem is that ‘noise’ between gates seems to increase exponentially, so practically it may actually be impossible to construct anything with more than a handful of gates for the foreseeable (possibly indefinite?) future.
It’s essentially the crypto version of Fusion.
As it turns out, that's a big if, but the bigness of the if is about hardware implementation. The theory behind it is just basic quantum mechanics.
Transformer LLMs already gave us the most general AI yet, by far - and they keep being developed further, with a number of recent breakthroughs and milestones.
From the article it sounds like we will still be safe for 20+ years. On the other hand, 15 was just extraordinarily easy; progress after 21 should be much quicker. And we never know which breakthroughs might come in the next decades that speed up progress.
Can you provide a quick verification for that?
But 22 and 24 are in the same boat as 21 here. All three of them require modular multiplications by factors other than one, and none of the three is one less than a power of 2. You need slightly more multiplications (and thus more gates) as the numbers get larger, but that only grows linearly. Maybe the conditional multiplications required get slightly more expensive to implement, but I wouldn't expect a 100x cost blowup from that. Error correction is still an issue, potentially turning a linear complexity increase into a quadratic one, but qubit counts in quantum computers are also increasing at an exponential rate.
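A quick classical sanity check of why 15 is so special (plain Python, no quantum simulation; the function names are my own): in Shor's algorithm the hard quantum part is finding the multiplicative order r of a base a mod N. For N = 15 = 2^4 - 1, every coprime base has order at most 4, so the modular exponentiation circuit is tiny; for 21, 22, and 24 the orders are only modestly larger.

```python
from math import gcd

def mult_order(a, n):
    """Smallest r > 0 with a**r % n == 1 (assumes gcd(a, n) == 1)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

# N = 15 = 2**4 - 1: every coprime base has order 2 or 4,
# which is what makes the quantum part of factoring 15 so cheap.
for n in (15, 21, 22, 24):
    orders = [mult_order(a, n) for a in range(2, n) if gcd(a, n) == 1]
    print(n, orders)

# Classical post-processing step of Shor's algorithm: given an even
# order r of a mod N, gcd(a**(r//2) - 1, N) often yields a factor.
a, n = 2, 21
r = mult_order(a, n)             # r = 6
print(gcd(a ** (r // 2) - 1, n))  # gcd(7, 21) = 7, a nontrivial factor
```

This only demonstrates the order-finding structure classically, of course; the whole point of the quantum computer is doing the order-finding step when N is too large to brute-force.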
> so we should totally be able to factor 21 (or larger numbers)…. When?
Just because we solve one problem doesn't imply all the problems in QC are also instantly solved. I guess it does if you assume noise is the only problem and that once it is solved the engineering is trivial. That is not the case. Even assuming all foundational problems have been solved, figuring out how to actually engineer and mass-produce large numbers of gates will take a while.
As the article pointed out, going from 15 to 21 requires a 100x increase in gates.
As the article that you posted under says:
"Because of the large cost of quantum factoring numbers (that aren’t 15), factoring isn’t yet a good benchmark for tracking the progress of quantum computers. If you want to stay abreast of progress in quantum computing, you should be paying attention to the arrival of quantum error correction (such as surface codes getting more reliable as their size is increased) and to architectures solving core scaling challenges (such as lost neutral atoms being continuously replaced)."
Even in "multimodal" models, text is still the fundamental unit of data storage and transformation between the modes. That's not the case for how your brain works—you don't see a pigeon, label it as "pigeon," and then refer to your knowledge about "pigeons". You just experience the pigeon.
We have 100K years of Homo sapiens thriving without language. "General Intelligence" occurs at a level above semantics.