Today, Google Quantum AI published a research paper that might boost the post-quantum migration. Their team has tailored Shor's algorithm to solve the 256-bit Elliptic Curve Discrete Logarithm Problem using fewer than 1,200 logical qubits and 90 million Toffoli gates. ECDLP is the hard mathematical problem that secures ECDSA: the signature scheme underpinning most blockchains, TLS certificates, and countless authentication systems. Translated to hardware: fewer than 500,000 physical qubits, executing in a few minutes. A few minutes. Less than a Bitcoin block time. Less than two Ethereum epochs. The long-standing argument that public keys can simply remain hidden is now moot.

What exactly changed?
Shor's algorithm has been known since 1994 as a generic quantum approach to factoring integers and computing discrete logarithms. But "known" and "practical" are very different things. The real progress is in the engineering: how many qubits and gates you actually need once you compile the algorithm into a fault-tolerant quantum circuit. The recent algorithmic trendline is clear: every 12 to 18 months, the resource estimates drop significantly. And these are pure algorithmic gains: they compound on top of hardware improvements, which remain a major challenge. However, as of today, we're still far from having such a quantum computer. That hasn't changed.

Zero-knowledge proof
Here's where it gets interesting. Google chose not to publish their optimized circuits. Instead, they released a zero-knowledge proof that their circuits achieve the claimed resource counts. We have no doubt they know how to do it, but no clue how. The reasons are likely multiple: competitive advantage, national security implications... Regardless, it establishes a powerful (and elegant) precedent. The irony: Google's ZK proof is not itself post-quantum secure.

What's next?
The good news is that we already have the tools: post-quantum cryptography. Now we need to migrate.
A few days ago, Google announced it is targeting 2029 for full post-quantum readiness. NIST plans to deprecate RSA signatures by 2030 and disallow all legacy algorithms by 2035. Cryptography exists to create mathematical trust in the security of systems. That trust is now being eroded, not by a working attack, but by the increasingly credible prospect of one. In security, the moment you start doubting the foundation is the moment you should be rebuilding it.

What this means for blockchains
For blockchain ecosystems specifically, the threat is central. ECDSA on the secp256k1 (Bitcoin) and P-256 curves is the cornerstone of security. Unlike traditional systems, where you can rotate certificates behind a corporate firewall, blockchain migration requires coordination across decentralized, permissionless networks. This process will likely take time. I'll be diving deeper into the concrete challenges and strategies for PQC migration on blockchains and secure systems in my keynote this Thursday at the EthCC conference.
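To make the threat concrete, here is a minimal classical sketch of the reduction at the heart of Shor's algorithm: the quantum computer's only job is period (order) finding, which we brute-force here for a toy modulus. This is an illustration of the general technique, not Google's optimized circuits; it only works for tiny numbers.

```python
from math import gcd

def multiplicative_order(a, n):
    """Classically compute the order r of a modulo n (the step a quantum
    computer performs with period finding). Brute force, so only feasible
    for tiny n."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_demo(n, a):
    """Given a base a coprime to n, recover nontrivial factors of n from
    the order r of a mod n, exactly as in Shor's classical post-processing."""
    assert gcd(a, n) == 1
    r = multiplicative_order(a, n)
    if r % 2 != 0:
        return None  # odd order: pick another base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root: pick another base
    return gcd(y - 1, n), gcd(y + 1, n)

# The order of 7 mod 15 is 4, so 7^2 = 4 mod 15 yields the factors 3 and 5.
print(shor_classical_demo(15, 7))  # → (3, 5)
```

Everything here runs in exponential time classically; the quantum speedup comes entirely from replacing `multiplicative_order` with quantum period finding (or its elliptic-curve analogue for ECDLP).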
Role of Shor Code in Quantum Technology
Summary
The Shor code is a quantum error-correcting code, and Shor's algorithm is a quantum method for breaking widely used encryption by factoring large numbers or solving discrete logarithms, challenges classical computers cannot handle efficiently. Together, their role in quantum technology is to make quantum computers reliable and powerful enough to tackle problems such as breaking current cryptographic standards, with sweeping implications for cybersecurity and computing.
- Understand quantum impact: Recognize that advances in Shor's algorithm and quantum error-correcting codes are steadily reducing the resources needed to perform cryptographically relevant tasks on quantum computers.
- Prepare for migration: Start planning for post-quantum cryptography, as the threat to current encryption methods is growing with every breakthrough in quantum technology.
- Monitor advancements: Keep an eye on new quantum hardware and algorithmic innovations, since both are moving quickly toward making quantum-powered applications widely accessible.
A recent paper demonstrates how to run Shor's algorithm using considerably fewer quantum gates. Using the new approach, factoring a 2048-bit RSA key might require about 100k gates instead of approximately 4 million. Despite this aggressive improvement, we may not need to panic just yet. There is rarely a free lunch in algorithm design, and this is no exception. One of the trade-offs made in this paper is an increase in the number of qubits required to implement the algorithm. As the authors acknowledge: "... an improvement in the number of gates does not necessarily translate into an improved practical implementation. Indeed, in most architectures currently being considered by industry, the space (or number of qubits) plays an important role.... It therefore remains to be seen whether the algorithm can lead to improved physical implementations in practice." Papers like this (see link below) are one of the reasons experts struggle to predict when quantum computers will break modern cryptography. The landscape is always shifting, with both the computers and the algorithms improving every year. Each bright idea potentially brings "Q Day" closer. #cybersecurity #cryptography #pqc #quantumcomputing #encryption
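The "no free lunch" point can be illustrated with a crude space-time volume proxy: depth scales like gates divided by available parallelism, and an architecture pays for qubits times depth. All figures below are made up purely for demonstration and are not taken from the paper.

```python
def spacetime_volume(qubits, gates, parallelism):
    """Rough proxy for quantum computational cost: circuit depth is
    approximately gates / parallelism, and the machine must sustain
    all qubits for that entire depth."""
    depth = gates / parallelism
    return qubits * depth

# Hypothetical baseline: many gates, modest qubit count.
baseline = spacetime_volume(qubits=10_000, gates=4_000_000, parallelism=100)

# Hypothetical "improved" compilation: 40x fewer gates, but 50x more qubits.
optimized = spacetime_volume(qubits=500_000, gates=100_000, parallelism=100)

# Despite the dramatic gate reduction, the qubit blow-up makes the
# optimized variant more expensive by this metric.
print(baseline, optimized)
```

This is exactly the caveat the authors raise: a gate-count win can be erased, or worse, when the extra qubits it demands dominate the architecture's real cost.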
Shor’s algorithm is possible with as few as 10,000 reconfigurable atomic qubits by John Preskill (Caltech) https://lnkd.in/ethGUK8B Quantum computers have the potential to perform computational tasks beyond the reach of classical machines. A prominent example is Shor's algorithm for integer factorization and discrete logarithms, which is of both fundamental importance and practical relevance to cryptography. However, due to the high overhead of quantum error correction, optimized resource estimates for cryptographically relevant instances of Shor's algorithm require millions of physical qubits. Here, by leveraging advances in high-rate quantum error-correcting codes, efficient logical instruction sets, and circuit design, we show that Shor's algorithm can be executed at cryptographically relevant scales with as few as 10,000 reconfigurable atomic qubits. Increasing the number of physical qubits improves time efficiency by enabling greater parallelism; under plausible assumptions, the runtime for discrete logarithms on the P-256 elliptic curve could be just a few days for a system with 26,000 physical qubits, while the runtime for factoring RSA-2048 integers is one to two orders of magnitude longer. Recent neutral-atom experiments have demonstrated universal fault-tolerant operations below the error-correction threshold, computation on arrays of hundreds of qubits, and trapping arrays with more than 6,000 highly coherent qubits. Although substantial engineering challenges remain, our theoretical analysis indicates that an appropriately designed neutral-atom architecture could support quantum computation at cryptographically relevant scales. More broadly, these results highlight the capability of neutral atoms for fault-tolerant quantum computing with wide-ranging scientific and technological applications.
A new quantum computing initiative on Chicago’s South Side is set to become the most powerful in the country, with PsiQuantum at the heart of the effort. Backed by a $500 million investment from Illinois and DARPA, this ambitious project aims to push the boundaries of quantum technology, potentially revolutionizing computing and cybersecurity. The significance of quantum computing became clear in 1994 when mathematician Peter Shor published a groundbreaking paper demonstrating how a quantum computer could efficiently solve complex mathematical problems, particularly prime factorization. This discovery had major implications for cryptography, as widely used encryption protocols like RSA depend on the difficulty of factoring large numbers—a process that takes classical computers years to complete. Shor’s work showed that a sufficiently advanced quantum system could achieve the same results in minutes, posing a major threat to current encryption standards. The new quantum campus in Chicago represents a critical step toward realizing this theoretical potential. PsiQuantum, a leading player in the field, is working to develop a scalable quantum system capable of executing Shor’s algorithm and other advanced computations. If successful, this project could mark a turning point in fields ranging from cybersecurity to drug discovery and materials science, solidifying Chicago’s role as a global quantum hub.
Two important papers by Google and QuEra Computing Inc. dropped on the arXiv on Wednesday, both presenting new methods to run Shor's algorithm. They were released almost simultaneously and one could have smiled at the coincidence... until one remembered that Google invested in QuEra last year! But anyway, here are the papers and a few thoughts about why they matter: 👉 Google: https://lnkd.in/eNdYhJaK 👉 QuEra: https://lnkd.in/eH5XBuZD First, there's a chance that you'll see some articles pop up in the next few days, claiming that quantum computers are on the verge of breaking the Internet or something. This is because Shor's algorithm can be used to break popular public key encryption protocols like RSA, if you can run it on large enough numbers. But spoiler alert: this is not happening anytime soon. While the resource estimations in these papers are more optimistic than previous works, they are still far beyond the capabilities of current computers. 👉 We're still talking about 1 million superconducting qubits for the Google paper, while Google's largest chip to date is ~100 qubits. 👉 QuEra will require 19 million qubits, but has "only" demonstrated a few hundred so far. But then, what's the big deal? Two things: 👉 In the Google case, the number of superconducting qubits drops from 20 million to less than 1 million by using some clever compilation techniques, at the cost of a slower processing time (a few days instead of 8 hours). 👉 In the QuEra case, the key achievement is the 50x speedup over previous approaches based on neutral atoms, bringing total processing time to 5.6 days. This is on par with superconducting qubits, which are supposed to be much faster on paper. These achievements highlight an interesting trend in quantum computing: 👉 On one hand, the specs of real hardware are constantly improving 👉 On the other hand, the specs required for useful applications are constantly decreasing. 
The two curves are still not meeting, but they seem bound to converge by the end of the decade. But the best part in all of this is that several of the techniques demonstrated in these papers could be applied to a cat qubit architecture too. This means it should be possible to run Shor's algorithm on far fewer than the 100,000-qubit figure Alice & Bob has been advertising so far. How much fewer exactly? Well, let us run the numbers, and be sure I'll give you answers as soon as possible.
This could have big implications. Breaking RSA-2048 encryption using only 100k instead of 1 million physical qubits? Yesterday a paper appeared online that is quite rigorous in its analysis of resource estimations, showing that a variant of Shor's algorithm can be run using only 100k physical qubits. That's an order-of-magnitude reduction from the previous estimate/record of Gidney in 2025. Considering that quantum hardware roadmaps predict fault-tolerant machines with this many qubits (using the same error-correcting code family) will exist in the early 2030s, it could mean today's standard encryption just got a scary amount closer to being obsolete. As an aside, my new favorite unit is the "megaqubit" (10^6 qubits), and I look forward to seeing it used in the next era of fault-tolerant resource estimation papers! Paper link 👉 https://lnkd.in/dtgFiUE2 Congrats to Paul Webster, Larry Cohen and the team from Iceberg Quantum for the nice result. #quantumcomputing #quantumapplications #encryption
The common estimate for breaking RSA with a quantum computer? Around 20 million qubits. This week, Craig Gidney from Google Quantum AI showed how to do it with less than 1 million. Before you panic: this still requires a fault-tolerant quantum computer, something no one has built yet. And it would take a few days to run. So no, your encrypted messages are not in danger.
What changed? Not the hardware. The trick lies in how the algorithm, Shor's algorithm, is assembled and optimized:
- Space-time compression: Fewer qubits, longer runtime. The new architecture runs over several days instead of hours, but requires far less parallelism.
- Better building blocks: The paper uses compact arithmetic and smarter magic state generation to reduce ancilla overhead.
- Code-level savings: More efficient layout and scheduling of surface code operations cuts down the number of T factories and long-lived logical qubits.
Gidney's work is a clear reminder that meaningful progress isn't just happening in the lab: it's also happening in the compiler, in the fault-tolerance architecture, and in how we think about the resources at hand.
📸 Image credits: Craig Gidney
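A rough sense of where these million-qubit figures come from: in a surface-code architecture, each logical qubit occupies on the order of 2d² physical qubits at code distance d. The function below uses that common back-of-the-envelope approximation; the logical-qubit count and distances are illustrative stand-ins, not numbers from Gidney's paper.

```python
def physical_qubits(logical_qubits, code_distance):
    """Back-of-the-envelope surface-code overhead: each logical qubit
    occupies roughly 2 * d^2 physical qubits (data plus measurement
    qubits), ignoring magic-state factories and routing space."""
    return logical_qubits * 2 * code_distance ** 2

# Illustrative: ~1,400 logical qubits at distance 25 vs distance 9.
# Lowering the required distance (via better codes, gates, and
# compilation) shrinks the physical footprint quadratically.
print(physical_qubits(1400, 25))  # → 1750000
print(physical_qubits(1400, 9))   # → 226800
```

This quadratic dependence on distance is why compiler- and code-level savings, of the kind the posts above describe, move the physical-qubit estimates so dramatically without any hardware change.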