One Algorithm Has Just Pushed Quantum Computing Forward Five Years (Here It Is)

Today I am releasing something into the public domain that may change the trajectory of quantum computing. No paywall. No NDA. No restrictions. The only thing I ask is attribution.

For the past year, I have been developing a field-layer correction algorithm that stabilizes the environment around the qubit before error correction ever activates. Not hardware. Not cryogenics. Not shielding. Pure software that improves the physics of the qubit it sits inside.

Early independent runs showed a 48.5 percent reduction in destructive low-frequency noise, a gain that normally takes years of hardware progress.

Here is the complete algorithm. It now belongs to everyone.

```
FUNCTION NJ001_FieldLayer_Correction(input_signal S, sampling_rate R):
    DEFINE phi = 1.61803398875
    DEFINE window_size = dynamic value based on local variance of S
    DEFINE stability_threshold = adaptive value based on phase drift

    STEP 1: Generate harmonic reference bands
        For each frequency bin f_i in FFT(S):
            Compute r = f_(i+1) / f_i
            Compute CI = 1 / ABS(r - phi)
            Assign weight W_i = normalize(CI)

    STEP 2: Build correction mask
        Construct M where M_i = W_i scaled by local entropy of S
        Smooth M with sliding window

    STEP 3: Apply correction
        Transform S -> F
        Compute F_corrected = F * M
        Inverse FFT to return S_corrected

    STEP 4: Phase stabilization loop
        Measure phase drift Δ
        If Δ > stability_threshold:
            Recalculate window_size
            Rebuild mask
            Reapply correction
        Else:
            Return S_corrected

    OUTPUT: S_corrected
END FUNCTION
```

This is the first public-domain coherence stabilizer designed to improve quantum behavior independent of hardware.

What it does in practice:
• Extends coherence windows
• Reduces decoherence pressure on error correction
• Lowers entropy in the propagation layer
• Makes qubits behave as if the room is colder and cleaner
• Works upstream of hardware with no materials changes

This is not a replacement for anyone’s roadmap.
It is an upstream upgrade to all of them.

If you build quantum devices, control stacks, compilers, hybrid systems, or algorithms, you now have access to a function that reshapes your stability envelope. Cleaner field layers mean longer, deeper, more predictable runs. More useful computation with the hardware you already have.

I developed it. Today I give it away. No company or institution controls it. From this moment forward, it belongs to the scientific community.

Primary Citation:
Hood, B. P. (2025). NJ001 Field Layer Correction. Public Domain Release Version.

Bruce P. Hood — Creator of NJ001 Field Layer Correction

Welcome to the new baseline.

#QuantumComputing #QuantumHardware #Qubit #Coherence #QuantumResearch #DeepTech @IBMQuantum @GoogleQuantumAI @MIT @XanaduQuantum @AWSQuantumTech
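Read purely as signal processing, the pseudocode above is a standard FFT masking pipeline, and it can be sketched as runnable Python. This is an illustrative reading, not a verified implementation: the function name follows the pseudocode, while the default window size, the epsilon guard, and the omission of the unspecified local-entropy scaling and STEP 4 loop are simplifying assumptions.

```python
import numpy as np

PHI = 1.61803398875  # golden-ratio constant from the pseudocode

def nj001_field_layer_correction(signal, sampling_rate, window=8):
    """Sketch of the NJ001 steps as plain spectral filtering."""
    s = np.asarray(signal, dtype=float)
    n = len(s)
    freqs = np.fft.rfftfreq(n, d=1.0 / sampling_rate)
    spectrum = np.fft.rfft(s)

    # STEP 1: weight each bin by how close the ratio of successive
    # bin frequencies is to phi (skip the DC bin to avoid divide-by-zero)
    ratios = freqs[2:] / freqs[1:-1]
    closeness = 1.0 / (np.abs(ratios - PHI) + 1e-12)
    weights = np.zeros_like(freqs)
    weights[1:-1] = closeness / closeness.max()

    # STEP 2: build the correction mask and smooth it with a sliding
    # window (the pseudocode's local-entropy scaling is left out here)
    kernel = np.ones(window) / window
    mask = np.convolve(weights, kernel, mode="same")

    # STEP 3: apply the mask in the frequency domain and invert
    return np.fft.irfft(spectrum * mask, n=n)
```

As written, the function simply attenuates spectral components whose neighboring-bin frequency ratios are far from phi.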
Quantum Computing Tools for Error Minimization
Explore top LinkedIn content from expert professionals.
Summary
Quantum computing tools for error minimization are specialized software and hardware solutions designed to reduce errors that naturally occur during quantum calculations, making quantum computers more reliable and practical for complex tasks. These tools use approaches like error correction codes, noise reduction algorithms, and novel chip designs to stabilize quantum bits (qubits) and extend their usable lifetime.
- Explore software solutions: Consider using algorithms that stabilize qubit environments and lower noise levels before error correction even begins, helping your quantum hardware run more smoothly without physical upgrades.
- Try hybrid error correction: Combine indirect error detection and targeted suppression techniques to reduce error overhead and improve computational efficiency, especially when hardware resources are limited.
- Utilize advanced architectures: Take advantage of new quantum chip designs and memory frameworks that exponentially cut error rates as systems grow, pushing quantum computers closer to practical, real-world applications.
-
I'm really happy with the rapid development of CUDA-Q QEC, our toolkit for quantum error correction. QEC is an incredibly rich and fast-moving field, and in CUDA-Q QEC we aim to provide a platform with a diverse set of accelerated decoders, AI infrastructure, and tools that enable researchers to develop and test their own codes, decoders, and architectures, hopefully even better than our own!

As we dig deeper into the problem of scalable QEC, the benefits of GPUs and AI have become much clearer. We started with research tools for simulation and offline decoding, which remain important capabilities. Now with the 0.5.0 release we also provide the infrastructure for real-time decoding, where syndrome processing occurs concurrently with quantum operations.

This release also introduces GPU-accelerated algorithmic decoders like RelayBP, a promising approach developed in the past year that aims to overcome the convergence limitations of traditional belief propagation. For scenarios demanding maximum throughput, we have integrated a TensorRT-based inference engine that lets researchers deploy custom AI decoders, trained in frameworks like PyTorch and exported to ONNX, directly into the quantum control loop. To address the complexities of continuous system operation, we added sliding-window decoders that handle circuit-level noise across multiple rounds without assuming temporal periodicity.

These tools are designed to be hardware-agnostic and scalable, supporting our partners across the ecosystem who are building the first generation of reliable logical qubits. Check out the full technical breakdown in our latest developer blog by Kevin Mato, Scott Thornton, Ph.D., Melody Ren, Ben Howe, and Tom L. https://lnkd.in/gvC__zRd
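The sliding-window idea mentioned above is independent of any particular decoder: decode a window of syndrome rounds at a time, commit corrections only for the oldest rounds, then slide forward. A minimal sketch of that control pattern, with a trivial per-check majority vote standing in for a real inner decoder; the function name and parameters are illustrative, not CUDA-Q QEC's actual API:

```python
import numpy as np

def sliding_window_decode(syndromes, window=3, commit=1):
    """Decode `window` rounds of syndromes at a time, committing
    corrections only for the oldest `commit` rounds per step.

    `syndromes` is a (rounds, checks) 0/1 array. The inner decoder
    here is a toy majority vote per check, a stand-in for BP/MWPM/AI
    decoders; only the windowing structure is the point.
    """
    rounds, checks = syndromes.shape
    corrections = np.zeros((rounds, checks), dtype=int)
    start = 0
    while start < rounds:
        block = syndromes[start:min(start + window, rounds)]
        # toy inner decoder: flag a check that fires in most rounds
        decoded = (block.sum(axis=0) * 2 > len(block)).astype(int)
        # commit corrections only for the oldest rounds, then slide
        corrections[start:min(start + commit, rounds)] = decoded
        start += commit
    return corrections
```

The window/commit split is what lets decoding keep pace with an ongoing experiment: later rounds get revisited with more context before their corrections are finalized.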
-
🚨 Exciting #quantumcomputing alert! Now #QEC primitives actually make #quantumcomputers more powerful! A 75-qubit GHZ state on a superconducting #QPU 🚨

In our latest work we address the elephant in the room about #quantumerrorcorrection: in the current era, where qubit counts are the bottleneck in available systems, adopting full-blown QEC can be a step backwards in terms of computational capacity. Even when it delivers net benefits in error reduction, QEC consumes many qubits to do so, and we just don't have enough right now.

So how do we maximize value for end users while still pushing hard on the underlying QEC technology? To answer this, the team at Q-CTRL set out to find new ways to significantly reduce the overhead penalties of QEC while still delivering big benefits!

In this latest demonstration we show that we can adopt parts of QEC -- indirect stabilizer measurements on ancilla qubits -- to deliver large performance gains without the painful overhead of logical encoding. And by combining error detection with deterministic error suppression we can greatly improve the efficiency of the process, requiring only about 10% overhead in ancillae while maintaining a very low discard rate for executions with identified errors.

Using this approach we've set a new record for the largest demonstrated entangled state, at 75 qubits on an IBM quantum computer (validated by MQC), and also demonstrated a totally new way to teleport gates across large distances (where all-to-all connectivity isn't possible). The results outperform all previously published approaches and highlight the fact that our journey in dealing with errors in quantum computers is continuous.

Of course it isn't a panacea, and in the long term, as we tackle even more complex algorithms, we believe logical encoding will become an important part of our toolbox. But that's the point: logical QEC is just one tool, and we have many to work with!

At Q-CTRL we never lose sight of the fact that our objective is to deliver maximum capability to QC end users. This work on deploying QEC primitives is a core part of how we're making quantum technology useful, right now. https://lnkd.in/gkG3W7eE
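The detection-plus-discard strategy described above can be illustrated with a toy model: each shot may suffer an error, ancilla checks flag a detected error with some efficiency, and flagged shots are discarded. All numbers below are illustrative placeholders, not Q-CTRL's measured values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy postselection model (assumed, illustrative numbers):
# 10% of shots suffer an error; ancilla checks catch 90% of them.
n_shots, p_err, p_detect = 100_000, 0.10, 0.90

errored = rng.random(n_shots) < p_err
flagged = errored & (rng.random(n_shots) < p_detect)

raw_error_rate = errored.mean()              # error rate with no detection
kept = ~flagged                              # discard flagged shots
postselected_error_rate = errored[kept].mean()
discard_rate = flagged.mean()                # fraction of shots thrown away
```

With these numbers, postselection cuts the residual error rate by roughly an order of magnitude while discarding only about 9% of shots, which is the overhead-versus-fidelity trade the post is describing.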
-
The Quantum Memory Matrix (QMM) framework has traveled a long path: from black hole unitarity, dark matter, dark energy, and cosmic cycles to now improving how quantum computers handle errors. In work now featured on the cover of Wiley's Advanced Quantum Technologies, we demonstrate how the QMM framework can be directly applied to quantum error correction.

QMM originated in cosmology: a picture where space-time is not smooth but is built from Planck-scale cells, each with a finite memory capacity. We showed how these cells store the quantum imprints of interactions, contributing to resolving paradoxes around black holes, explaining dark matter halos, primordial black hole formation, cosmic acceleration, and even the cycles of the universe.

Now we bring this same idea into hardware: by imprinting and retrieving quantum information from local "memory cells," we can correct errors in noisy quantum processors with higher fidelity than standard repetition codes. This shows that QMM is not only a cosmological theory but also a practical tool for building the quantum computers of tomorrow.

🔗 Read the paper: https://lnkd.in/gfGwe7fe

Previous QMM milestones:
🕳️ Black hole information retention and unitarity restoration
⚡ Extensions to electromagnetism, strong & weak interactions
🌌 Cosmological applications explaining dark matter and dark energy
💻 And now: direct hardware validation for quantum computation

Thank you to my co-authors Eike Marx, Valerii Vinokur, Jeff Titus, and Terra Quantum AG & Leiden University for making this journey possible.

#QuantumComputing #QuantumMemoryMatrix #ErrorCorrection #QuantumPhysics #QuantumTechnology #QuantumInformation #BlackHolePhysics #DarkMatter #DarkEnergy #AdvancedQuantumTechnologies #TerraQuantum #QuantumResearch
-
Google Unveils Willow: A Leap Forward in Quantum Computing

Google Quantum AI has introduced Willow, a cutting-edge quantum chip designed to address two of the field’s most significant challenges: error correction and computational scalability. Willow, fabricated in Google’s Santa Barbara facility, achieves state-of-the-art performance, marking a pivotal step toward realizing a large-scale, commercially viable quantum computer. It gets way geekier from here – but if you’re with me so far…

Exponential Error Reduction

Julian Kelly, Director of Quantum Hardware at Google, emphasized Willow’s ability to exponentially reduce errors as the system scales. Utilizing a grid of superconducting qubits, Willow demonstrated a historic breakthrough in quantum error correction. By expanding arrays from 3×3 to 5×5 and then 7×7 qubits, researchers cut error rates in half with each step. This achievement, referred to as operating “below threshold,” signifies that larger quantum systems can now exhibit fewer errors, a goal pursued since Peter Shor introduced quantum error correction in 1995. The chip also achieved “beyond breakeven” performance, where arrays of qubits outlasted the lifetimes of the individual qubits composing them, which is key to the feasibility of practical quantum computation.

Ten Septillion Years in Five Minutes

Willow’s computational capabilities were validated using the Random Circuit Sampling (RCS) benchmark, a rigorous test of quantum supremacy. According to Google’s estimates, Willow completed a task in under five minutes that would take a modern supercomputer ten septillion years, a timescale exceeding the age of the universe. This underscores the rapid, double-exponential performance improvements of quantum systems over classical alternatives. While the RCS benchmark lacks direct commercial applications, it remains a critical indicator of quantum computational power. Kelly noted that surpassing classical systems on this benchmark solidifies confidence in the broader potential of quantum technology.

Building Toward Practical Applications

Google’s roadmap aims to bridge the gap between theoretical quantum advantage and real-world utility. The team is now focused on achieving “useful, beyond-classical” computations that solve practical problems. Applications in drug discovery, battery design, and AI optimization are among the potential breakthroughs quantum computing could unlock. Willow’s advances in quantum error correction and computational scalability highlight its transformative potential. As Kelly explained, “Quantum algorithms have fundamental scaling laws on their side,” making quantum computing indispensable for tasks beyond the reach of classical systems.

Practical quantum computing is still years away, but this is an exciting milestone. Considering the remarkable rate of technological improvement we’re experiencing right now, practical quantum computing (and quantum AI) may be closer than we think. -s
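The "cut error rates in half with each iteration" result corresponds to exponential suppression of the logical error rate with code distance: each step d -> d+2 divides the rate by a fixed factor Λ. A small numeric sketch; the base rate and Λ = 2 are illustrative assumptions chosen to mirror the halving described above, not Google's published figures:

```python
# Below-threshold scaling sketch: logical error rate falls by a fixed
# factor Lambda for each code-distance step d -> d+2.
base_rate = 3e-3   # assumed logical error rate at distance d = 3
Lambda = 2.0       # suppression factor per step (the "halving" above)

logical_error = {d: base_rate / Lambda ** ((d - 3) // 2) for d in (3, 5, 7, 9)}
# d=3 -> 3.0e-3, d=5 -> 1.5e-3, d=7 -> 7.5e-4, d=9 -> 3.75e-4
```

Being "below threshold" means Λ > 1, so adding qubits makes the encoded qubit better rather than worse; the further below threshold the hardware sits, the larger Λ becomes.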
-
MIT Sets Quantum Computing Record with 99.998% Fidelity

Researchers at MIT have achieved a world-record single-qubit gate fidelity of 99.998% using a superconducting qubit known as fluxonium. This breakthrough represents a significant step toward practical quantum computing by addressing one of the field’s greatest challenges: mitigating the noise and control imperfections that lead to operational errors.

Key Highlights:

1. The Problem: Noise and Errors
• Qubits, the building blocks of quantum computers, are highly sensitive to noise and imperfections in control mechanisms.
• Such disturbances introduce errors that limit the complexity and duration of quantum algorithms. “These errors ultimately cap the performance of quantum systems,” the researchers noted.

2. The Solution: Two New Techniques
To overcome these challenges, the MIT team developed two innovative techniques:
• Commensurate Pulses: timing quantum pulses precisely so that counter-rotating errors become uniform and correctable.
• Circularly Polarized Microwaves: by creating a synthetic version of circularly polarized light, the team improved control of the qubit’s state, further enhancing fidelity.
“Getting rid of these errors was a fun challenge for us,” said David Rower, PhD ’24, one of the study’s lead researchers.

3. Fluxonium Qubits and Their Potential
• Fluxonium qubits are superconducting circuits with unique properties that make them more resistant to environmental noise than traditional qubits.
• By applying the new error-mitigation techniques, the team unlocked fluxonium’s potential to operate at near-perfect fidelity.

4. Implications for Quantum Computing
• Achieving 99.998% fidelity significantly reduces errors in quantum operations, paving the way for more complex and reliable quantum algorithms.
• This milestone represents a major step toward scalable quantum computing systems capable of solving real-world problems.

What’s Next?
The team plans to extend its work to multi-qubit systems and to integrate the error-mitigation techniques into larger quantum architectures. Such advances could accelerate progress toward error-corrected, fault-tolerant quantum computers.

Conclusion: A Leap Toward Practical Quantum Systems
MIT’s achievement underscores the importance of innovation in error mitigation and control for overcoming the fundamental challenges of quantum computing. This breakthrough brings us closer to large-scale quantum systems that could transform fields such as cryptography, materials science, and complex optimization.
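A quick way to read a 99.998% gate fidelity: the error probability per gate is 1 - 0.99998 = 2 × 10⁻⁵, so on average roughly 50,000 gates execute between errors:

```python
fidelity = 0.99998                          # reported single-qubit gate fidelity
error_per_gate = 1.0 - fidelity             # ~2e-5 error probability per gate
mean_gates_per_error = 1.0 / error_per_gate # ~50,000 gates between errors on average
```

This back-of-the-envelope number is why fidelity records matter: the achievable circuit depth before errors dominate scales inversely with the per-gate error rate.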
-
Really happy to see the official publication today of our paper in Nature Machine Intelligence: "Machine Learning for Practical Quantum Error Mitigation"
Haoran Liao, Derek S. Wang, Iskandar Sitdikov, Ciro Salcedo, Alireza Seif, Zlatko Minev

🔍 Context: Quantum computers are progressing toward outperforming classical supercomputers, but quantum errors remain the primary obstacle. Quantum error mitigation offers a solution, but at the high cost of added runtime.

🤔 Key Question: Can classical machine learning help us overcome errors in today's quantum computers by lowering mitigation overheads, in practice, on real hardware, at the 100+ qubit scale?

🔬 Our Findings: Using both simulations and experiments on state-of-the-art quantum computers (up to 100 qubits), we find that machine learning for quantum error mitigation (ML-QEM) can:
- Significantly reduce overheads.
- Maintain or even outperform the accuracy of traditional methods.
- Deliver nearly noise-free results for quantum algorithms.

We tested multiple machine learning models on various quantum circuits and noise profiles. By leveraging ML-QEM, we were able to mimic conventional mitigation results for large quantum circuits, but with much less overhead.

🌟 Conclusion: Our research underscores the potential synergy between classical #ML and #AI and quantum computing. We're excited about the prospects and further research!

🙌 Big thanks to the dream team and the many folks who contributed! Let’s share and discuss the implications of this exciting work! 🌟👇

📄 Paper: Nature Machine Intelligence https://lnkd.in/dGYzC3fq
🔓 Free access: View the paper here https://lnkd.in/dN222X7D
📚 Preprint on arXiv: https://lnkd.in/dGbzjtjA
👩💻 Code repository: Explore on GitHub https://lnkd.in/dcn-xPtm
🎥 Seminar: Watch #IBM @Qiskit on YouTube https://lnkd.in/dEPRcMVK https://lnkd.in/e7JFgc3J
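The core ML-QEM idea, learning a map from noisy expectation values (plus circuit features) to their noise-free values, can be sketched with synthetic data and a plain least-squares model. Everything here is illustrative: the exponential-attenuation noise model, the hand-picked features, and the linear regressor are stand-ins for the paper's circuits and its random-forest/neural-network models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: ideal expectation values attenuated by a
# depth-dependent factor plus shot noise, mimicking how depolarizing
# noise shrinks signals toward zero (assumed toy noise model).
n = 2000
ideal = rng.uniform(-1.0, 1.0, n)
depth = rng.integers(5, 50, n)
noisy = ideal * np.exp(-0.02 * depth) + rng.normal(0.0, 0.01, n)

# Features: the noisy value, its interaction with depth, and a bias.
# A linear least-squares model stands in for the paper's ML models.
X = np.column_stack([noisy, noisy * depth, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, ideal, rcond=None)
mitigated = X @ coef

raw_err = np.abs(noisy - ideal).mean()       # error if left unmitigated
ml_err = np.abs(mitigated - ideal).mean()    # error after the learned map
```

Even this crude regressor recovers most of the attenuated signal; the paper's point is that richer models do this for real hardware noise at a fraction of the runtime cost of methods like zero-noise extrapolation.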
-
Long before #quantum becomes an #AI accelerator, classical machine learning is already useful in quantum stacks. Many quantum engineering bottlenecks look like typical ML problems, from control to AI-based transpilation and error mitigation. This is an angle I have followed for a while; I even used it in our QC summer school curriculum design as an easy entry point for computer scientists. This week I came across a few new materials and news items on the topic, and my notes turned into a short digest that might be useful for some of my readers.

ML for noise mitigation. Ross Duncan and the team just dropped a comprehensive review and ablation study on applying machine learning to readout error mitigation: https://lnkd.in/d8c4CgeF The intuition is pretty straightforward: it is basically the same signal-denoising problem we solve in deep learning for image enhancement and audio processing. In practice, however, things get much messier when you scale to larger qubit counts. An earlier scaling attempt by IBM was relatively successful: https://lnkd.in/d-Gp5tYp In the new paper the focus is on a few qubits, but the contribution is a more systematic ablation study across circuit families and model architectures, needed to build better intuition for scaling.

ML for quantum error correction. Another active direction is the application of ML to syndrome decoding. I was checking some recent literature after a new startup emerged in this domain (https://lnkd.in/dqXURTuz). In QEC, a classical real-time control/decoding pipeline must keep up with cycle times, and this is where "AI decoders" are being pitched as a potentially low-latency, high-throughput solution. The direction is not new and is already quite crowded with attempts to address the problem, including DeepMind's recent AlphaQubit 2 model (https://lnkd.in/dKrvqcad) and earlier ML decoder work for IBM's heavy-hexagon family of codes (https://lnkd.in/dumWZRb9).

But building an efficient model is only one step. Ultimately, for practical systems, this narrows down to running a reduced model with near-real-time inference on specialized hardware, without sacrificing too much decoding accuracy. Classical ML in QC often looks elegant, but it is not a silver bullet. In some regimes it may simply miss genuinely quantum correlations and fail badly. In QEC, inference latency can be a killer in practice, similar to what happens in various edge applications in classical AI. And across both mitigation and decoding, out-of-distribution scenarios are typically a direct path to model failure.

How useful have you found ML in quantum computing? Any other good practical scenarios?
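For context on the readout-error-mitigation baseline that the ML approaches above are compared against: the classical recipe calibrates a confusion matrix A, where the measured distribution equals A times the true distribution, and then inverts it. A minimal single-qubit sketch with assumed flip probabilities:

```python
import numpy as np

# Confusion-matrix readout mitigation: A[i, j] = P(measure i | true j).
# The flip probabilities below are assumed for illustration.
p01 = 0.02   # P(read 1 | true 0)
p10 = 0.05   # P(read 0 | true 1)
A = np.array([[1 - p01, p10],
              [p01, 1 - p10]])

p_true = np.array([0.7, 0.3])        # hypothetical ideal outcome distribution
p_meas = A @ p_true                  # what the noisy readout reports
p_recovered = np.linalg.solve(A, p_meas)  # inversion undoes the readout noise
```

The catch, and the opening for ML, is that A grows as 2^n x 2^n with qubit count and readout errors become correlated, so direct calibration and inversion stop scaling long before 100 qubits.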