PsiQuantum Achieves Breakthrough in Mass-Producing Light-Powered Quantum Chips

American quantum computing startup PsiQuantum has announced a major breakthrough in manufacturing scalable photonic quantum chips, marking a significant step toward making practical quantum computing a reality. The company, which emerged from stealth mode in 2021, has pursued a light-powered (photonic) approach to quantum computing that was previously considered impractical due to hardware limitations.

Why Photonic Quantum Computing?
- Photonic quantum computers encode data in individual particles of light (photons) rather than in superconducting circuits like many other quantum systems (a toy sketch of this encoding appears after this piece).
- This approach has key advantages:
  - Low noise compared to superconducting qubits.
  - High-speed operation due to the natural speed of light.
  - Seamless integration with fiber-optic networks, which could make a quantum internet feasible.
- However, the challenge has always been scaling up: photons are difficult to control, detect, and stabilize in large-scale computations.

PsiQuantum's Breakthrough
- In a paper published in Nature, the company unveiled a manufacturing process that enables large-scale production of photonic quantum chips.
- The new hardware design solves key engineering problems, making it possible to reliably manipulate and measure photons at scale.
- Unlike previous photonic quantum systems, which struggled with extreme hardware demands, PsiQuantum's solution reduces errors and improves stability in complex computations.

Implications for the Future of Quantum Computing
- Scalability achieved: this breakthrough could allow mass production of quantum chips, removing a key bottleneck in commercial quantum computing development.
- Quantum networking potential: with natural fiber-optic compatibility, photonic quantum computers could enable highly secure quantum communications networks.
- New industrial applications: the technology may soon be applied to optimization problems, cryptography, and materials science, transforming industries that depend on complex simulations.

The Bigger Picture
PsiQuantum's ability to mass-produce photonic quantum chips puts light-powered quantum computing in direct competition with other approaches, such as superconducting and trapped-ion systems. If successful, it could make quantum computing more accessible, scalable, and commercially viable: a leap forward in the race toward practical quantum advantage.
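To make the bullet about encoding data in photons concrete, here is a toy numpy sketch of dual-rail encoding, the standard textbook scheme for photonic qubits: one photon shared between two waveguide modes represents a qubit, and a 50:50 beamsplitter rotates it into superposition. This is an illustration of the general idea only; PsiQuantum's actual chip architecture is far more involved and is not modeled here.

```python
# Toy sketch of dual-rail photonic encoding (illustrative physics only,
# not PsiQuantum's actual design).
import numpy as np

# Dual-rail encoding: |0> = photon in the top waveguide, |1> = photon in
# the bottom waveguide of a pair of on-chip optical modes.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Standard 50:50 beamsplitter unitary restricted to the one-photon subspace.
beamsplitter = np.array([[1, 1j],
                         [1j, 1]], dtype=complex) / np.sqrt(2)

out = beamsplitter @ ket0
print(np.abs(out) ** 2)  # ~[0.5, 0.5]: superposition across both waveguides
```

The low-noise advantage cited above comes from the fact that such photonic states interact only weakly with their environment, though that same weakness is what makes photons hard to control and detect at scale.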
Quantum Computing Scalability in Real-World Applications
Explore top LinkedIn content from expert professionals.
Summary
Quantum computing scalability in real-world applications refers to the ability to build and deploy quantum computers that can tackle practical problems at larger scales, moving beyond laboratory prototypes and into industries like finance, medicine, and logistics. Recent breakthroughs show that quantum hardware, error correction, and integration with existing computing systems are making quantum technology increasingly usable and energy-efficient.
- Prioritize integration: Consider combining quantum computers with classical and AI systems to create powerful hybrid workflows capable of solving diverse complex challenges.
- Adopt scalable designs: Focus on quantum chip technologies and architectures that reduce errors and work with existing manufacturing processes to help quantum systems grow without requiring vast resources.
- Explore collaborative models: Look into collective infrastructure approaches, such as shared quantum systems across countries, to boost accessibility and resilience for broader real-world impact.
Google and IBM believe the first workable quantum computer is in sight, while Europe offers a more collaborative vision.

Yesterday, both Google and IBM signalled that quantum computing is entering its engineering phase. Google's Willow chip, introduced in December 2024, demonstrated scalable error correction: as more qubits were added, error rates dropped exponentially. It completed a benchmark task in under five minutes, one that would take today's fastest supercomputer an unimaginable 10²⁵ years (10 septillion years). IBM revealed a detailed blueprint for industrial-scale quantum computing, outlining a path to a fault-tolerant quantum supercomputer by late 2029.

Meanwhile, real-world applications are already emerging. IBM and Moderna have collaborated to simulate the longest mRNA sequence (60 nucleotides) ever modelled on a quantum computer, using 80 of the 156 qubits on IBM's Heron chip. They applied a CVaR-based variational quantum algorithm (VQA), which makes earlier attempts at 42 nucleotides seem modest; a sketch of the CVaR objective follows below.

Now contrast that with Europe's collaborative approach. Instead of centralised lab efforts, Europe is deploying nine quantum systems across at least seven countries, spanning superconducting, ion-trap, and annealing technologies, all integrated with national supercomputing centres for shared access and resilience. I recently visited the Poznań Supercomputing Centre in Poland to see one of these systems in action. Europe's model is about collective strength, diversity, and building long-term quantum infrastructure, demonstrating that the race isn't just about breakthroughs but about how you organise for scale and inclusivity.
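The CVaR trick referenced above replaces the usual mean-energy objective with the mean over only the best fraction of measurement outcomes, which empirically speeds up variational optimization. Below is a minimal Python sketch of that objective; the sampled energies are random stand-ins, and the IBM/Moderna mRNA encoding itself is not reproduced here.

```python
# Minimal sketch of the CVaR objective behind CVaR-based variational
# quantum algorithms: optimize the mean of the best alpha-fraction of
# measurement outcomes instead of the mean over all shots. The sampled
# energies below are random stand-ins, not the IBM/Moderna workload.
import numpy as np

def cvar_objective(sampled_energies: np.ndarray, alpha: float = 0.1) -> float:
    """Mean energy of the lowest alpha-fraction of sampled outcomes."""
    n_keep = max(1, int(np.ceil(alpha * len(sampled_energies))))
    return float(np.sort(sampled_energies)[:n_keep].mean())

# Stand-in for per-shot energies measured from a variational circuit.
rng = np.random.default_rng(0)
shot_energies = rng.normal(loc=-1.0, scale=0.5, size=1024)
print(cvar_objective(shot_energies, alpha=0.1))
```

Keeping only the low-energy tail (alpha = 0.1 here) makes the optimizer chase the best bitstrings it has already seen rather than the average over all shots, which is what makes CVaR objectives effective on combinatorial problems.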
For quantum computing to reach its full potential, it will need to become part of a broader computing fabric, working alongside classical HPC and AI systems to tackle problems that no single paradigm can address alone. This has been the idea behind quantum-centric supercomputing (QCSC): integrating quantum processors with classical compute and orchestration layers so that hybrid algorithms run as coherent, end-to-end workflows rather than fragmented experiments.

Today we're sharing a concrete step in that direction: our Quantum-Centric Supercomputer Reference Architecture, which describes how quantum processors can integrate with classical HPC and AI infrastructure across the full stack, from applications and orchestration layers to how these systems may ultimately be deployed in data centers.

Today's hybrid workflows are still largely stitched together manually by experts. Our goal with this architecture is to outline the system components, software layers, and interconnects needed to make quantum-classical workflows natural and scalable as hardware and applications mature (a minimal example of such a hybrid loop is sketched below). Importantly, the framework is evolutionary: early systems may operate with loosely coupled resources, but over time we expect progressively tighter integration between quantum processors, CPUs, and GPUs, enabling deeper co-design across hardware, software, and applications.
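To make the "hybrid workflow" idea concrete, here is a minimal quantum-classical loop in Python: a classical optimizer repeatedly calls a quantum (here simulator-backed) primitive to estimate an energy. This is a generic VQE-style sketch assuming Qiskit 1.x, not IBM's reference architecture, whose orchestration spans far more layers.

```python
# A minimal quantum-classical hybrid loop (VQE-style), assuming Qiskit 1.x
# with its simulator-backed primitives. The two-qubit Hamiltonian and the
# ansatz are illustrative placeholders, not an IBM reference workload.
import numpy as np
from scipy.optimize import minimize
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.primitives import StatevectorEstimator
from qiskit.quantum_info import SparsePauliOp

# Toy observable whose ground-state energy we want to estimate.
hamiltonian = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5), ("IX", 0.5)])

# Hardware-efficient ansatz with two variational parameters.
t0, t1 = Parameter("t0"), Parameter("t1")
ansatz = QuantumCircuit(2)
ansatz.ry(t0, 0)
ansatz.ry(t1, 1)
ansatz.cx(0, 1)

estimator = StatevectorEstimator()

def energy(params: np.ndarray) -> float:
    # Quantum step: evaluate <H> for the candidate parameters.
    job = estimator.run([(ansatz, hamiltonian, params)])
    return float(job.result()[0].data.evs)

# Classical step: an off-the-shelf optimizer drives the quantum evaluations.
result = minimize(energy, x0=np.zeros(2), method="COBYLA")
print("estimated ground-state energy:", result.fun)
```

In a production QCSC setting, the `energy` call would be dispatched to real QPU hardware by an orchestration layer, while the optimizer and any pre- and post-processing run on classical HPC nodes.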
A quiet but profound milestone in quantum computing: researchers have demonstrated silicon spin qubits with fidelity exceeding 99.9%, fabricated using standard semiconductor processes. That's not just a technical achievement; it's a signal that quantum chips may soon be manufacturable at scale, using the same industrial infrastructure that powers classical computing. The implications for cost, reliability, and integration are enormous, especially as quantum systems inch closer to practical deployment.

What's especially interesting is how this breakthrough aligns with DARPA's Quantum Benchmarking Initiative, which defines "utility scale" as the point where quantum processors deliver more commercial value than they cost to operate. Crossing that threshold would mark the beginning of a new era, not just for physics labs but for industry, logistics, finance, and beyond.

If you're tracking the convergence of quantum theory and manufacturing reality, this is one to watch. #QuantumComputing #Semiconductors #SpinQubits #TechInnovation #DARPA #UtilityScale #DeepTech
Quantum-classical computing is transforming how we approach complex challenges by combining cloud-based quantum acceleration with traditional systems to improve speed, precision, and efficiency while enabling scalable innovation. This convergence is a key step toward making quantum computing a practical tool rather than a distant goal.

Hybrid architectures allow organizations to experiment safely through APIs and SDKs that connect classical systems with Quantum Processing Units (QPUs) in the cloud (see the sketch below). This setup reduces the barrier to entry and opens new paths for optimization, simulation, and predictive modeling in sectors such as finance, logistics, and materials science.

Early experimentation is essential. Small proof-of-concept projects can reveal measurable gains in performance and cost-efficiency while helping teams develop internal expertise and new partnerships. Each iteration builds readiness for a future in which quantum computing will become an integral part of enterprise infrastructure. #QuantumComputing #DigitalTransformation
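As one concrete illustration of the API/SDK pattern the post describes, the sketch below submits a small circuit to a cloud QPU using Qiskit and IBM's runtime service. The backend selection, credential setup, and exact primitive signatures are assumptions tied to recent qiskit-ibm-runtime versions; other providers' SDKs (e.g., Amazon Braket, Azure Quantum) follow the same submit-and-poll pattern.

```python
# A minimal sketch of submitting a circuit to a cloud QPU through an SDK,
# using Qiskit with IBM's runtime service as one concrete example. Account
# setup and API versions are assumptions (recent qiskit-ibm-runtime).
from qiskit import QuantumCircuit
from qiskit.transpiler.preset_passmanagers import generate_preset_pass_manager
from qiskit_ibm_runtime import QiskitRuntimeService, SamplerV2

# A Bell-state circuit as a stand-in for a real optimization workload.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()

service = QiskitRuntimeService()           # assumes credentials saved locally
backend = service.least_busy(operational=True, simulator=False)

# Compile the abstract circuit to the device's native gate set and layout.
pm = generate_preset_pass_manager(backend=backend, optimization_level=1)
isa_circuit = pm.run(qc)

sampler = SamplerV2(mode=backend)
job = sampler.run([isa_circuit], shots=1024)
counts = job.result()[0].data.meas.get_counts()
print(counts)                              # e.g. {'00': ~512, '11': ~512}
```

A proof of concept like this, wrapped inside a classical pipeline, is exactly the kind of small experiment the post recommends before committing to a larger hybrid deployment.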
A new cryogenic amplifier from Qubic reportedly cuts quantum computer heat emissions by 10,000 times, offering a breakthrough in cooling and efficiency for next-generation machines.

Heat is a major challenge in quantum computing, as excess energy disrupts qubits and causes errors, so reducing emissions is essential for scaling up powerful quantum systems. The device operates at extremely low temperatures, maintaining qubits in stable states while drastically minimizing unwanted thermal noise and allowing longer computations with higher accuracy. It could launch as early as 2026, potentially changing how quantum computers are built, cooled, and deployed, and making them more practical for real-world applications. Controlling heat at this scale is a reminder that engineering solutions, combined with quantum science, are key to unlocking the full potential of quantum computing. Thank you, Quantum Cookie.

The device is a cryogenic traveling-wave parametric amplifier (TWPA) made with specialized "quantum materials." Traditional amplifiers used for reading out qubit signals in superconducting quantum computers generate noticeable heat (even if small in absolute terms), which adds thermal noise, raises the cooling burden on dilution refrigerators, and limits how many qubits can be packed into one cryostat. Qubic's version reportedly cuts thermal output by a factor of 10,000, bringing it down to practically zero (on the order of 1–10 microwatts), while also reducing overall power consumption by about 50%.

Why this matters for quantum computing
- Heat is a core scaling bottleneck: qubits (especially superconducting ones) must operate at millikelvin temperatures (~10–50 mK). Even tiny amounts of heat from readout electronics or control lines can cause decoherence, increase error rates, and require more powerful (and expensive) cryogenic systems.
- The amplifier's role: it boosts the faint microwave signals coming from qubits without adding much noise. Conventional semiconductor-based amplifiers at cryogenic stages dissipate more heat; this new TWPA minimizes that, potentially allowing twice as many qubits per dilution refrigerator by easing the thermal load and simplifying cabling.
- Potential impact: lower cooling demands could cut operational costs and energy use significantly, making larger, more practical quantum systems feasible for real-world applications rather than just lab prototypes (a back-of-envelope thermal-budget sketch follows below).

Timeline and status
The company has received grant funding and is aiming for commercialization in 2026. As of early 2026 reports, development is ongoing, with targets such as 20 dB gain over a 4–12 GHz bandwidth. No major contradictions or retractions have appeared in credible coverage.
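To see why per-amplifier dissipation matters so much, here is a back-of-envelope Python sketch. Every number in it is an illustrative assumption (typical orders of magnitude for dilution refrigerators), not a published Qubic or fridge-vendor specification; only the 10,000x reduction and the 20 dB gain target come from the post above.

```python
# Back-of-envelope sketch of why amplifier heat limits qubit count.
# All numbers are illustrative assumptions (typical orders of magnitude),
# not published Qubic or refrigerator-vendor specifications.
import math

cooling_budget_w = 400e-6                      # assumed budget at the cold stage
conventional_amp_w = 100e-6                    # assumed per-chain dissipation today
low_heat_amp_w = conventional_amp_w / 10_000   # the reported 10,000x reduction

for name, per_chain in [("conventional amp", conventional_amp_w),
                        ("low-heat TWPA", low_heat_amp_w)]:
    chains = int(cooling_budget_w // per_chain)
    print(f"{name}: ~{chains} readout chains fit the thermal budget")

# The reported 20 dB gain target is a 100x boost in signal power.
gain_linear = 10 ** (20 / 10)
print(f"20 dB gain = {gain_linear:.0f}x signal power")

# Standard quantum limit on added noise for a phase-preserving amplifier,
# evaluated at the 8 GHz midpoint of the 4-12 GHz target band.
hbar, k_b, freq = 1.0546e-34, 1.3807e-23, 8e9
t_quantum = hbar * 2 * math.pi * freq / (2 * k_b)
print(f"quantum-limited added-noise temperature ~ {t_quantum:.2f} K")
```

In practice, cabling and qubit-control heat loads still dominate once amplifier dissipation approaches zero, which is why the post's more conservative claim is roughly twice as many qubits per refrigerator rather than thousands.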