{"id":191619,"date":"2025-06-17T12:07:21","date_gmt":"2025-06-17T12:07:21","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/191619\/"},"modified":"2025-06-17T12:07:21","modified_gmt":"2025-06-17T12:07:21","slug":"psiquantum-study-maps-path-to-loss-tolerant-photonic-quantum-computing","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/191619\/","title":{"rendered":"PsiQuantum Study Maps Path to Loss-Tolerant Photonic Quantum Computing"},"content":{"rendered":"<p><strong>Insider Brief<\/strong><\/p>\n<ul class=\"wp-block-list\">\n<li>PsiQuantum\u2019s study evaluates photonic fusion-based quantum computing designs, identifying adaptive, encoded schemes that significantly improve tolerance to photon loss.<\/li>\n<li>The analysis finds that advanced methods like exposure-based adaptivity can raise loss thresholds to 18.8%, while balancing resource cost with performance.<\/li>\n<li>Using detailed modeling, the study maps tradeoffs between resource state size, preparation overhead and error tolerance, guiding scalable optical quantum architecture development.<\/li>\n<\/ul>\n<p>A new study from a team of PsiQuantum researchers lays out a blueprint for building loss-tolerant quantum computers using photons, showing that carefully engineered resource states and adaptive measurements could push photonic systems into the realm of fault-tolerant computing.<\/p>\n<p>The research, posted to <a href=\"https:\/\/arxiv.org\/abs\/2506.11975\" target=\"_blank\" rel=\"noopener\">arXiv<\/a> recently, compares a wide range of design schemes for a quantum computing architecture known as fusion-based quantum computing (FBQC). The analysis focuses on one of the biggest hurdles for photonic qubits: photon loss. 
Using simulations and theoretical comparisons, PsiQuantum researchers evaluate how different strategies fare under realistic conditions and which designs offer the best tradeoff between error tolerance and hardware cost.<\/p>\n<p>Fusion-based computing relies on entangling operations \u2014 called fusions \u2014 between small, pre-prepared resource states. These resources are stitched together to form larger structures capable of running algorithms. But in photonic systems, each qubit is represented by a single photon, making the system vulnerable. Simply put, if you lose the photon, the quantum information disappears.<\/p>\n<p>PsiQuantum\u2019s paper evaluates nearly a dozen different resource configurations and encoding strategies designed to counteract this. It shows that by combining specific error-correcting codes with measurement adaptivity \u2014 where the system adjusts future operations based on past measurement outcomes \u2014 photonic quantum systems can tolerate loss rates that would otherwise be catastrophic.<\/p>\n<p>Loss Tolerance and Resource Cost<\/p>\n<p>At the center of the study is a metric called the Loss Per Photon Threshold (LPPT), which measures how much photon loss a system can endure before errors accumulate beyond control. In the most basic designs, loss tolerance is extremely limited. 
For example, traditional \u201cboosted\u201d fusion networks without any encoding or adaptivity manage an LPPT below 1%.<\/p>\n<p>PsiQuantum\u2019s team shows that introducing encoding \u2014 essentially spreading quantum information across multiple photons in a structured way \u2014 significantly boosts resilience. Using a resource state called the 6-ring network with a {2,2} Shor code, the researchers reach an LPPT of 2.7%. Incorporating adaptivity, where measurements are adjusted on the fly based on outcomes, raises the threshold further. A four-qubit code with adaptivity pushes LPPT to 5.7%.<\/p>\n<p>In more advanced designs, particularly those using \u201cexposure-based adaptivity,\u201d the LPPT reaches as high as 17.4% with a 168-qubit resource state. A newer geometry called the \u201cloopy diamond\u201d network \u2014 using 224 qubits and a {7,4} encoding \u2014 delivers even higher loss tolerance, hitting 18.8%.<\/p>\n<p>However, the study emphasizes that resilience comes at a cost: higher thresholds generally require larger and more complex resource states. These are expensive to prepare, especially when constructed from basic three-photon building blocks known as 3GHZ states. For instance, a 24-qubit 6-ring state requires more than 1,500 3GHZ states to assemble, while a 224-qubit loopy diamond network demands over 52,000.<\/p>\n<p>This essentially means that while setting up and running a photonic quantum calculation is theoretically possible, it remains impractical with current technology due to the extreme resource requirements.<\/p>\n<p>Tradeoffs Between Size and Performance<\/p>\n<p>Rather than chase higher thresholds, the paper focuses on mapping out the tradeoff space \u2014 how much performance gain each additional photon delivers and when the cost becomes prohibitive. 
For example, PsiQuantum\u2019s modeling suggests that a 32-qubit loopy diamond resource state \u2014 a cluster of photons arranged for reliability, even when some photons are lost \u2014 offers better loss tolerance than a 24-qubit 6-ring while also being cheaper to build.<\/p>\n<p>To further illustrate these tradeoffs, the team plots LPPT against resource size for dozens of schemes. While the theoretical maximum LPPT for adaptive systems approaches 50%, achieving this would require impractically large resource states. The best-performing small-to-medium scale systems top out at about 15%\u201319% LPPT, depending on geometry and adaptivity.<\/p>\n<p>These results help identify \u201csweet spots\u201d \u2014 designs that balance loss tolerance and hardware complexity. The authors suggest that, for near-term implementations, focusing on small resource states with smart adaptivity yields the best return.<\/p>\n<p>Adaptive Fusion and Geometry Selection<\/p>\n<p>The PsiQuantum team classifies adaptivity into two main types: local and global. Local adaptivity involves adjusting fusions within a small cluster of photons, while global adaptivity modifies the entire fusion network based on aggregate outcomes. The most effective technique analyzed \u2014 exposure-based adaptivity \u2014 selectively chooses which measurements to perform and in which order, prioritizing the parts of the system most vulnerable to error buildup.<\/p>\n<p>On top of encoding and adaptivity, geometry plays a critical role. The team compares 4-star, 6-ring, and 8-loopy-diamond network topologies. Each configuration dictates how photons are entangled and measured, with some layouts offering better loss tolerance or simpler resource construction.<\/p>\n<p>The study also introduces cost models for evaluating how many elementary operations are required to build each resource state. 
Using optimistic assumptions, such as perfect fusion success and no photon loss during assembly, they estimate preparation overhead in terms of the number of 3GHZ states needed. Even under these ideal conditions, resource costs rise steeply with encoding size.<\/p>\n<p>Implications for Fault-Tolerant Quantum Computing<\/p>\n<p>While fault-tolerant photonic quantum computing remains a long-term goal, the researchers use this investigation to chart a concrete map for getting there. It shows that with clever use of error-correcting codes, adaptive measurements and optimized network geometries, photon loss can be tamed to workable levels.<\/p>\n<p>The results are especially important for companies like PsiQuantum that are betting on photons over other qubit types, such as trapped ions or superconducting circuits. Photons offer advantages like room-temperature operation and easy transmission over fiber, but they suffer from unique challenges \u2014 chief among them, fragility.<\/p>\n<p>By framing the problem in terms of LPPT and resource cost, the PsiQuantum team provides a way to benchmark progress. New schemes can be compared on equal footing, and system architects can prioritize configurations that strike the right balance.<\/p>\n<p>Limitations and Future Work<\/p>\n<p>The study acknowledges several limitations. First, its cost metrics are based on simplified assumptions \u2014 such as perfect switching and no losses in the assembly stage \u2014 that may not hold in practice. Second, the study focuses on theoretical error thresholds rather than end-to-end system performance, which would also have to account for decoherence, gate errors and environmental noise.<\/p>\n<p>It\u2019s likely that, as resource states grow, the complexity of managing measurement adaptivity also increases. 
Implementing dynamic fusion strategies in real time will require advances in classical control systems, fast switching networks and low-latency feedback loops, along with other technological innovations.<\/p>\n<p>Future work could involve refining cost models with real-world data from photonic devices, testing these adaptive strategies experimentally, and integrating them into full-stack architectures. The study also hints at further gains from leveraging \u201cscrap\u201d information \u2014 residual quantum states that survive partial photon loss \u2014 a technique that could push non-adaptive systems beyond current limits.<\/p>\n<p>The paper on\u00a0<a href=\"https:\/\/arxiv.org\/abs\/2506.11975\" target=\"_blank\" rel=\"noopener\">arXiv<\/a>\u00a0goes into far greater technical depth than this summary story, so readers seeking exact technical detail should review the study itself. ArXiv is a pre-print server, meaning the work has not yet been officially peer-reviewed, a key step of the scientific method.<\/p>\n<p>The PsiQuantum team of researchers included: Sara Bartolucci, Tom Bell, Hector Bombin, Patrick Birchall, Jacob Bulmer, Christopher Dawson, Terry Farrelly, Samuel Gartenstein, Mercedes Gimeno-Segovia, Daniel Litinski, Yehua Liu, Robert Knegjens, Naomi Nickerson, Andrea Olivo, Mihir Pant, Ashlesha Patil, Sam Roberts, Terry Rudolph, Chris Sparrow, David Tuckett and Andrzej Veitia.<\/p>\n","protected":false},"excerpt":{"rendered":"Insider Brief PsiQuantum\u2019s study evaluates photonic fusion-based quantum computing designs, identifying adaptive, encoded schemes that significantly improve 
tolerance&hellip;\n","protected":false},"author":2,"featured_media":191620,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3164],"tags":[3284,78428,78429,78430,78431,53,16,15],"class_list":{"0":"post-191619","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-computing","9":"tag-fusion-based-quantum-computing","10":"tag-loss-per-photon-threshold","11":"tag-photonic-quantum-computing","12":"tag-psiquantum","13":"tag-technology","14":"tag-uk","15":"tag-united-kingdom"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114698626795078684","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/191619","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=191619"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/191619\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/191620"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=191619"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=191619"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=191619"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}