{"id":12800,"date":"2025-04-12T05:18:15","date_gmt":"2025-04-12T05:18:15","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/12800\/"},"modified":"2025-04-12T05:18:15","modified_gmt":"2025-04-12T05:18:15","slug":"drama-over-quantum-computings-future-heats-up","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/12800\/","title":{"rendered":"Drama over quantum computing\u2019s future heats up"},"content":{"rendered":"<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">On March 18th, Chetan Nayak, a physicist leading Microsoft\u2019s quantum team, presented new data on <a href=\"https:\/\/news.microsoft.com\/azure-quantum\/\" target=\"_blank\" rel=\"noopener\">the company\u2019s quantum computing chip<\/a> at the American Physical Society\u2019s Global Physics Summit in Anaheim, California. The presentation was meant to calm a raging debate among physicists, but researchers <a href=\"https:\/\/www.nature.com\/articles\/d41586-025-00829-2\" target=\"_blank\" rel=\"noopener\">remain skeptical<\/a> of the results. \u201cI never felt like there would be one moment when everyone is fully convinced,\u201d Nayak told <a href=\"https:\/\/www.nature.com\/articles\/d41586-025-00829-2\" target=\"_blank\" rel=\"noopener\">Nature<\/a> in a March 18th article. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">The <a href=\"https:\/\/physicsworld.com\/a\/experts-weigh-in-on-microsofts-topological-qubit-claim\/\" target=\"_blank\" rel=\"noopener\">controversy<\/a> centers on Microsoft\u2019s February claim that it had built a new type of quantum hardware \u2014 a topological qubit, made from a pattern of electrons on a tiny wire. The company claims the qubit is less prone to errors, which would make quantum computers easier to scale up to something big enough to actually be useful. 
But in the journal article accompanying the release, the editors wrote that Microsoft had not conclusively shown that <a href=\"https:\/\/static-content.springer.com\/esm\/art%3A10.1038%2Fs41586-024-08445-2\/MediaObjects\/41586_2024_8445_MOESM2_ESM.pdf\" target=\"_blank\" rel=\"noopener\">the electrons form the signature pattern<\/a>, known as <a href=\"https:\/\/physicsworld.com\/a\/majorana-modes-continue-to-elude\/\" target=\"_blank\" rel=\"noopener\">Majorana zero modes<\/a>. Nature had <a href=\"https:\/\/www.wired.com\/story\/microsoft-retracts-disputed-quantum-computing-paper\/\" target=\"_blank\" rel=\"noopener\">retracted<\/a> <a href=\"https:\/\/www.nature.com\/articles\/d41586-021-00612-z\" target=\"_blank\" rel=\"noopener\">a similar paper<\/a> by <a href=\"https:\/\/www.nature.com\/articles\/s41586-021-03373-x\" target=\"_blank\" rel=\"noopener\">a Microsoft-affiliated team<\/a> in 2021.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup qnnwq2 _1xwtict9\">When quantum computers become useful, ordinary consumers shouldn\u2019t expect them as personal devices.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">\u201cDiscourse and skepticism are all part of the scientific process,\u201d Microsoft spokesperson Craig Cincotta tells The Verge. He points to improvements made since that accompanying article; Microsoft says the team has since controlled and measured a specific aspect of the qubit.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">The newest data Microsoft presented on Tuesday is \u201cjust noise,\u201d says physicist Sergey Frolov of the University of Pittsburgh. (On Tuesday, Nayak acknowledged that the signal was hard to see because of electrical noise.) 
<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">In a statement, Nayak tells The Verge that Microsoft is confident in its device. \u201cIt is clear that the interest and excitement level are very high,\u201d he says. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">On top of the controversy, the industry suffers from hype. Champions of quantum computing say it will revolutionize materials science, encryption, and finance. Theoretical research indicates that quantum computers could one day beat regular computers in certain time-consuming tasks and open new realms of computing. But the timeline is uncertain. In January, Nvidia\u2019s Jensen Huang <a href=\"https:\/\/www.cnbc.com\/2025\/03\/20\/nvidia-ceo-huang-says-was-wrong-about-timeline-for-quantum-computing.html\" target=\"_blank\" rel=\"noopener\">expressed doubt<\/a> that commercial quantum computing would arrive within the next 15 years, sending quantum computing stocks tumbling. He tried to walk those comments back on March 20th, when he hosted \u201cQuantum Day\u201d at Nvidia\u2019s GTC conference, but <a href=\"https:\/\/fortune.com\/2025\/03\/21\/nvidia-jensen-huang-quantum-computing-stocks-gtc-rigetti-dwave-ionq\/\" target=\"_blank\" rel=\"noopener\">quantum-related stocks fell again<\/a>.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Nevertheless, quantum computing researchers have been hard at work. Over recent months, Google, Amazon, and several startups have announced a series of incremental improvements. We\u2019re left to wonder how much longer consumers will have to wait for quantum computing\u2019s killer applications. Are quantum computers coming to your cloud or phone in the future? 
What and who are they for?<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup qnnwq2 _1xwtict9\">\u201cDiscourse and skepticism are all part of the scientific process.\u201d<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Quantum computers won\u2019t be able to tackle anything useful for at least another decade, says physicist Andrea Morello of the University of New South Wales in Australia. And that\u2019s if investors don\u2019t lose patience and jump ship. The technology remains a full-stack problem, from engineering the materials to make the qubits, to connecting the qubits together, to manufacturing the chips at scale \u2014 not to mention the software. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Investors are sticking around because the payoff could be huge. Quantum computers offer a completely new paradigm for computing. Unlike a conventional computer, which encodes information as binary ones and zeros, a quantum computer represents information as a probability of one and zero, known as a superposition. Superposition is a concept from quantum mechanics: an electron, for example, can exist as a superposition, or probability, of multiple locations. You can also think of superposition like a coin flipping in the air. Before it lands, the coin is neither heads nor tails, but in a superposition of both. Similarly, a qubit can represent information as some probability of both one and zero. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Researchers make physical qubits from different materials \u2014 for Google, Amazon, and IBM, each qubit is a small superconducting circuit; notable startups are using ions, atoms, and photons as qubits. 
At this point, it\u2019s not clear which material is best.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">All qubits obey the mathematics of quantum mechanics. So do molecules. That\u2019s why experts predict that an early useful application of quantum computers could be performing <a href=\"https:\/\/www.nature.com\/articles\/s41586-021-04351-z\" target=\"_blank\" rel=\"noopener\">accurate and fast chemistry simulations<\/a>, for discovering new materials for better batteries, more climate-friendly fertilizers, and new medical drugs. Currently, scientists simulate these reactions on supercomputers, and the simulations are inexact and slow. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">A quantum speedup could upend other industries as well. Banks are investigating quantum optimization algorithms for <a href=\"https:\/\/arxiv.org\/pdf\/2403.14436\" target=\"_blank\" rel=\"noopener\">improving financial forecasts<\/a>. Quantum algorithms could make AI algorithms more energy-efficient. They should also be able to break existing encryption methods; that prediction has spurred research into <a href=\"https:\/\/www.theverge.com\/22523067\/nist-challenge-quantum-safe-cryptography-computer-lattice\" target=\"_blank\" rel=\"noopener\">more robust forms of cryptography<\/a>. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">But first, researchers need to make quantum computers larger while reducing their overall error rates.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">And when quantum computers become useful, ordinary consumers shouldn\u2019t expect them as personal devices. 
Experts currently <a href=\"https:\/\/arxiv.org\/pdf\/2411.10406\" target=\"_blank\" rel=\"noopener\">envision<\/a> future quantum computers as a specialized chip in a supercomputer or as <a href=\"https:\/\/www.nature.com\/articles\/s41586-024-08406-9\" target=\"_blank\" rel=\"noopener\">a data center<\/a>. Either way, users would access the machine through the cloud. It\u2019s also unlikely that quantum computers will be useful for everyday tasks like word processing or internet browsing. Their proposed applications are largely specialized for technical fields such as pharmaceuticals and finance.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Recent progress has been heartening. The first quantum computers of note, built in the last decade, were too error-ridden to execute useful algorithms. Lately, researchers have figured out how to correct computing errors by encoding a single unit of information in multiple physical qubits instead of one. Using this approach, <a href=\"https:\/\/www.nature.com\/articles\/s41586-024-08449-y\" target=\"_blank\" rel=\"noopener\">Google<\/a> and <a href=\"https:\/\/www.nature.com\/articles\/s41586-025-08642-7\" target=\"_blank\" rel=\"noopener\">Amazon<\/a> have shown that their quantum computers can store information more reliably, without the machines becoming more error-prone as they get bigger. The results could pave the way toward larger, useful quantum computers. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Still, a leap for physicists is an inch forward for the rest of us. Google\u2019s and Amazon\u2019s quantum \u201cmemory\u201d only stored a single unit of quantum information, known as a logical qubit. A useful quantum computer will need thousands, perhaps even a million, physical qubits, corresponding to hundreds or thousands of logical qubits. 
Researchers need to reduce the number of physical qubits required to encode a unit of information. In its recent announcement, Amazon needed only nine physical qubits per unit of information, compared with the 105 physical qubits that Google needed. \u201cWe are a long way away from the big, mind-blowing, world-changing results and applications,\u201d says Morello.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup qnnwq2 _1xwtict9\">\u201cIt\u2019s a very delicate balance. It has a chance of either people getting bored, or getting overexcited and really angry\u2026\u201d<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">The US, European Union, and UK governments have each pledged billions in funding to develop quantum computing. For the US, the main rival is China, which has poured <a href=\"https:\/\/merics.org\/en\/report\/chinas-long-view-quantum-tech-has-us-and-eu-playing-catch\" target=\"_blank\" rel=\"noopener\">$15 billion of public funding<\/a> into quantum computing, according to the Mercator Institute for China Studies, a Germany-based think tank. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Cash has been flowing in the private sector as well. Crunchbase reported that <a href=\"https:\/\/news.crunchbase.com\/venture\/quantum-computing-funding-record-high-ai-quantinuum\/\" target=\"_blank\" rel=\"noopener\">quantum computing received $1.5 billion in venture funding worldwide in 2024<\/a>, an all-time high, up from the previous record of $963 million in 2022.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">But building the technology is difficult. Researchers have to show progress to keep their investors happy, while also tempering their expectations to keep them patient. 
The worry is a potential \u201cquantum winter,\u201d in which overhype inflates expectations, disappointment follows, and investors withdraw funding. AI development went through similar cooling periods. Researchers made the first AI chatbot in the 1960s, but the field was overly optimistic about its pace of development. When results didn\u2019t materialize, <a href=\"https:\/\/arxiv.org\/pdf\/2109.01517\" target=\"_blank\" rel=\"noopener\">funders<\/a> <a href=\"https:\/\/www.pet.theclinics.com\/article\/S1556-8598(21)00053-5\/abstract\" target=\"_blank\" rel=\"noopener\">withdrew<\/a>, leading to two \u201cAI winters\u201d from the late \u201960s to the mid-\u201990s.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">\u201cPeople would prefer to keep a low-enough profile to be kind of cool and a little bit buzzy, so that they can just continue reaping the benefits slowly,\u201d Frolov says. \u201cBut I think it\u2019s a very delicate balance. It has a chance of either people getting bored, or getting overexcited and really angry\u201d when quantum computers don\u2019t deliver according to expectations.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Anxiety over losing funders\u2019 trust has fueled physicists\u2019 current furor over Microsoft\u2019s claims. Frolov, along with several other researchers, has spent years calling out what he said were discrepancies between Microsoft\u2019s announcements and its experimental data. The community seems to be more receptive to critiques lately, he says.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Such are the growing pains involved in building a quantum computer. Its potential remains alluring, but the finish line is still far away. 
In the meantime, physicists will continue squabbling over incremental progress \u2014 as long as the cash keeps flowing. <\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\"><strong>Clarification, March 22nd:<\/strong> The 2018 Majorana zero-modes paper from a Microsoft-affiliated team was retracted by Nature.<\/p>\n<p><a class=\"duet--article--comments-link b1p9679\" href=\"http:\/\/www.theverge.com\/tech\/633248\/beyond-the-hype-of-quantum-computers#comments\" target=\"_blank\" rel=\"noopener\"><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"On March 18th, Chetan Nayak, a physicist leading Microsoft\u2019s quantum team, presented new data on the company\u2019s quantum&hellip;\n","protected":false},"author":2,"featured_media":12801,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3164],"tags":[3284,326,53,16,15],"class_list":{"0":"post-12800","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-computing","9":"tag-tech","10":"tag-technology","11":"tag-uk","12":"tag-united-kingdom"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114323305704772552","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/12800","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=12800"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/12800\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"ht
tps:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/12801"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=12800"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=12800"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=12800"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}