Oak Ridge National Laboratory is learning what it takes to integrate quantum with classical computing, and the experience could shape how future computers tackle complex problems across industries.

The quantum side of the equation arrived at the Department of Energy lab in July: Quantum Brilliance, based in Australia, installed its quantum computers at the Oak Ridge Leadership Computing Facility in Tennessee. The facility, a high-performance computing (HPC) center, houses the Frontier HPE-Cray EX supercomputer, one of a handful of exascale systems in the world.

The task at hand is finding ways to make fast computers even more potent and capable of addressing challenges beyond the scope of any single technology.

“There are all kinds of remarkable things that we think quantum computers could do,” said Travis Humble, director of the Quantum Science Center at Oak Ridge. “But to have even more impact, we would like to integrate that capability with our high-performance computing systems. If we could combine quantum computers with those, that would be an amazing process.”

Tapping the power of quantum, classical computing

Oak Ridge and other early users aim to harness the properties of quantum physics as they build the next generation of computing platforms. Quantum processes such as superposition and entanglement could provide the computational power and parallelism to supercharge simulation and optimization applications, among others. Quantum bits, or qubits, serve as the basic unit of information that will let quantum computers explore vast numbers of states and calculations in parallel.
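To make those properties concrete, here is a minimal sketch, written with the open source Qiskit library (which is not part of the Oak Ridge deployment), of a two-qubit circuit that puts one qubit into superposition and entangles it with a second. Measuring the pair yields only the correlated outcomes 00 and 11.

```python
# A minimal sketch of superposition and entanglement using Qiskit,
# an open source quantum SDK; not the software used at Oak Ridge.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard gate puts qubit 0 into an equal superposition
qc.cx(0, 1)  # CNOT gate entangles qubit 1 with qubit 0 (a Bell state)
qc.measure_all()

# Run 1,000 measurement shots on a classical simulator.
counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)  # Only correlated outcomes appear, e.g. {'00': 503, '11': 497}
```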

Many industry experts believe quantum computers will evolve in cooperation with classical supercomputing. Indeed, several recent quantum computing deployments have been in HPC facilities. At the Rensselaer Polytechnic Institute in New York, for example, an IBM Quantum System One machine shares the campus with the university’s Artificial Intelligence Multiprocessing Optimized System supercomputer. Similarly, Riken, a national scientific research institute in Japan, has integrated an IBM Quantum System Two with its Fugaku supercomputer.

The Oak Ridge project fits that pattern. The goal is to combine quantum and classical systems so that each component can take on the parts of a complex problem it is best suited to handle. A conventional supercomputer might help narrow the scope of a particular computing problem, parceling out the stickiest chores for the quantum computer.
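In code, that division of labor might look like the hypothetical sketch below, where a cheap classical pass prunes the search space and a quantum subroutine is reserved for the hardest remaining subproblem. Every function here is an illustrative stand-in, not part of any actual Oak Ridge workflow.

```python
# A hypothetical sketch of hybrid division of labor: the classical side
# narrows the problem, and the quantum side gets the stickiest piece.
# All function bodies are illustrative stand-ins.

def classical_prefilter(candidates):
    """Cheap classical heuristic that discards clearly unpromising candidates."""
    return [c for c in candidates if c % 3 == 0]  # stand-in filter

def quantum_solve(subproblem):
    """Placeholder for a quantum subroutine, e.g. a circuit sent to a QPU."""
    return min(subproblem)  # stand-in for the quantum result

def hybrid_pipeline(candidates):
    hard_core = classical_prefilter(candidates)  # supercomputer narrows scope
    return quantum_solve(hard_core)              # QPU handles the hard core

print(hybrid_pipeline(range(1, 100)))  # prints 3 in this toy setup
```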

But much work remains before any of that happens regularly. Simply put, it’s a matter of reconciling apples and oranges.

“The technology, the mindset around quantum computers is not obviously compatible with high-performance computing technologies, tools and culture,” Humble said.

Sorting out the fundamentals

Oak Ridge has begun work on overcoming those differences and, in the process, plans to identify the key elements that will help enterprises deploy quantum-classical systems. A fundamental question for organizations in healthcare, materials development or defense, for example, is the timeline for acquiring a hybrid HPC-quantum system that will meet their mission goals and computing targets, Humble said.

“What we are learning now is about the timelines that you’ll need for integration, the tools that you’ll need, and even the requirements that you may ultimately have for space and power,” Humble said.

As for the latter, the lab has found that Quantum Brilliance’s computers fit into an HPC center with minimal fuss. Humble said each device is designed to fit into a standard server rack. He noted that the power and cooling requirements “are not far off from a conventional computer server.”

Cooling requirements differ across the various types of quantum machines available. For example, quantum computers based on superconducting qubits typically operate at extremely cold temperatures to protect fragile quantum states from collapsing. Specialized cooling devices like dilution refrigerators are often part of such deployments.

Quantum Brilliance’s computers, however, are based on nitrogen-vacancy centers in diamond. This approach, exotic even by quantum computing standards, relies on diamond’s rigid crystalline structure and electrically insulating properties to create the stability needed for longer processing windows and fewer errors. The diamond technology lets the machines operate at room temperature and forgo specialized cooling.

Humble said Oak Ridge opted for the Quantum Brilliance technology partly because of its familiar footprint, room-temperature operations and ability to place the quantum machines close to the lab’s computer infrastructure.

Building tighter connections

The close physical association of quantum and classical components within the HPC center will help the lab explore integration. The results of that experimentation will have implications for quantum’s wider enterprise use.

Marcus Doherty, co-founder and CTO at Quantum Brilliance, said the company and Oak Ridge will jointly develop and test different architectures, methods and applications of tightly integrated classical and quantum processors. He described the deployment as a cluster consisting of three parallelized compute nodes, each containing a CPU, a GPU and a quantum processing unit. The nodes share a classical information bus and communicate with each other through classical communication channels, he added.

This type of “massively parallelized and hybridized” environment is essential for quantum HPC to have the “broadest and greatest impact for enterprise users in the future,” Doherty said.
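As a rough illustration, the topology Doherty describes could be pictured with the hypothetical configuration sketch below. The node and field names are purely illustrative, not Quantum Brilliance’s actual software.

```python
# A hypothetical sketch of the three-node hybrid cluster described above:
# each node pairs a CPU and a GPU with a quantum processing unit (QPU),
# and all nodes share a classical communication channel.
from dataclasses import dataclass, field

@dataclass
class ComputeNode:
    name: str
    cpu: str = "CPU"
    gpu: str = "GPU"
    qpu: str = "diamond QPU"

@dataclass
class HybridCluster:
    bus: str = "classical information bus"
    nodes: list = field(default_factory=list)

cluster = HybridCluster(nodes=[ComputeNode(f"node{i}") for i in range(3)])
for node in cluster.nodes:
    print(f"{node.name}: {node.cpu} + {node.gpu} + {node.qpu} <-> {cluster.bus}")
```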

He pointed to the example of quantum machine learning, which will require mass parallelization during model training to churn through large amounts of data and numerous training cycles. That parallelization will also be needed during inference, when multiple enterprise users call the trained model simultaneously.
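One way to picture that parallelization is the sketch below, which fans training batches out across a small pool of QPU workers. The evaluate_circuit function is a hypothetical stand-in, not a real Quantum Brilliance API.

```python
# A hypothetical sketch of mass parallelization for quantum machine
# learning: training batches are dispatched to several QPU workers at
# once, and the partial results are gathered for a classical optimizer.
from concurrent.futures import ThreadPoolExecutor

def evaluate_circuit(batch, qpu_id):
    """Stand-in for running one parameterized circuit batch on one QPU."""
    return sum(batch) / len(batch)  # placeholder loss value

batches = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # toy training data

# One batch per QPU worker, mirroring the three-node cluster above.
with ThreadPoolExecutor(max_workers=3) as pool:
    losses = list(pool.map(evaluate_circuit, batches, range(3)))

print(losses)  # [1.5, 3.5, 5.5] in this toy example
```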

Another endorsement of integration comes from a joint report from Alice & Bob, a quantum computing company, and Hyperion Research, an industry analysis firm focused on HPC. The report, published on Sept. 9, stated, “Now is the time to build bridges between quantum and HPC.”

The joint study projected that early fault-tolerant quantum computers will arrive by the end of the decade, perhaps as early as 2027. Those machines will have lower error rates than the current crop of quantum computers, potentially pushing their performance beyond the capabilities of classical supercomputers. That milestone, known as quantum advantage, is the one quantum computer vendors, researchers and early adopters are pursuing.

The report cited the creation of a hybrid software stack as a key step toward integrating HPC with early fault-tolerant computers. That stack will function as a middleware layer that coordinates quantum and HPC processing.

“In the next few years, we’ll see the middleware for allowing hybrid workflows,” said Juliette Peyronnet, a co-author of the report and U.S. general manager of Alice & Bob, which is based in Paris and Boston.

In the meantime, organizations like Oak Ridge that aim to run quantum computing in parallel with classical workflows must develop a deep understanding of how these machines work and learn to anticipate errors, she said.

“When you run a job, there is always some failure somewhere,” Peyronnet said. “You have to anticipate that and build a robust software stack that allows you to still get the result you are expecting.”
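In code, anticipating that failure might look like the generic retry-and-fallback pattern sketched below. The submit_quantum_job and classical_fallback functions are hypothetical stand-ins, not Alice & Bob’s actual stack.

```python
# A generic sketch of the failure-tolerant middleware pattern Peyronnet
# describes: retry a quantum job a few times, then fall back to a
# classical path so the caller still gets the result it expects.
import random
import time

def submit_quantum_job(payload):
    """Stand-in for sending a job to a QPU; fails unpredictably."""
    if random.random() < 0.5:
        raise RuntimeError("QPU job failed")
    return f"quantum result for {payload}"

def classical_fallback(payload):
    """Stand-in for recomputing the answer on the classical side."""
    return f"classical result for {payload}"

def run_with_retries(payload, attempts=3, backoff=0.1):
    for attempt in range(attempts):
        try:
            return submit_quantum_job(payload)
        except RuntimeError:
            time.sleep(backoff * 2 ** attempt)  # exponential backoff
    return classical_fallback(payload)

print(run_with_retries("job-42"))
```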

Testing as many types of quantum computers as they can

At Oak Ridge, the integration task will likely span additional on-premises machines.

“I think we need to get access to as many different types of quantum computers as possible in order to understand the advantages and the nuances that come with each of them,” Humble said.

Moves in that direction appear to be underway. In August, IQM Quantum Computers, a Finnish quantum computer maker, said Oak Ridge had selected its IQM Radiance machine for on-premises deployment and integration. IQM’s computers are based on superconducting qubits.

Jan Goetz, co-CEO at IQM, said integrating quantum and traditional HPC could reshape entire industries. In the near term, quantum computers can act as accelerators, boosting the performance of HPC systems, he noted. However, integration will eventually open broader enterprise opportunities.

“Down the road, this collaboration and hybridization of workflows will accelerate and enable completely new use cases,” Goetz said.

Humble said the ongoing integration of quantum and HPC systems will help define future architectures.

“One of the most exciting things is getting to do the research today, but knowing what we are doing today is going to have a huge impact on what tomorrow’s systems look like,” he said.
