This week, I attended the Danish Quantum Industry Day, where we announced that Atom Computing is establishing a European headquarters in Copenhagen as part of a strategic partnership with Denmark to accelerate quantum research in the country and serve quantum computing customers across Europe. As part of this partnership, Denmark’s Ministry of Foreign Affairs and the Export and Investment Fund of Denmark (EIFO) made an investment in our company.

The partnership supports Denmark’s National Strategy on Quantum Technology, announced in 2023, and is a prime example of the QIST Collaboration Agreement between Denmark and the United States. The agreement, signed in June 2022, commits both countries to foster a vibrant quantum information science and technology (QIST) ecosystem through good-faith cooperation, inclusive research communities, and collaborative venues. In Denmark, Atom Computing is tapping into quantum research that began over 100 years ago at the Niels Bohr Institute and continues to this day at NATO’s Deep Tech Lab – Quantum and Danish universities.

As part of the strategic partnership, EIFO has invested 70 million DKK in Atom Computing, which is the fund’s first direct investment in a company outside of Denmark. The investment aims to advance quantum computing as a unifying capability for the benefit of Denmark, the United States, and the European Union. Quantum computing leadership is a shared strategic imperative for these partners, with the potential to reshape industries, enhance security, and address global challenges.

Atom Computing will bring leading-edge quantum computing technology, know-how, and jobs to the thriving quantum computing ecosystem already present in Denmark. We are excited to partner with Denmark due to the country’s continued investment in quantum research coupled with the clear motivation from academia, government, and industry to advance the field of quantum computing out of the labs into deployment. With this long-term partnership, we aim to advance the national strategic imperatives of both our countries.


Novel Solutions for Continuously Loading Large Atomic Arrays

Researchers at Atom Computing have invented a way to keep the atomic array at the heart of the company’s quantum computing technology continuously populated with qubits.

In a preprint article on arXiv, the Atom team describes its approach both to assembling a 1,200-plus qubit array and to overcoming atom loss, a major technical challenge for quantum computers that use neutral atoms as qubits.

Dr. Ben Bloom, Founder and Chief Technology Officer, said the advancements ensure that as Atom Computing scales to larger numbers of qubits, its technologies can efficiently perform mid-circuit measurement, which is necessary for quantum error correction, and other operations.

“All quantum computing technologies need to demonstrate the ability to scale with high-fidelity qubits,” he said. “Neutral atom systems are no exception.”

Technical challenges

Atom loss occurs for numerous reasons. Qubits can be knocked out of place by stray atoms in a vacuum chamber or occasionally disrupted during “read out” when they are imaged to check their quantum state.

All quantum computers that use individual atoms as qubits (such as trapped ion or neutral atom systems) experience atom loss. But the problem is particularly acute for neutral atom quantum computing technologies.

With neutral atom technologies, developers often assemble arrays (two-dimensional grids) with extra qubits to act as a buffer. The system still experiences loss but has enough qubits to run calculations.

Another approach involves slowing down the read out to reduce the number of qubits lost during the process. But this makes operations slower and less efficient.

It also is difficult to replace individual qubits within an array. The conventional practice is to throw out the entire array and replace it with a new one, which becomes unwieldy and time consuming as systems scale.

“Error correction will require quantum information to survive long past the lifetime of a single atom, and we need to find effective strategies to reach and stay at very large qubit numbers,” Bloom said.

Atom Computing’s approach

To overcome these challenges, Atom researchers have engineered novel solutions into the company’s next-generation quantum computing systems, which will be commercially available within the next year.

As outlined in the arXiv paper, Atom Computing has developed a method to continuously load ytterbium atoms into the computation zone and store them until needed. Individual atoms are then moved into the array with optical tweezers (concentrated beams of light) to replace missing qubits as needed.
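As a rough illustration of why a reservoir plus tweezer-based replacement keeps an array fully loaded, consider the following toy simulation. The array size matches the 1,225 sites mentioned later, but the survival probability, reservoir size, and cycle structure are all hypothetical numbers chosen for illustration, not figures from the paper.

```python
import random

def refill_array(array, reservoir, survival_prob=0.98):
    """Toy model of one maintenance cycle on an atomic array.

    `array` is a list of booleans (True = site holds a qubit).
    Each qubit survives the cycle with probability `survival_prob`;
    empty sites are then refilled from a reservoir of spare atoms,
    mimicking tweezer-based replacement. All numbers are illustrative.
    """
    # Random atom loss during the cycle.
    array = [site and (random.random() < survival_prob) for site in array]
    # Move reservoir atoms into any vacated sites.
    refilled = []
    for site in array:
        if not site and reservoir > 0:
            reservoir -= 1
            site = True
        refilled.append(site)
    return refilled, reservoir

random.seed(0)
array = [True] * 1225          # fully loaded 1,225-site array
reservoir = 500                # spare atoms held outside the computation zone
for _ in range(10):            # ten maintenance cycles
    array, reservoir = refill_array(array, reservoir)
print(sum(array))              # array stays fully populated while spares last
```

The point of the sketch is simply that as long as atoms are replenished faster than they are lost, the array never dips below full occupancy; continuous loading keeps the reservoir from running dry.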

In addition, our researchers have designed and installed an optical cavity into our quantum computers that creates large numbers of deep energy wells to hold qubits tightly in place when imaged. These deeper wells help protect qubits, reducing the number that are lost during read out.

These innovations have enabled the Atom Computing team to demonstrate that we can consistently fully load a large atomic array with more than a thousand qubits.

“We’ve known from the beginning that we needed to overcome these challenges if our technology is to successfully scale, operate efficiently, and achieve our goal of fault-tolerant quantum computing,” Bloom said. “It’s exciting to be able to showcase the solutions we have been developing for the past several years.”

Leveraging Atom's technology to improve constrained optimization algorithms

Jonathan King, Co-Founder and Chief Scientist

Researchers at Atom Computing have formulated a new method for stabilizing optimization algorithms that could lead to more reliable results from early-stage quantum computers.

In a preprint article posted on arXiv, the authors describe how the method, known as subspace correction, leverages key features of Atom Computing’s atomic array quantum computing technologies, notably long qubit coherence times and mid-circuit measurement. The technique detects whether an algorithm is violating constraints (essentially going off track during computation) and then self-corrects.

Subspace correction could reduce the number of circuits developers need to run on a quantum computer to get the correct answer for optimization problems.

We sat down with Dr. Kelly Ann Pawlak and Dr. Jeffrey Epstein, Senior Quantum Applications Engineers at Atom Computing and the lead authors of the paper, to learn more.

Tell us more about this technique. What do you mean by “subspace correction?” How does it work?

Kelly: First let’s talk about quantum error correction, one of the most important topics in quantum computing. In quantum error correction, we use mid-circuit measurements to detect the state of the quantum system and identify whether errors, such as bit flips, have occurred during operation. If we detect an error, we can apply a correction using feed-forward capabilities and resume the calculation.
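The detect-and-correct loop Kelly describes can be sketched classically with the simplest error-correcting code, the 3-bit repetition code. This is a generic textbook illustration, not Atom Computing’s implementation: the parity checks play the role of mid-circuit measurements, and the conditional flip plays the role of feed-forward correction.

```python
def syndrome(bits):
    """Parity checks of the 3-bit repetition code: compare neighboring bits."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Flip the single bit implicated by the syndrome, if any (feed-forward)."""
    fix = {(1, 0): 0, (1, 1): 1, (0, 1): 2}
    s = syndrome(bits)
    if s in fix:
        bits[fix[s]] ^= 1
    return bits

logical = [1, 1, 1]       # logical "1" encoded redundantly
logical[2] ^= 1           # a bit flip occurs during operation
print(syndrome(logical))  # (0, 1): the checks locate the error...
logical = correct(logical)
print(logical)            # ...and the correction restores [1, 1, 1]
```

Crucially, the syndrome reveals only where an error occurred, not the encoded value itself, which is what lets a quantum version of this loop run without destroying the computation.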

Subspace correction works the same way, except it isn’t detecting and correcting “operational errors” from electrical noise or thermal effects. Instead, it probes information relevant to solving a computational problem. Rather than looking for bit-flip errors due to stray radiation, it can check whether a solution to an industrial optimization problem obeys all the feasibility constraints. It can do so in the middle of a calculation without destroying the “quantumness” of the data.

In short, subspace correction really tries to leverage the methods behind quantum error correction. It is a way of encoding redundancy checks into a quantum computation in order to solve high-value problems, not just basic quantum control problems.

Can you give us background on this technique? What inspired it?

Kelly: Atom Computing is pursuing quantum computer design practices that will enable us to achieve fault-tolerant operation as soon as possible. So the engineering pipeline for implementing quantum error correction, rather than general NISQ operation, is our primary focus.

We challenged ourselves to answer the question: “Given our strengths and thoughtful engineering of a platform fully committed to running QEC functionality as soon as possible, are there any problems we can try to speed up on the way to early fault tolerance?” The answer is “Yes!”

It turns out that techniques of error correction are effective algorithmic tools for many problems. As mentioned, we have looked specifically at problems that have constraints, like constrained optimization and satisfaction type problems in classical computing. You can find examples of these problems everywhere in finance, manufacturing, logistics and science.

Jeffrey: One of the things that’s interesting about quantum error correction is that it involves interplay between classical and quantum information. By extracting partial information about the state of the quantum computer and using classical processing to determine appropriate corrections to make, you can protect the coherent, quantum information stored in the remaining degrees of freedom of the processor. The starting point for these kinds of protocols is the choice of a particular subspace of the possible states of the quantum computer, which you take to be the “meaningful” states, the ones in which information is encoded. The whole machinery of quantum error correction is then about returning to this space when physical errors take you outside of it.

Many optimization problems naturally have this structure of “good” states because the problems are defined in terms of two pieces: a constraint and a cost function. The “good” states satisfy the constraint, and the answer should minimize the cost function within that set of states. So we would like a method that ensures our optimization algorithm does in fact return answers corresponding to constraint-satisfying states. As in quantum error correction, we’d like to maintain coherence within this subspace to take advantage of any speedup that may be available for optimization problems. One possible difference is that in some cases the method may still find good solutions even if the correction strategy returns the computer to a different point in the good subspace than where it was when an error occurred; in error correction proper, that would correspond to a logical error.
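For the independent-set problem discussed later in the interview, the “good” subspace has a concrete classical description: bitstrings in which no two selected vertices share an edge. The sketch below, with a made-up graph and a deliberately simple repair policy, shows what membership in that subspace means and what “returning to it” could look like; the actual quantum protocol in the paper performs this coherently and may differ in detail.

```python
def in_good_subspace(selection, edges):
    """A bitstring is 'good' if no edge has both endpoints selected."""
    return all(not (selection[u] and selection[v]) for u, v in edges)

def repair(selection, edges):
    """Return to the constraint-satisfying subspace by deselecting one
    endpoint of each violated edge (one simple classical policy; a
    quantum protocol would apply the fix via measurement and feed-forward)."""
    sel = list(selection)
    for u, v in edges:
        if sel[u] and sel[v]:
            sel[v] = 0
    return sel

edges = [(0, 1), (1, 2), (2, 3)]   # a path graph on 4 vertices
state = [1, 1, 0, 1]               # violates the (0, 1) constraint
print(in_good_subspace(state, edges))                  # False
print(in_good_subspace(repair(state, edges), edges))   # True
```

Note that the repaired state need not be the one the computation held before the violation, which is exactly the “different point in the good subspace” caveat above.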

Why is this technique important? How will it help quantum algorithm developers?

Kelly: Right now, it is really difficult to ensure an algorithm obeys the constraints of a problem when run on a gate-based quantum computer. Typically, you run a calculation several times, toss out bad results, and identify a solution. With subspace correction, you can check to make sure the calculation is obeying constraints and if it is not, correct it. We think this approach will reduce the number of circuit executions that need to be run on early-stage quantum computers. It will save a lot of computational time and overhead.
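As a back-of-envelope illustration of that overhead, if each raw run of a circuit satisfies the constraints with probability p, the discard-and-retry strategy needs 1/p circuit executions on average per feasible sample. The 5% figure below is hypothetical, chosen only to make the scaling concrete.

```python
import random

def runs_until_feasible(p_feasible, rng):
    """Simulate discard-and-retry: count runs until one satisfies constraints."""
    runs = 1
    while rng.random() >= p_feasible:
        runs += 1
    return runs

rng = random.Random(42)
p = 0.05                                  # assume 5% of raw samples are feasible
trials = [runs_until_feasible(p, rng) for _ in range(100_000)]
avg = sum(trials) / len(trials)
print(avg)   # close to 1/p = 20 circuit executions per feasible sample
```

A method that detects and corrects constraint violations mid-circuit avoids paying this 1/p multiplier on every sample, which is where the savings in circuit executions would come from.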

Were you able to simulate or test the technique on optimization problems? If so, what were the results?

Jeffrey: One of the things we show in the paper is that the state preparation protocol we describe for the independent set problem has the same statistics as a classical sampling algorithm. This characteristic makes it possible to simulate its performance on relatively large graphs, although it also directly implies that the method cannot provide a sampling speedup over classical methods. Our hope is that by combining this method with optimization techniques such as Grover search, there might be speedups for some classes of graphs. We’re planning to investigate this possibility in more detail, both theoretically and using larger scale simulations in collaboration with Lawrence Berkeley National Laboratory.
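One classical sampler with this flavor (shown only as an illustration of the kind of distribution such a protocol can match, not as the specific algorithm analyzed in the paper) visits vertices in random order and selects each one only if none of its neighbors has already been selected:

```python
import random

def sample_independent_set(num_vertices, edges, rng):
    """Greedy random-order sampler: visit vertices in a random order and
    select each only if no neighbor is already selected. The result is
    always a maximal independent set of the graph."""
    neighbors = {v: set() for v in range(num_vertices)}
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    order = list(range(num_vertices))
    rng.shuffle(order)
    selected = set()
    for v in order:
        if not (neighbors[v] & selected):
            selected.add(v)
    return selected

rng = random.Random(7)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # a 4-cycle
s = sample_independent_set(4, edges, rng)
# Whatever the random order, the output is independent:
print(all(not (u in s and v in s) for u, v in edges))   # True
```

Because the output distribution is classically simulable, performance can be studied on large graphs, but, as Jeffrey notes, sampling alone cannot beat classical methods; any speedup would have to come from combining it with techniques like Grover search.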

Can subspace correction be applied to other problems?

Kelly: Certainly yes. Constraints are just one kind of “subspace” we can correct. We have a lot of ideas about how to apply this method to improve quantum simulation algorithms. When running a quantum simulation algorithm, you can detect when the simulation goes off course, by breaking energy conservation laws for example, and try to fix it. We would also like to explore using this method to prepare classical data in a quantum computer.

Can this method be used with other types of quantum computers?

Kelly: It could! But it would be nearly impossible for some quantum computing architectures to run parts of this algorithm prior to full fault tolerance due to the long coherence requirements. Even in early fault tolerance, where some QC architectures are racing against the clock to do quantum error correction, it would be very difficult.

Jeffrey: Everything in the method we studied lives within the framework of universal gate-based quantum computing, so our analysis doesn’t actually depend specifically on the hardware Atom Computing or any other quantum computing company is developing; it could be implemented on any platform that supports mid-circuit measurement and feedback. But the performance will depend a lot on the speed of classical computation relative to circuit times and coherence times, and our neutral atom device with long-lived qubits gives us a clear advantage.


Quantum startup Atom Computing first to exceed 1,000 qubits

Systems to be available in 2024, on path to fault-tolerant quantum computing this decade

October 24, 2023 - Boulder, CO - Atom Computing announced it has created a 1,225-site atomic array, currently populated with 1,180 qubits, in its next-generation quantum computing platform.

This is the first time a company has crossed the 1,000-qubit threshold with a universal gate-based system, which is planned for release next year. It marks an industry milestone toward fault-tolerant quantum computers capable of solving large-scale problems.

CEO Rob Hays said rapid scaling is a key benefit of Atom Computing’s unique atomic array technology. “This order-of-magnitude leap – from 100 to 1,000-plus qubits within a generation – shows our atomic array systems are quickly gaining ground on more mature qubit modalities,” Hays said. “Scaling to large numbers of qubits is critical for fault-tolerant quantum computing, which is why it has been our focus from the beginning. We are working closely with partners to explore near-term applications that can take advantage of these larger scale systems.”

Paul Smith-Goodson, vice president and a principal analyst at Moor Insights & Strategy, said the 1,000-plus qubit milestone makes Atom Computing a serious contender in the race to build a fault-tolerant system.

“It is highly impressive that Atom Computing, which was founded just five years ago, is going up against larger companies with more resources and holding its own,” he said. “The company has been laser focused on scaling its atomic array technology and is making rapid progress.”

Fault-tolerant quantum computers that can overcome errors during computations and deliver accurate results will require hundreds of thousands, if not millions, of physical qubits along with other key capabilities, including:

Long coherence times. The company has achieved record coherence times by demonstrating its qubits can store quantum information for 40 seconds.

High fidelities. Being able to control qubits consistently and accurately to reduce the number of errors that occur during a computation.

Error correction. The ability to correct errors in real time.

Logical qubits. Implementing algorithms and controls to combine large numbers of physical qubits into a “logical qubit” designed to yield correct results even when errors occur.

Hays said Atom Computing continues to work toward these capabilities with its next-generation system, which provides new opportunities for its partners.

Guenter Klas, leader of the Quantum Research Cluster at Vodafone, said, “We welcome innovations like the neutral atom approach to building quantum computers, such as Atom Computing’s. In the end, we want quantum algorithms to make an economic difference and open up new opportunities, and for that goal scalable hardware, high fidelity, and long coherence times are very promising ingredients.”

Tommaso Demarie, CEO of Entropica Labs, a strategic partner of Atom Computing, said, “Developing a 1,000-plus qubit quantum technology marks an exceptional achievement for the Atom Computing team and the entire industry. With expanded computational capabilities, we can now delve deeper into the intricate realm of error correction schemes, designing and implementing strategies that pave the way for more reliable and scalable quantum computing systems. Entropica is enthusiastic about collaborating with Atom Computing as we create software that takes full advantage of their large-scale quantum computers."

Atom Computing is working with enterprise, academic, and government users today to develop applications and reserve time on the systems, which will be available in 2024.

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with arrays of optically trapped neutral atoms. We collaborate with researchers, organizations, governments, and companies to help develop quantum-enabled tools and solutions for the growing global ecosystem. Learn more at atom-computing.com, and follow us on LinkedIn and Twitter.

