Novel Solutions for Continuously Loading Large Atomic Arrays

Researchers at Atom Computing have invented a way to keep the atomic array at the heart of the company’s quantum computing technology continuously populated with qubits.

In a preprint article on arXiv, the Atom team describes its approach both to assembling a 1,200-plus qubit array and to overcoming atom loss, a major technical challenge for quantum computers that use neutral atoms as qubits.

Dr. Ben Bloom, Founder and Chief Technology Officer, said the advancements ensure that as Atom Computing scales to larger numbers of qubits, its technologies can efficiently perform mid-circuit measurement, which is necessary for quantum error correction, and other operations.

“All quantum computing technologies need to demonstrate the ability to scale with high-fidelity qubits,” he said. “Neutral atom systems are no exception.”

Technical challenges

Atom loss occurs for numerous reasons.  Qubits can be knocked out of place by stray atoms in a vacuum chamber or occasionally disrupted during “read out” when they are imaged to check their quantum state.   

All quantum computers that use individual atoms as qubits (such as trapped ion or neutral atom systems) experience atom loss.  But the problem is particularly acute for neutral atom quantum computing technologies.

With neutral atom technologies, developers often assemble arrays (two-dimensional grids) with extra qubits to act as a buffer.  The system still experiences loss but has enough qubits to run calculations.  

Another approach involves slowing down the readout to reduce the number of qubits lost during the process, but this makes operations slower and less efficient.

It also is difficult to replace individual qubits within an array. The conventional practice is to throw out the entire array and replace it with a new one, which becomes unwieldy and time consuming as systems scale.

“Error correction will require quantum information to survive long past the lifetime of a single atom, and we need to find effective strategies to reach and stay at very large qubit numbers,” Bloom said.

Atom Computing’s approach

To overcome these challenges, Atom researchers have engineered novel solutions into the company’s next-generation quantum computing systems, which will be commercially available within the next year. 

As outlined in the arXiv paper, Atom Computing has developed a method to continuously load ytterbium atoms into the computation zone and store them until needed.  Individual atoms are then moved into the array with optical tweezers (concentrated beams of light) to replace missing qubits.
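To make the idea concrete, here is a toy sketch of the detect-and-refill loop described above. This is our illustration, not Atom Computing's control software; the loss rate, reservoir size, and function names are invented for the example (the 1,225-site figure matches the array size reported below).

```python
import random

ARRAY_SIZE = 1225        # computation sites, matching the reported array size
LOSS_PROBABILITY = 0.005 # per-cycle chance a site loses its atom (illustrative)

def image_array(occupied):
    """Stand-in for fluorescence imaging: return the list of vacant sites."""
    return [site for site, filled in enumerate(occupied) if not filled]

def run_cycle(occupied, reservoir):
    """One cycle: random atom loss, then tweezer moves from the reservoir."""
    for site in range(ARRAY_SIZE):
        if occupied[site] and random.random() < LOSS_PROBABILITY:
            occupied[site] = False           # e.g., a background-gas collision
    for site in image_array(occupied):
        if reservoir > 0:
            occupied[site] = True            # optical tweezer move from storage
            reservoir -= 1
    return reservoir

occupied = [True] * ARRAY_SIZE
reservoir = 10_000  # continuously reloaded from the atom source in the real system
for _ in range(100):
    reservoir = run_cycle(occupied, reservoir)
print(f"Occupied sites after 100 cycles: {sum(occupied)}/{ARRAY_SIZE}")
```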

In addition, our researchers have designed and installed an optical cavity in our quantum computers that creates large numbers of deep energy wells to hold qubits tightly in place while they are imaged. These deeper wells help protect qubits, reducing the number lost during readout.

These innovations have enabled the Atom Computing team to demonstrate that we can consistently fully load a large atomic array with more than a thousand qubits.

“We’ve known from the beginning that we needed to overcome these challenges if our technology is to successfully scale, operate efficiently, and achieve our goal of fault-tolerant quantum computing,” Bloom said. “It’s exciting to be able to showcase the solutions we have been developing for the past several years.”

Leveraging Atom's technology to improve constrained optimization algorithms

Jonathan King, Co-Founder and Chief Scientist

Researchers at Atom Computing have formulated a new method for stabilizing optimization algorithms that could lead to more reliable results from early-stage quantum computers.

In a preprint article posted on arXiv, the authors describe how the method, known as subspace correction, leverages key features of Atom Computing’s atomic array quantum computing technologies, notably long qubit coherence times and mid-circuit measurement.  The technique detects whether an algorithm is violating constraints (essentially going off track during computation) and then self-corrects.

Subspace correction could reduce the number of circuits developers need to run on a quantum computer to get the correct answer for optimization problems.

We sat down with Dr. Kelly Ann Pawlak and Dr. Jeffrey Epstein, Senior Quantum Applications Engineers at Atom Computing and the lead authors of the paper, to learn more.

Tell us more about this technique. What do you mean by “subspace correction”? How does it work?

Kelly: First let’s talk about quantum error correction, one of the most important topics in quantum computing. In quantum error correction, we use mid-circuit measurements to detect the state of the quantum system and identify whether errors, such as bit flips, have occurred during operation. If we detect an error, we can apply a correction using feed-forward capabilities and resume the calculation.
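As a concrete illustration of that detect-and-correct loop, here is a minimal three-qubit bit-flip repetition code in Qiskit. This is our sketch, not Atom Computing's software stack, and it assumes a recent Qiskit version (for `if_test`) and a backend supporting mid-circuit measurement: ancilla qubits extract parity checks mid-circuit, and feed-forward corrections are conditioned on the measured syndrome.

```python
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

data = QuantumRegister(3, "data")
anc = QuantumRegister(2, "ancilla")
syn = ClassicalRegister(2, "syndrome")
qc = QuantumCircuit(data, anc, syn)

# Encode one logical qubit into a three-qubit repetition code.
qc.cx(data[0], data[1])
qc.cx(data[0], data[2])

# (A bit-flip error may occur on any data qubit here.)

# Mid-circuit syndrome extraction: ancillas pick up the parities Z0Z1 and Z1Z2.
qc.cx(data[0], anc[0])
qc.cx(data[1], anc[0])
qc.cx(data[1], anc[1])
qc.cx(data[2], anc[1])
qc.measure(anc, syn)

# Feed-forward: apply the correction implied by the measured syndrome.
with qc.if_test((syn, 1)):  # syndrome 01 -> qubit 0 flipped
    qc.x(data[0])
with qc.if_test((syn, 3)):  # syndrome 11 -> qubit 1 flipped
    qc.x(data[1])
with qc.if_test((syn, 2)):  # syndrome 10 -> qubit 2 flipped
    qc.x(data[2])
# ...and the computation resumes on the still-coherent data qubits.
```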

Subspace correction works the same way, except it isn’t detecting and correcting “operational errors” from electrical noise or thermal effects. Instead, it probes information relevant to solving a computational problem. Rather than looking for bit-flip errors due to stray radiation, it can check whether a solution to an industrial optimization problem obeys all of the feasibility constraints. And it can do so in the middle of a calculation without destroying the “quantumness” of the data.

In short, subspace correction tries to leverage the methods behind quantum error correction.  It is a method of encoding redundant checks into a quantum computation in order to solve high-value problems, not just the basic quantum control problems.

Can you give us background on this technique? What inspired it?

Kelly: Atom Computing is pursuing quantum computer design practices that will enable us to achieve fault-tolerant operation as soon as possible. So for us, the engineering pipeline for implementing quantum error correction, rather than general NISQ operation, is our primary focus.

We challenged ourselves to answer the question: “Given our strengths and thoughtful engineering of a platform fully committed to running QEC functionality as soon as possible, are there any problems we can try to speed up on the way to early fault tolerance?” The answer is “Yes!”

It turns out that techniques of error correction are effective algorithmic tools for many problems. As mentioned, we have looked specifically at problems that have constraints, like constrained optimization and satisfaction type problems in classical computing. You can find examples of these problems everywhere in finance, manufacturing, logistics and science. 

Jeffrey: One of the things that’s interesting about quantum error correction is that it involves interplay between classical and quantum information. By extracting partial information about the state of the quantum computer and using classical processing to determine appropriate corrections to make, you can protect the coherent, quantum information stored in the remaining degrees of freedom of the processor. The starting point for these kinds of protocols is the choice of a particular subspace of the possible states of the quantum computer, which you take to be the “meaningful” states or the ones on which we encode information.  The whole machinery of quantum error correction is then about returning to this space when physical errors take you outside of it.

Many optimization problems naturally have this structure of “good” states because the problems are defined in terms of two pieces: a constraint and a cost function. The “good” states satisfy the constraint, and the answer should be chosen to minimize the cost function within that set of states. So we would like a method to ensure that our optimization algorithm does in fact return answers that correspond to constraint-satisfying states. As in the case of quantum error correction, we’d like to maintain coherence within this subspace to take advantage of any speedup that may be available for optimization problems. One possible difference is that in some cases the method may still find good solutions even if the correction strategy returns the computer to a different point in the good subspace than where it was when an error occurred, which in the case of error correction would correspond to a logical error.
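One compact way to write this picture down (our notation, not necessarily the paper's): the feasible bitstrings span a "good" subspace, a projector marks membership in it, and the optimization runs inside it.

```latex
\[
  \mathcal{F} = \{\, x \in \{0,1\}^n : x \text{ satisfies every constraint} \,\}, \qquad
  P_{\mathcal{F}} = \sum_{x \in \mathcal{F}} |x\rangle\langle x| , \qquad
  \min_{x \in \mathcal{F}} C(x).
\]
```

Measuring an observable compatible with the projector mid-circuit heralds whether the state has left the feasible subspace, which is the optimization analogue of a syndrome measurement.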

Why is this technique important? How will it help quantum algorithm developers?

Kelly: Right now, it is really difficult to ensure an algorithm obeys the constraints of a problem when run on a gate-based quantum computer. Typically, you run a calculation several times, toss out bad results, and identify a solution.  With subspace correction, you can check that the calculation is obeying constraints and, if it is not, correct it.  We think this approach will reduce the number of circuit executions that need to be run on early-stage quantum computers.  It will save a lot of computational time and overhead.

Were you able to simulate or test the technique on optimization problems? If so, what were the results?

Jeffrey: One of the things we show in the paper is that the state preparation protocol we describe for the independent set problem has the same statistics as a classical sampling algorithm. This characteristic makes it possible to simulate its performance on relatively large graphs, although it also directly implies that the method cannot provide a sampling speedup over classical methods. Our hope is that by combining this method with optimization techniques such as Grover search, there might be speedups for some classes of graphs. We’re planning to investigate this possibility in more detail, both theoretically and using larger scale simulations in collaboration with Lawrence Berkeley National Laboratory.
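For intuition, here is a generic classical sampler for independent sets on a graph. To be clear, this is not claimed to reproduce the statistics of the paper's state-preparation protocol; it simply shows the kind of classical sampling baseline such a comparison is made against.

```python
import random

def sample_independent_set(adjacency, rng=random):
    """Greedily build an independent set, visiting vertices in random order."""
    vertices = list(adjacency)
    rng.shuffle(vertices)
    chosen = set()
    for v in vertices:
        # Add v only if none of its neighbors is already in the set.
        if all(u not in chosen for u in adjacency[v]):
            chosen.add(v)
    return chosen

# Example: a 5-cycle, whose largest independent sets have size 2.
graph = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
print(sample_independent_set(graph))
```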

Can subspace correction be applied to other problems?

Kelly: Certainly yes. Constraints are just one kind of “subspace” we can correct. We have a lot of ideas about how to apply this method to improve quantum simulation algorithms. When running a quantum simulation algorithm, you can detect when the simulation goes off course, by breaking energy conservation laws for example, and try to fix it. We would also like to explore using this method to prepare classical data in a quantum computer.

Can this method be used with other types of quantum computers?

Kelly: It could! But it would be nearly impossible for some quantum computing architectures to run parts of this algorithm prior to full fault tolerance due to the long coherence requirements. Even in early fault tolerance, where some QC architectures are racing against the clock to do quantum error correction, it would be very difficult.

Jeffrey: Everything in the method we studied lives within the framework of universal gate-based quantum computing, so our analysis doesn’t actually depend on the specific hardware Atom Computing or any other quantum computing company is developing; it could be implemented on any platform that supports mid-circuit measurement and feedback. But the performance will depend a lot on the speed of classical computation relative to circuit times and coherence times, and our neutral atom device with long-lived qubits gives us a clear advantage.

[Figure: A demonstration of how the distribution-generating primitive works on a graph using subspace correction (SSC).]
[Figure: An overview of the paradigm we use for algorithm development, which follows the same pattern as error correction.]

Quantum startup Atom Computing first to exceed 1,000 qubits

Systems to be available in 2024, on path to fault-tolerant quantum computing this decade

October 24, 2023 - Boulder, CO - Atom Computing announced it has created a 1,225-site atomic array, currently populated with 1,180 qubits, in its next-generation quantum computing platform.

This is the first time a company has crossed the 1,000-qubit threshold for a universal gate-based system. The platform, planned for release next year, marks an industry milestone toward fault-tolerant quantum computers capable of solving large-scale problems.

CEO Rob Hays said rapid scaling is a key benefit of Atom Computing’s unique atomic array technology.  “This order-of-magnitude leap – from 100 to 1,000-plus qubits within a generation – shows our atomic array systems are quickly gaining ground on more mature qubit modalities,” Hays said.  “Scaling to large numbers of qubits is critical for fault-tolerant quantum computing, which is why it has been our focus from the beginning. We are working closely with partners to explore near-term applications that can take advantage of these larger scale systems.”

Paul Smith-Goodson, vice president and a principal analyst at Moor Insights & Strategy, said the 1,000-plus qubit milestone makes Atom Computing a serious contender in the race to build a fault-tolerant system.

“It is highly impressive that Atom Computing, which was founded just five years ago, is going up against larger companies with more resources and holding its own,” he said. “The company has been laser focused on scaling its atomic array technology and is making rapid progress.”

Fault-tolerant quantum computers that can overcome errors during computations and deliver accurate results will require hundreds of thousands, if not millions, of physical qubits along with other key capabilities.

Hays said Atom Computing continues to work toward these capabilities with its next-generation system, which provides new opportunities for its partners.

Guenter Klas, leader of the Quantum Research Cluster at Vodafone, said: “We welcome innovations like Atom Computing’s neutral atom approach to building quantum computers. In the end, we want quantum algorithms to make an economic difference and open up new opportunities, and for that goal scalable hardware, high fidelity, and long coherence times are very promising ingredients.”

Tommaso Demarie, CEO of Entropica Labs, a strategic partner of Atom Computing, said, “Developing a 1,000-plus qubit quantum technology marks an exceptional achievement for the Atom Computing team and the entire industry. With expanded computational capabilities, we can now delve deeper into the intricate realm of error correction schemes, designing and implementing strategies that pave the way for more reliable and scalable quantum computing systems. Entropica is enthusiastic about collaborating with Atom Computing as we create software that takes full advantage of their large-scale quantum computers."

Atom Computing is working with enterprise, academic, and government users today to develop applications and reserve time on the systems, which will be available in 2024.

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with arrays of optically trapped neutral atoms. We collaborate with researchers, organizations, governments, and companies to help develop quantum-enabled tools and solutions for the growing global ecosystem. Learn more at atom-computing.com, and follow us on LinkedIn and Twitter.

Atom Computing adds key leaders to accelerate quantum computing momentum with the U.S. government

August 30, 2023 — Berkeley, CA – Atom Computing announced it has appointed Ken Braithwaite, former Secretary of the Navy, to its Board of Directors and that Greg Muhlner has joined the company as Vice President of Public Sector to lead engagement with the U.S. government.

CEO Rob Hays said the addition of Braithwaite and Muhlner reflects the important role of the U.S. government in the advancement and adoption of quantum computing, noting Atom Computing’s collaborations with the U.S. Department of Defense, U.S. Department of Energy, and the National Science Foundation.

“The United States has a vibrant quantum ecosystem thanks, in part, to investments the federal government has made in quantum computing research and development, workforce initiatives, and procurement,” he said.  “Public-private partnership with our company will help to advance the technology and ensure U.S. leadership in this area of strategic importance. Ken (Braithwaite) and Greg (Muhlner) have extensive federal government experience that will help position Atom Computing as the premier partner to the U.S. in winning the race to large-scale, fault-tolerant quantum computing.”

Braithwaite was sworn in as the Secretary of the Navy in 2020 and previously served as a U.S. ambassador to Norway. He graduated from the U.S. Naval Academy in 1984 and was commissioned as an ensign in the U.S. Navy.  Braithwaite left active duty in 1993 but continued his service in the Navy Reserve while holding several executive leadership positions in private industry.

Muhlner has 15 years of experience in business development and sales to the federal government. Before joining Atom Computing, he was Vice President of Sales for Rebellion Defense and led Navy and U.S. Marine Corps sales at Amazon Web Services.  Muhlner served as a Naval Special Warfare (SEAL) Officer, participating in Operation Enduring Freedom in Afghanistan and Operation Iraqi Freedom. 

“Quantum computing is a disruptive technology that will redefine computing and the complexity of problems that we can solve,” Braithwaite said. “For our national security and to fuel our economy, it is imperative the United States and its allies win the quantum computing race.  I am proud to serve on Atom Computing’s board of directors to help the company achieve its mission in this critical new domain.”

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. We collaborate with researchers, organizations, governments, and companies to develop world-changing tools and solutions; and support the growing global ecosystem. Learn more at atom-computing.com and follow us on LinkedIn.

Atom Computing and National Renewable Energy Laboratory exploring electric grid optimization using quantum computing

July 20, 2023 — Boulder, CO – Atom Computing and the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) today announced a collaboration to explore how quantum computing can help optimize electric grid operations.

During this week’s IEEE Power and Energy Society general meeting, NREL researchers demonstrated how they incorporated Atom Computing’s atomic array quantum computing technologies into the lab’s Advanced Research on Integrated Energy Systems (ARIES) research platform and its hardware-in-the-loop testing, creating a first-of-a-kind “quantum-in-the-loop” capability that can run certain types of optimization problems on a quantum computer.

Dr. Rob Hovsapian, a research advisor at NREL, called the new capability an important step toward understanding how quantum computers can better balance energy loads across an electric grid. 

“Electric grids are increasingly complex as we add new power generation resources such as wind and solar, electric vehicle charging, sensors and other devices,” he said.  “We are reaching the point where electric grids have more inputs and outputs than what our classical computing models can handle. By incorporating quantum computing into our testing platform, we can begin exploring how this technology could help solve certain problems.”

Optimization problems such as managing supply chains, devising more efficient transportation routes, and improving electric grid and telecommunications networks are considered “killer applications” for quantum computing. These are large-scale problems with numerous factors and variables involved, which makes them well suited for quantum computers and the way in which they run calculations. 

Keeping power flowing across an electric grid is a good example of an optimization problem. Power plants, wind turbines, and solar farms must generate enough electricity to meet demand, which can fluctuate depending on the time of day and weather conditions.  This electricity is then routed across miles and miles of transmission lines and delivered to homes, businesses, hospitals, and other facilities in real time.

Initially, NREL and Atom Computing are exploring how quantum computing can improve decisions about re-routing power between feeder lines, which carry electricity from a substation to a local or regional service area, when a switch or line goes down.

“Right now, operators primarily rely on their own experience to make this decision,” Hovsapian said. “This works but it doesn’t necessarily result in an optimal solution.  We are evaluating how a quantum computer can provide better data to make these decisions.”

Atom Computing CEO Rob Hays called the project an important example of how private industry and national laboratories can collaborate on quantum computing technology and valuable use case development. 

“Collaborations like this are extremely important for advancing quantum computing and scientific research,” Hays said.  “NREL is a global leader in renewable energy and electric grids.  We are proud to partner with them to advance their research.”

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. We collaborate with researchers, organizations, governments, and companies to help develop quantum-enabled tools and solutions; and support the growing global ecosystem. Learn more at atom-computing.com, and follow us on LinkedIn and Twitter.

Meet Qualition: Winners of the Atom-sponsored Visualization Challenge

Kortny Rolston-Duce, Director of Marketing Communications

In February, quantum researchers, developers and enthusiasts gathered online to listen to speakers, complete coding challenges, and compete in hackathons as part of QHack 2023.  

More than 2,800 people from 105 countries participated in this year’s event, which was organized by Xanadu and was part expo, part hackathon, and part scientific conference.  Since its inception in 2019, the multi-day QHack has become one of the largest online gatherings for the quantum computing community.

As part of its ongoing support for the growing quantum ecosystem, Atom Computing sponsored the visualization hackathon for QHack 2023.  A team from Qualition won the challenge after presenting a prototype for a quantum image processing framework.

We met with Qualition CEO Alish Shamskhoozani, Chief Technology Officer Amirali Malekani Nezhad, and other members of the team to learn more about their organization, their winning project, and why they believe image processing is a killer application for quantum computing.

First off, tell us about Qualition.  Can you please describe what Qualition is and does?

Qualition was launched in 2022 by a group of researchers and developers located around the world with the common goal of advancing the nascent quantum computing industry and turning hype around this technology into reality.  Qualition, which is derived from “Quantum Coalition,” has expertise in a wide variety of quantum computing applications.

We take great pride in being mentored by Womanium, an organization dedicated to promoting diversity and inclusion in the field of quantum technology.  Moreover, we are honored to have technical ambassadors from some of the top quantum technology companies in the industry, including Quantinuum, qBraid, QWORLD, and CQTech. These partnerships provide us with the expertise and guidance of some of the most respected names in the field and enable us to collaborate on projects that have the potential to make a significant impact.

Whether through our research initiatives, our participation in quantum hackathons, or our partnerships with industry leaders, Qualition is committed to making a meaningful contribution to the advancement of quantum technology.

What project did your team complete for the QHack 2023 challenge?

Qualition presented a prototype for a quantum image processing framework built on efficient quantum information encoding algorithms: a distributed amplitude encoder and a linearly scalable encoder inspired by FRQI, drawing on the paper “Quantum pixel representations and compression for N-dimensional images” by Mercy G. Amankwah, Daan Camps, E. Wes Bethel, Roel Van Beeumen, and Talita Perciano. The framework is capable of tackling high-dimensional data.  We demonstrated its features and performance through two branches of applications: Recognition and Alteration use cases.

Recognition encompasses all classes of classification tasks, whereas Alteration encompasses all classes of filters and provides a practical, fully quantum framework for enhancing or processing images using techniques similar to traditional image processing. Taken together, the two methods can accommodate all traditional image processing algorithms in a fully quantum mechanical framework.

We presented a comprehensive analysis of a variety of pipeline options and assessed overall performance as a function of the scale of the model, the fidelity of the encoder, and the accuracy of the trained model. This work serves as the foundation for Qualition’s Quantum Image Processing framework.

What were the team’s findings? Was there an advantage to using quantum computing?

Image processing is founded on the mathematics of linear algebra, where images are mathematically represented as vectors (much like quantum states), and the operations applied to images, also known as filters, are mathematically represented as matrices (much like quantum operators).
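A tiny numerical example of this parallel (our own illustration, not Qualition's framework): a 2x2 grayscale image is amplitude-encoded as a normalized four-component state, and a unitary acts on it exactly like a filter acts on an image.

```python
import numpy as np

image = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
state = image.flatten() / np.linalg.norm(image)  # amplitude encoding: a unit vector

# A Hadamard on the "row" qubit mixes the rows of the image -- one simple
# real-valued "quantum filter".
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
U = np.kron(H, np.eye(2))

filtered = U @ state
print(filtered.reshape(2, 2))  # view the filtered state as an image again
```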

Given the mathematical parallels between image processing and quantum mechanics, we can treat image processing as a quantum evolution, and thus anticipate a significant advantage when performing image processing tasks on quantum hardware. Quantum parallelism and the presence of quantum interference allow us to achieve a speedup when performing these tasks through a modeling of states and unitary operations. Assuming the near-term existence of more fault-tolerant quantum hardware with higher quantum volumes, we can expect to tackle large classification tasks much faster than the equivalent classical paradigms.

The Recognition model is perhaps the most advantageous application of a quantum-enabled image processing model, courtesy of the low dimensionality of the output, which is usually a label and therefore easily extracted with a low number of shots. In contrast, the Alteration model faces an inherent challenge: it requires the user to read out a large state vector, which in turn requires a considerably high number of shots for an adequately accurate demonstration.  As quantum computing hardware improves, however, so will the potential advantage of employing the quantum filters characteristic of Alteration use cases to carry out quantum image processing. Furthermore, our current research paves the way for the development of feasible generative quantum AI models, something we are striving for in the year to come!

In conclusion, we are proud to demonstrate initial evidence that quantum image processing is a promising avenue for performing classification with large models. The current body of work at Qualition can be further extended to a variety of modalities and generalized into a first-of-its-kind “High Dimensional QML” framework feasible on current hardware. These pursuits will be integral to demonstrating quantum hardware’s advantage for real-life industrial use cases across a vast array of fields, and certain aspects of the framework can be extended to other areas of quantum computing, e.g., large-scale QAOA, more complex quantum drug simulation, and even faster quantum key distribution.

Why is image processing a strong application for quantum computing?

Image processing is an inherently resource-intensive task due to its high-dimensional nature. Classical paradigms have been performing quite well, as can be seen in a variety of papers and products for the classification, alteration, and generation aspects of AI and machine learning. However, the computational cost of these models and the time taken to train them grow as we scale the models, and the increased demands on classical computing hardware may impact the environment.

Given the mathematical similarities between image processing and quantum mechanics, we can demonstrate a significant advantage when performing the classification tasks using quantum hardware. Assuming we can feasibly implement the models on current hardware, we can expect significant speedups in training and inference time as compared to the equivalent classical models.

That said, image processing is perhaps the perfect use case for demonstrating quantum advantage in industrial applications on NISQ-era hardware, for both scientific and artistic endeavors.

Why did Qualition select the visualization challenge?

Our Quantum Image Processing framework provides a user-friendly illustration for understanding the effect of quantum evolution operators on a given state, similar to that of a Bloch Circle, but generalized to N-dimensional states.

Given the mathematical correspondence, we can infer that real-valued quantum states can be represented as images, and thus we can visually observe the effect of applying any real-valued quantum operation to the state through the change in the state’s image representation. 

This visualization can be expanded into a procedural generation illustrating the gradual change in the quantum state as an operation is applied incrementally, e.g., applying an RY gate while gradually increasing the rotation angle.
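A minimal sketch of that procedural visualization (again our own toy code with invented values, not Qualition's implementation): sweep the RY angle and print each frame of the evolving image-state.

```python
import numpy as np

def ry(theta):
    """Real-valued single-qubit rotation RY(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

image = np.array([[0.9, 0.1], [0.2, 0.8]])
state = image.flatten() / np.linalg.norm(image)  # amplitude-encoded image

# Gradually increase the rotation angle to generate animation "frames".
for theta in np.linspace(0.0, np.pi, 5):
    U = np.kron(ry(theta), np.eye(2))  # RY applied to the row qubit
    frame = (U @ state).reshape(2, 2)
    print(f"theta = {theta:.2f}:\n{frame}\n")
```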

What are some areas/industries in which quantum-enabled image processing could be used?

Quantum image processing can be applied to virtually any image processing task present in industry. Some examples that would benefit considerably from such a quantum-enabled model are medical imaging (e.g., cancer detection, disease classification, and segmentation for neurological diseases), satellite imaging (e.g., object detection, land cover and land use detection, and change detection), and real-time unmanned vehicle routing (e.g., autonomous vehicles, unmanned drones, and robotics).

Quantum-enabled image processing obviates the need to remove certain features for faster inference, a step often necessary in classical deep learning models. Therefore, using quantum techniques, we can capture and retain more information. In the case of medical imaging, the ability to retain key information may be the difference between finding a small tumor or not.

We look forward to seeing what the Qualition team does next!

Atom Computing Demonstrates Key Milestone on Path to Fault Tolerance

Rob Hays, CEO

Today, researchers at Atom Computing released a pre-print publication on arXiv, demonstrating the ability to perform mid-circuit measurement on arbitrary qubits without disrupting others in an atomic array. (Read the updated article published in Physical Review X.) The team applied mid-circuit measurement to detect the loss of qubits from the array (a well-known anticipated error), and successfully replaced the lost qubits from a nearby reservoir.

Path to Fault Tolerance

At Atom Computing, we believe the true potential of quantum computing will be achieved when devices are capable of fault-tolerant computation.  Our company’s focus is on leading that race to unlock the enormous potential of quantum computing applications for industrial and scientific uses. 

Dr. John Preskill, famed theoretical physics professor at California Institute of Technology, who coined the phrase Noisy Intermediate Scale Quantum (NISQ) to describe the current stage of quantum computing, said it best in 2018: “We must not lose sight of the essential longer-term goal: hastening the onset of the fault tolerant era.”

It will likely require hundreds of thousands or millions of physical qubits to achieve fault-tolerant systems that can operate continuously and deliver accurate results, overcoming any errors that may occur during computation just as classical computing systems do.  Mid-circuit measurement is one of several key building blocks required to achieve fault-tolerant systems.

Mid-circuit measurement has been demonstrated on superconducting and trapped-ion quantum technologies.  Atom Computing, however, is the first company to do so on a commercial atomic array system.  Recent achievements in the neutral atom research community have shown that atomic arrays are emerging from “dark horse” status to become a preferred architecture with intriguing potential.

Importance of Mid-Circuit Measurement

In quantum computing, circuits act as instructions that tell a quantum computer how to perform a calculation.  Circuits define how the programmer intends for the qubits to interact, which gates they need to complete, and in what order they need to be performed.

Mid-circuit measurement involves probing the quantum state of certain qubits, known as ancillas, without disrupting the nearby data qubits that perform calculations.  The ability to measure or read out specific qubits during computation without disrupting the rest is essential for quantum developers.  It enables them to glimpse inside a calculation and use conditional branching to determine which action to take based on the results of the measurement, similar to IF/THEN statements in classical computing. With this capability, errors can be detected, identified, and corrected in real time.
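As a minimal illustration of this IF/THEN pattern, here is a sketch in Qiskit (assuming a backend that supports mid-circuit measurement and `if_test`; this is not specific to Atom Computing's hardware): an ancilla is measured mid-circuit and conditionally reset, while the data qubit stays coherent.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 1)   # qubit 0: data qubit, qubit 1: ancilla
qc.h(0)                     # some computation on the data qubit
qc.h(1)                     # put the ancilla into superposition
qc.measure(1, 0)            # mid-circuit measurement of the ancilla only

# Feed-forward branch: if the ancilla read out as 1, flip it back to |0>.
with qc.if_test((qc.clbits[0], 1)):
    qc.x(1)

qc.cx(0, 1)                 # reuse the reset ancilla; the data qubit kept its state
```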

Dr. Ben Bloom, Atom Computing Founder and Chief Technology Officer, called this demonstration an important step for the company’s technology, which uses lasers to hold neutral atom qubits in a two-dimensional array to perform computations.

“This is further proof that atomic array quantum computers are rapidly gaining ground in the race to build large-scale, fault-tolerant quantum computers,” Bloom said. “Mid-circuit measurement enables us to understand what is happening during a computation and make decisions based on the information we are seeing.”

Doing this is tricky. Qubits, whether they are in an atomic array, an ion trap, or on a chip, are situated microscopically close together. Qubits are finicky, fragile, and sensitive.  A stray photon from a laser or a stray electric field can cause the wrong qubit to decohere and lose its quantum state.

The Atom Computing team demonstrated a technique to “hide” data qubits and shield them from the laser used to measure ancillas, without losing any of the quantum information stored in the data qubits. The team also showed a competitive SPAM (state preparation and measurement) fidelity, a metric that captures how well a qubit can be initialized and read out. This work demonstrates an important pathway to continuous circuit processing.

What’s Next

Atom Computing is building quantum computers from arrays of neutral atoms because of the potential to significantly scale qubit numbers with each generation.  We previously demonstrated record coherence times on our 100-qubit prototype system and are now working on larger scale production systems to offer as a commercial cloud service.  Our demonstration of mid-circuit measurement, error detection, and correction was performed on these next-generation systems.

Our journey toward fault tolerance continues. We are working to achieve all the necessary “ingredients” listed above on our current systems and on future machines with the Defense Advanced Research Projects Agency.  DARPA selected Atom Computing to explore how atomic arrays of neutral atoms can accelerate the path to fault-tolerant quantum computing.


Atom Computing Welcomes New Scientific Advisors

Dr. Jonathan King, Chief Scientist and Co-Founder

I am pleased to welcome two renowned researchers in the field of quantum information science to Atom Computing as independent scientific advisors. 

Dr. Bert de Jong, Senior Scientist and Deputy Director of the Quantum Systems Accelerator at Lawrence Berkeley National Laboratory, and Dr. Eliot Kapit, Associate Professor of Physics and Director of Quantum Engineering at Colorado School of Mines, join our long-time advisor Dr. Jun Ye, Professor of Physics at the University of Colorado-Boulder and Fellow at JILA and the National Institute of Standards and Technology.  Together, these scientists and academic leaders will help us advance the state of the art in quantum computing by providing deep technical expertise and guidance to our team.

Since its inception in 2018, Atom Computing has been building quantum computers from atomic arrays of optically trapped neutral atoms with the goal of delivering large-scale fault-tolerant quantum computing for a range of use cases and applications.  Building high-quality quantum computers is hard work that requires tight collaboration across our team of theoretical and experimental physicists and engineers of multiple disciplines. 

We also frequently consult with researchers from other organizations to solve difficult technical challenges.  In addition to our scientific advisors, we have active collaborations with experts from Sandia National Laboratories, DARPA, the University of Washington, the University of Colorado-Boulder, and others to support R&D of technologies required for our future roadmap.

We are at the point where we are focusing not only on developing our hardware, but also what users can do with it to solve commercial problems at scale.  Bert and Eliot are experts in quantum computing use cases and algorithms.  Their expertise will help our quantum applications team and customers learn how to get the most value out of our hardware platforms.

Bert leads Lawrence Berkeley National Laboratory’s Computational Sciences department and its Applied Computing for Scientific Discovery Group. His group’s research encompasses exascale computing, quantum computing, and AI for scientific discovery.  (View his bio.)

When asked about the role, Bert stated, "Atom Computing's path to universal quantum computing with atom arrays is exciting, and I am honored to be a scientific advisor providing critical input and guidance into their software and algorithm strategies.”

Eliot’s research at Colorado School of Mines focuses on quantum information science, particularly routes to achieve practical and scalable quantum advantage in noisy, near-term hardware.  (View his bio). 

Here is what Eliot said about his new role: “I'm excited to have joined the scientific advisory board at Atom Computing. Neutral atom quantum computing has progressed shockingly quickly in just the past few years - and I say this as someone who's been working on superconducting qubits for the past decade, which certainly haven't been standing still.  I think Atom, in particular, has both a compelling roadmap toward large-scale quantum computers and a very capable team to make it happen.”

I am looking forward to future collaborations with Bert, Eliot, and Jun to drive large-scale quantum computing with applications that change the world.


Atom Computing selected by DARPA to accelerate scalable quantum computing with atomic arrays of neutral atoms

January 31, 2023 — Berkeley, CA – Atom Computing has been selected by the Defense Advanced Research Projects Agency (DARPA) to explore how atomic arrays of neutral atoms could accelerate the path to fault-tolerant quantum computing. 

The company received a project award and funding to develop a next-generation system as part of DARPA’s Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program.  According to DARPA, the primary goal of the US2QC program is to determine if an underexplored approach to quantum computing is capable of achieving utility-scale operation much sooner than conventional predictions.

For this project, Atom Computing will focus on the scalability of atomic array-based quantum computing and the capability of the company to produce systems based on the technology.  Atom Computing has provided early demonstrations of the speed and scalability of its technology: it was the fastest company to develop a 100-qubit prototype and has demonstrated record coherence times.

The DARPA-sponsored project will explore new ways to scale qubit count for larger systems, additional layers of entanglement connectivity for faster performance, and a broader set of quantum error correction algorithms for fault tolerance.

“In order to realize the scaling advantages of our quantum computing technology, there are a number of engineering challenges that need to be overcome.  With DARPA’s support, we will be able to accelerate our development timeframe,” said Rob Hays, CEO of Atom Computing.  “We are honored to be selected for such an important program to advance Atom Computing and the United States toward utility-scale quantum computing.”

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. Learn more at atom-computing.com, and follow us on LinkedIn and Twitter.