Novel Solutions for Continuously Loading Large Atomic Arrays

Researchers at Atom Computing have invented a way to keep the atomic array at the heart of the company’s quantum computing technology continuously populated with qubits.

In a preprint article on arXiv, the Atom team describes its approach both to assembling a 1,200-plus qubit array and to overcoming atom loss, a major technical challenge for quantum computers that use neutral atoms as qubits.

Dr. Ben Bloom, Founder and Chief Technology Officer, said the advancements ensure that as Atom Computing scales to larger numbers of qubits, its technologies can efficiently perform mid-circuit measurement, which is necessary for quantum error correction, and other operations.

“All quantum computing technologies need to demonstrate the ability to scale with high-fidelity qubits,” he said. “Neutral atom systems are no exception.”

Technical challenges

Atom loss occurs for numerous reasons.  Qubits can be knocked out of place by stray atoms in the vacuum chamber or occasionally disrupted during “read out,” when they are imaged to check their quantum state.

All quantum computers that use individual atoms as qubits  (such as trapped ion or neutral atom systems) experience atom loss.  But the problem is particularly acute for neutral atom quantum computing technologies.

With neutral atom technologies, developers often assemble arrays (two-dimensional grids) with extra qubits to act as a buffer.  The system still experiences loss but has enough qubits to run calculations.  

Another approach involves slowing down the read out to reduce the number of qubits lost during the process.  But this method has the disadvantage of slowing down operations and is less efficient.  

It also is difficult to replace individual qubits within an array. The conventional practice is to throw out the entire array and replace it with a new one, which becomes unwieldy and time consuming as systems scale.

“Error correction will require quantum information to survive long past the lifetime of a single atom, and we need to find effective strategies to reach and stay at very large qubit numbers,” Bloom said.

Atom Computing’s approach

To overcome these challenges, Atom researchers have engineered novel solutions into the company’s next-generation quantum computing systems, which will be commercially available within the next year. 

As outlined in the arXiv paper, Atom Computing has developed a method to continuously load ytterbium atoms into the computation zone and store them until needed.  Individual atoms are then moved into the array using optical tweezers (concentrated beams of light) to replace missing qubits.

In addition, our researchers have designed and installed an optical cavity in our quantum computers that creates large numbers of deep energy wells to hold qubits tightly in place while they are imaged. These deeper wells help protect qubits, reducing the number that are lost during read out.

These innovations have enabled the Atom Computing team to demonstrate that we can consistently fully load a large atomic array with more than a thousand qubits.

“We’ve known from the beginning that we needed to overcome these challenges if our technology is to successfully scale, operate efficiently, and achieve our goal of fault-tolerant quantum computing,” Bloom said. “It’s exciting to be able to showcase the solutions we have been developing for the past several years.”

Leveraging Atom's technology to improve constrained optimization algorithms

Jonathan King, Co-Founder and Chief Scientist

Researchers at Atom Computing have formulated a new method for stabilizing optimization algorithms that could lead to more reliable results from early-stage quantum computers.

In a pre-print article posted on arXiv, the authors describe how the method, known as subspace correction, leverages key features of Atom Computing’s atomic array quantum computing technologies, notably long qubit coherence times and mid-circuit measurement.  The technique detects whether an algorithm is violating constraints – essentially going off track during computation – and then self-corrects.

Subspace correction could reduce the number of circuits developers need to run on a quantum computer to get the correct answer for optimization problems.

We sat down with Dr. Kelly Ann Pawlak and Dr. Jeffrey Epstein, Senior Quantum Applications Engineers at Atom Computing and the lead authors of the paper, to learn more.

Tell us more about this technique. What do you mean by “subspace correction?” How does it work?

Kelly: First let’s talk about quantum error correction, one of the most important topics in quantum computing. In quantum error correction, we use mid-circuit measurements to detect the state of the quantum system and identify whether errors, such as bit flips, have occurred during operation. If we detect an error, we can apply a correction using feed-forward capabilities and resume the calculation.

Subspace correction works the same way, except it isn’t detecting and correcting “operational errors” from electrical noise or thermal effects. Instead, it probes information relevant to solving a computational problem. In lieu of looking for bit flip errors due to stray radiation, it can check whether a solution to an industrial optimization problem obeys all of its feasibility constraints. It can do so in the middle of a calculation without destroying the “quantumness” of the data.

In short, subspace correction really tries to leverage the methods behind quantum error correction.  It is a method of encoding redundancy checks into a quantum computation in order to solve high-value problems, not just the basic quantum control problems.
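
To make the pattern Kelly describes concrete, here is a minimal sketch in plain Python: extract check information mid-computation, decide classically whether a constraint was violated, and feed a correction forward. A classical bitstring stands in for the quantum register, and the graph, check functions, and correction rule are our own illustrative choices, not the protocol from the paper.

```python
# Illustrative sketch only (not Atom Computing's implementation) of the shared
# pattern behind quantum error correction and subspace correction:
#   1) extract partial "check" information mid-computation,
#   2) decide classically whether the state has left the good subspace,
#   3) feed a correction forward and continue.
# A classical bitstring stands in for the quantum register here.

def parity_check(bits, support):
    """QEC-style check: parity of the bits in a stabilizer's support."""
    return sum(bits[i] for i in support) % 2

def constraint_check(bits, edges):
    """Optimization-style check: edges whose endpoints are both selected."""
    return [(u, v) for (u, v) in edges if bits[u] == 1 and bits[v] == 1]

def run_with_feedforward(bits, edges):
    """Detect a feasibility violation 'mid-circuit' and correct it."""
    for (u, v) in constraint_check(bits, edges):
        bits[v] = 0            # feed-forward correction: drop one endpoint
    return bits

edges = [(0, 3), (1, 2)]                           # a toy constraint graph
print(parity_check([1, 0, 1, 0], [0, 1, 2]))       # -> 0 (no error flagged)
print(run_with_feedforward([1, 1, 1, 0], edges))   # -> [1, 1, 0, 0]
```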

Can you give us background on this technique? What inspired it?

Kelly:  Atom Computing is pursuing quantum computer design practices that will enable us to achieve fault-tolerant operation as soon as possible. So for us the engineering pipeline for implementing quantum error correction, rather than general NISQ operation, is our primary focus.

We challenged ourselves to answer the question: “Given our strengths and thoughtful engineering of a platform fully committed to running QEC functionality as soon as possible, are there any problems we can try to speed up on the way to early fault tolerance?” The answer is “Yes!”

It turns out that techniques of error correction are effective algorithmic tools for many problems. As mentioned, we have looked specifically at problems that have constraints, like constrained optimization and satisfaction type problems in classical computing. You can find examples of these problems everywhere in finance, manufacturing, logistics and science. 

Jeffrey: One of the things that’s interesting about quantum error correction is that it involves interplay between classical and quantum information. By extracting partial information about the state of the quantum computer and using classical processing to determine appropriate corrections to make, you can protect the coherent, quantum information stored in the remaining degrees of freedom of the processor. The starting point for these kinds of protocols is the choice of a particular subspace of the possible states of the quantum computer, which you take to be the “meaningful” states or the ones on which we encode information.  The whole machinery of quantum error correction is then about returning to this space when physical errors take you outside of it.

Many optimization problems naturally have this structure of “good” states because the problems are defined in terms of two pieces - a constraint and a cost function. The “good” states satisfy the constraint, and the answer should be chosen to minimize the cost function within that set of states. So we would like a method to ensure that our optimization algorithm does in fact return answers that correspond to constraint-satisfying states. As in the case of quantum error correction, we’d like to maintain coherence within this subspace to take advantage of any speedup that may be available for optimization problems. A possible difference is that in some cases, it may be that the method can still work for finding good solutions even if the correction strategy returns the computer to a different point in the good subspace than where it was when an error occurred, which in the case of error correction would correspond to a logical error.
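
As a small illustration of the “good” subspace Jeffrey describes (our own example, not taken from the paper), the following snippet enumerates which computational-basis states of a four-vertex path graph satisfy the independent-set constraint:

```python
# Our illustrative example, not from the paper: enumerate which basis states
# of a small graph lie in the "good" subspace of independent sets, i.e. no
# edge has both of its endpoints selected.
from itertools import product

edges = [(0, 1), (1, 2), (2, 3)]   # a 4-vertex path graph
n = 4

def in_good_subspace(bits):
    return all(not (bits[u] and bits[v]) for u, v in edges)

good = [bits for bits in product([0, 1], repeat=n) if in_good_subspace(bits)]
print(len(good), "of", 2 ** n, "basis states satisfy the constraint")  # 8 of 16
```

A constrained optimizer then minimizes its cost function over these states only; subspace correction is the machinery for steering the register back into this set when a check detects it has left.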

Why is this technique important? How will it help quantum algorithm developers?

Kelly: Right now, it is really difficult to ensure an algorithm obeys the constraints of a problem when run on a gate-based quantum computer. Typically, you run a calculation several times, toss out bad results, and identify a solution.  With subspace correction, you can check to make sure the calculation is obeying constraints and if it is not, correct it.  We think this approach will reduce the number of circuit executions that need to be run on early-stage quantum computers.  It will save a lot of computational time and overhead.

Were you able to simulate or test the technique on optimization problems? If so, what were the results?

Jeffrey: One of the things we show in the paper is that the state preparation protocol we describe for the independent set problem has the same statistics as a classical sampling algorithm. This characteristic makes it possible to simulate its performance on relatively large graphs, although it also directly implies that the method cannot provide a sampling speedup over classical methods. Our hope is that by combining this method with optimization techniques such as Grover search, there might be speedups for some classes of graphs. We’re planning to investigate this possibility in more detail, both theoretically and using larger scale simulations in collaboration with Lawrence Berkeley National Laboratory.
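
As a hedged illustration of the kind of classical sampling process being referred to (the actual state-preparation protocol is described in the paper, not reproduced here), a random-greedy sampler that draws independent sets of a graph looks like this:

```python
# Generic random-greedy sampler for independent sets; an illustration of a
# classical sampling process, not the state-preparation protocol from the paper.
import random

def sample_independent_set(n, edges, rng=random.Random(0)):
    adjacency = {v: set() for v in range(n)}
    for u, v in edges:
        adjacency[u].add(v)
        adjacency[v].add(u)
    selected = set()
    for v in rng.sample(range(n), n):          # visit vertices in random order
        if adjacency[v].isdisjoint(selected):  # keep the set independent
            selected.add(v)
    return selected

print(sample_independent_set(5, [(0, 1), (1, 2), (2, 3), (3, 4)]))
```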

Can subspace correction be applied to other problems?

Kelly: Certainly yes. Constraints are just one kind of “subspace” we can correct. We have a lot of ideas about how to apply this method to improve quantum simulation algorithms. When running a quantum simulation algorithm, you can detect when the simulation goes off course, by breaking energy conservation laws for example, and try to fix it. We would also like to explore using this method to prepare classical data in a quantum computer.

Can this method be used with other types of quantum computers?

Kelly: It could! But it would be nearly impossible for some quantum computing architectures to run parts of this algorithm prior to full fault tolerance due to the long coherence requirements. Even in early fault tolerance, where some QC architectures are racing against the clock to do quantum error correction, it would be very difficult.

Jeffrey: Everything in the method we studied lives within the framework of universal gate-based quantum computing, so our analysis doesn’t actually depend specifically on the hardware Atom Computing or any other quantum computing company is developing - it could be implemented on any platform that supports mid-circuit measurement and feedback. But the performance will depend a lot on the speed of classical computation relative to circuit times and coherence times, and our neutral atom device with long-lived qubits gives us a clear advantage.

Figure: A demonstration of how the distribution-generating primitive works on a graph using SSC.
Figure: An overview of the paradigm we use for algorithm development; it follows the same pattern as error correction.

Meet Qualition: Winners of the Atom-sponsored Visualization Challenge

Kortny Rolston-Duce, Director of Marketing Communications

In February, quantum researchers, developers and enthusiasts gathered online to listen to speakers, complete coding challenges, and compete in hackathons as part of QHack 2023.  

More than 2,800 people from 105 countries participated in this year’s event, which was organized by Xanadu and was part expo, hackathon, and scientific conference.  Since its inception in 2019, the multi-day QHack has become one of the largest online gatherings for the quantum computing community.

As part of its ongoing support for the growing quantum ecosystem, Atom Computing sponsored the visualization hackathon for QHack 2023.  A team from Qualition won the challenge after presenting a prototype for a quantum image processing framework.

We met with Qualition CEO Alish Shamskhoozani, Chief Technology Officer Amirali Malekani Nezhad, and other members of the team to learn more about their organization, their winning project, and why they believe image processing is a killer application for quantum computing.

First off, tell us about Qualition.  Can you please describe what Qualition is and does?

Qualition was launched in 2022 by a group of researchers and developers located around the world with the common goal of advancing the nascent quantum computing industry and turning hype around this technology into reality.  Qualition, which is derived from “Quantum Coalition,” has expertise in a wide variety of quantum computing applications.

We take great pride in being mentored by Womanium, an organization dedicated to promoting diversity and inclusion in the field of quantum technology.  Moreover, we are honored to have technical ambassadors from some of the top quantum technology companies in the industry, including Quantinuum, qBraid, QWORLD, and CQTech. These partnerships provide us with the expertise and guidance of some of the most respected names in the field and enable us to collaborate on projects that have the potential to make a significant impact.

Whether through our research initiatives, our participation in quantum hackathons, or our partnerships with industry leaders, Qualition is committed to making a meaningful contribution to the advancement of quantum technology.

What project did your team complete for the QHack 2023 challenge?
Qualition presented a prototype for a quantum image processing framework based on efficient quantum information encoding algorithms (a distributed amplitude encoder and a linearly scalable FRQI encoder inspired by the paper “Quantum pixel representations and compression for N-dimensional images” by Mercy G. Amankwah, Dr. Daan Camps, Prof. E. Wes Bethel, Dr. Roel Van Beeumen, and Dr. Talita Perciano) capable of tackling high-dimensional data.  We demonstrated the features and performance of the framework through two branches of applications: Recognition and Alteration use cases.

Recognition encompasses all classes of classification tasks, whereas Alteration encompasses all classes of filters, and provides a practical, fully-quantum framework for enhancing or processing images using techniques with similarities to traditional image processing. The two methods taken together can accommodate all traditional image processing algorithms in a fully quantum mechanical framework.

We presented a comprehensive analysis of a variety of pipeline options and assessed overall performance as a function of  the scale of the model, the fidelity of the encoder, and the accuracy of the trained model. This work serves as the foundation for Qualition’s Quantum Image Processing framework.

What were the team’s findings? Was there an advantage to using quantum computing?

Image processing was founded on the mathematics of linear algebra, where images are mathematically represented as vectors (similar to quantum states), and operations applied to the images, also known as filters, are mathematically represented as matrices (similar to quantum operators).

Given the mathematical parallels between image processing and quantum mechanics, we consider image processing as a quantum evolution, and thus anticipate a significant advantage when performing image processing tasks using quantum hardware. Quantum parallelism and the presence of quantum interference allow us to achieve a speedup when performing these tasks through a modeling of states and unitary operations. Assuming the near-term existence of more fault-tolerant quantum hardware with higher quantum volumes, we can expect to tackle large classification tasks much faster than the equivalent classical paradigms.
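
A tiny numpy sketch of that parallel (our own example, not Qualition’s code): a grayscale image flattened and normalized like a state vector, with a real orthogonal matrix playing the role of a quantum filter.

```python
# Our own toy example, not Qualition's code: amplitude-encode a tiny image
# as a normalized vector and apply a "filter" that is a real orthogonal
# (hence unitary) matrix, mirroring a quantum operation on a state.
import numpy as np

image = np.array([[0.0, 0.5], [1.0, 0.25]])      # 2x2 grayscale image
state = image.flatten()
state = state / np.linalg.norm(state)            # amplitude-encoded "state"

theta = np.pi / 8                                # filter: a rotation on one qubit
filter_op = np.kron(np.eye(2),
                    np.array([[np.cos(theta), -np.sin(theta)],
                              [np.sin(theta),  np.cos(theta)]]))

filtered_state = filter_op @ state
print(filtered_state.reshape(2, 2))              # view the result as an image
```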

The Recognition model is perhaps the most advantageous application of a quantum-enabled image processing model courtesy of the low dimensionality of the output, which is usually a label, and therefore easily extracted with a low number of shots. In contrast to the Recognition model, there exists an inherent challenge for the Alteration model. It requires the user to read out a large state vector, which in turn requires a considerably high number of shots for an adequately accurate demonstration.  As quantum computing hardware improves, however, so will the potential advantage of employing quantum filters characteristic of Alteration use cases to carry out  quantum image processing. Furthermore, our current research paves the way for development of feasible generative quantum AI models, something which we are striving for in the year to come!

In conclusion, we are proud to demonstrate initial evidence that quantum image processing is a promising avenue for performing classification for large models. The current body of work at Qualition can be further extended to a variety of modalities and generalized to be the first of its kind, “High Dimensional QML,” feasible on current hardware. These pursuits will be integral to demonstrating quantum hardware’s advantage for real-life industrial use cases over a vast array of fields, and certain aspects of the framework can be extended to other fields of quantum computing, e.g., large-scale QAOA, more complex quantum drug simulation, and even faster quantum key distribution.

Why is image processing a strong application for quantum computing?

Image processing is an inherently resource-intensive task due to its high-dimensional nature. Classical paradigms have been performing quite well, as can be seen in a variety of papers and products for the classification, alteration, and generation aspects of AI and machine learning. However, the computational cost of the models and the time taken to train them grow as we scale these models, and may impact the environment due to the increased demands on classical computing hardware.

Given the mathematical similarities between image processing and quantum mechanics, we can demonstrate a significant advantage when performing the classification tasks using quantum hardware. Assuming we can feasibly implement the models on current hardware, we can expect significant speedups in training and inference time as compared to the equivalent classical models.

For these reasons, image processing is perhaps the perfect use case for demonstrating quantum advantage in industrial use cases on NISQ-era hardware, for both scientific and artistic endeavors.

Why did Qualition select the visualization challenge?

Our Quantum Image Processing framework provides a user-friendly illustration for understanding the effect of quantum evolution operators on a given state, similar to that of a Bloch Circle, but generalized to N-dimensional states.

Given the mathematical correspondence, we can infer that real-valued quantum states can be represented as images, and thus we can visually observe the effect of applying any real-valued quantum operation to the state through the change in the state’s image representation. 

This visualization can be expanded to a procedural generation illustrating the gradual change in the quantum state by gradually applying the operation, e.g., applying an RY gate while gradually increasing the rotation angle.
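
Here is a minimal numpy sketch of that idea (our own illustration, not Qualition’s code): apply an RY rotation in small angle increments to a real-valued two-qubit state and view each intermediate state as a 2x2 image.

```python
# Our own illustration, not Qualition's code: sweep an RY rotation angle and
# render each intermediate real-valued state as a tiny 2x2 "image".
import numpy as np

def ry(theta):
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

state = np.array([1.0, 0.0, 0.0, 0.0])             # two-qubit real-valued state
for step in range(5):
    angle = step * np.pi / 8
    frame = np.kron(ry(angle), np.eye(2)) @ state  # RY on the first qubit
    print(f"angle={angle:.2f}", frame.reshape(2, 2))
```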

What are some areas/industries in which quantum-enabled image processing could be used?
Quantum image processing can be applied to virtually any image processing task present in the industry. Some examples that would considerably benefit from such a quantum-enabled model are medical imaging (e.g., cancer detection, disease classification, segmentation for neurological diseases), satellite imaging (e.g., object detection, land cover and land use detection, and change detection), and real-time unmanned vehicle routing (e.g., autonomous vehicles, unmanned drones, and robotics).

Quantum-enabled image processing obviates the need to remove certain features for faster inference, a necessary step in classical deep learning models. Therefore, using quantum techniques, we can capture and retain more information. In the case of medical imaging, the ability to retain key information content may be the difference between finding a small tumor or not.

We look forward to seeing what the Qualition team does next!

Atom Computing Demonstrates Key Milestone on Path to Fault Tolerance

Rob Hays, CEO

Today, researchers at Atom Computing released a pre-print publication on arXiv, demonstrating the ability to perform mid-circuit measurement on arbitrary qubits without disrupting others in an atomic array. (Read the updated article published in Physical Review X.) The team applied mid-circuit measurement to detect the loss of qubits from the array (a well-known anticipated error), and successfully replaced the lost qubits from a nearby reservoir.

Path to Fault Tolerance

At Atom Computing, we believe the true potential of quantum computing will be achieved when devices are capable of fault-tolerant computation.  Our company’s focus is on leading that race to unlock the enormous potential of quantum computing applications for industrial and scientific uses. 

Dr. John Preskill, famed theoretical physics professor at California Institute of Technology, who coined the phrase Noisy Intermediate Scale Quantum (NISQ) to describe the current stage of quantum computing, said it best in 2018: “We must not lose sight of the essential longer-term goal: hastening the onset of the fault tolerant era.”

It will likely require hundreds of thousands or millions of physical qubits to achieve fault-tolerant systems that can operate continuously and deliver accurate results, overcoming any errors that may occur during computation just as classical computing systems do.  Mid-circuit measurement is one of several key building blocks required to achieve fault-tolerant systems.

Mid-circuit measurement has been demonstrated on superconducting and trapped-ion quantum technologies.  Atom Computing, however, is the first company to do so on a commercial atomic array system.  Recent achievements in the neutral atom research community have shown that atomic arrays are emerging from “dark horse” status to become a preferred architecture with intriguing potential.

Importance of Mid-Circuit Measurement

In quantum computing, circuits act as instructions that tell a quantum computer how to perform a calculation.  Circuits define how the programmer intends the qubits to interact, which gates they need to perform, and in what order.

Mid-circuit measurement involves probing the quantum state of certain qubits, known as ancillas, without disrupting nearby data qubits that perform calculations.  The ability to measure or read out specific qubits during computation without disrupting the rest is essential for quantum developers.  It enables them to glimpse inside a calculation and use conditional branching to determine which action to take based on the results of the measurement, similar to IF/THEN statements used in classical computing. With this capability, errors can be detected, identified, and corrected in real time.
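
To make the idea concrete, here is a toy three-qubit statevector simulation in Python (our own sketch, not Atom Computing’s software): two data qubits hold a logical superposition, an ancilla extracts their parity, and an IF/THEN branch on the mid-circuit measurement result corrects a bit-flip error while leaving the superposition intact.

```python
# Toy simulation (our sketch, not Atom Computing's stack) of mid-circuit
# measurement with feed-forward: qubits 0 and 1 are data, qubit 2 is the
# ancilla. A bit-flip error on a data qubit is detected via a parity check
# on the ancilla and corrected conditionally, preserving the superposition.
import numpy as np

N = 3

def apply_x(state, q):
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << q)] = amp               # X flips bit q of each basis state
    return out

def apply_cnot(state, control, target):
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << target) if (i >> control) & 1 else i] = amp
    return out

def measure(state, q):
    p1 = sum(abs(a) ** 2 for i, a in enumerate(state) if (i >> q) & 1)
    outcome = int(np.random.random() < p1)    # deterministic in this example
    keep = [((i >> q) & 1) == outcome for i in range(len(state))]
    projected = np.where(keep, state, 0.0)
    return outcome, projected / np.linalg.norm(projected)

# Data qubits in the superposition (|00> + |11>)/sqrt(2); ancilla in |0>.
state = np.zeros(2 ** N)
state[0b000] = state[0b011] = 1 / np.sqrt(2)

state = apply_x(state, 0)                     # a bit-flip error hits data qubit 0
state = apply_cnot(state, 0, 2)               # map data-qubit parity onto ancilla
state = apply_cnot(state, 1, 2)
outcome, state = measure(state, 2)            # mid-circuit measurement of ancilla

if outcome == 1:                              # IF/THEN feed-forward correction
    state = apply_x(state, 0)                 # undo the bit flip
    state = apply_x(state, 2)                 # reset the ancilla to |0>

print("syndrome:", outcome)                   # 1 in this example
print(state[0b000], state[0b011])             # ~0.707 each: superposition restored
```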

Dr. Ben Bloom, Atom Computing Founder and Chief Technology Officer, called this demonstration an important step for the company’s technology, which uses lasers to hold neutral atom qubits in a two-dimensional array to perform computations.

“This is further proof that atomic array quantum computers are rapidly gaining ground in the race to build large-scale, fault-tolerant quantum computers,” Bloom said. “Mid-circuit measurement enables us to understand what is happening during a computation and make decisions based on the information we are seeing.”

Doing this is tricky. Qubits, whether they are in an atomic array, an ion trap, or on a chip, are situated microscopically close together. Qubits are finicky, fragile, and sensitive.  A stray photon from a laser or a stray electric field can cause the wrong qubit to decohere and lose its quantum state.

The Atom Computing team exhibited a technique to “hide” data qubits and shield them from the laser used to measure ancillas without losing any of the quantum information stored in the data qubits. They also showed a competitive SPAM (state preparation and measurement) fidelity, a metric that indicates how well a qubit can be prepared and read out. This work demonstrates an important pathway to continuous circuit processing.

What’s Next

Atom Computing is building quantum computers from arrays of neutral atoms because of the potential to significantly scale qubit numbers with each generation.  We previously demonstrated record coherence times on our 100-qubit prototype system and are now working on larger scale production systems to offer as a commercial cloud service.  Our demonstration of mid-circuit measurement, error detection, and correction was performed on these next-generation systems.

Our journey toward fault tolerance continues. We are working to achieve all the necessary “ingredients” on our current systems and on future machines with the Defense Advanced Research Projects Agency.  DARPA selected Atom Computing to explore how atomic arrays of neutral atoms can accelerate the path to fault-tolerant quantum computing.


Atom Computing Welcomes New Scientific Advisors

Dr. Jonathan King, Chief Scientist and Co-Founder

I am pleased to welcome two renowned researchers in the field of quantum information science to Atom Computing as independent scientific advisors. 

Dr. Bert de Jong, Senior Scientist and Deputy Director of the Quantum Systems Accelerator at Lawrence Berkeley National Laboratory, and Dr. Eliot Kapit, Associate Professor of Physics and Director of Quantum Engineering at Colorado School of Mines, join our long-time advisor Dr. Jun Ye, Professor of Physics at University of Colorado-Boulder and Fellow at JILA and the National Institute of Standards and Technology.  Together, these scientists and academic leaders will help us advance the state of the art in quantum computing by providing deep technical expertise and guidance to our team.

Since its inception in 2018, Atom Computing has been building quantum computers from atomic arrays of optically trapped neutral atoms with the goal of delivering large-scale fault-tolerant quantum computing for a range of use cases and applications.  Building high-quality quantum computers is hard work that requires tight collaboration across our team of theoretical and experimental physicists and engineers of multiple disciplines. 

We also frequently consult with researchers from other organizations to solve difficult technical challenges.  In addition to our scientific advisors, we have active collaborations with experts from Sandia National Laboratories, DARPA, University of Washington, University of Colorado-Boulder, and others to support R&D of technologies required for our future roadmap.

We are at the point where we are focusing not only on developing our hardware, but also what users can do with it to solve commercial problems at scale.  Bert and Eliot are experts in quantum computing use cases and algorithms.  Their expertise will help our quantum applications team and customers learn how to get the most value out of our hardware platforms.

Bert leads Lawrence Berkeley National Laboratory’s Computational Sciences department and its Applied Computing for Scientific Discovery Group. His group’s research encompasses exascale computing, quantum computing, and AI for scientific discovery.  (View his bio.)

When asked about the role, Bert stated, "Atom Computing's path to universal quantum computing with atom arrays is exciting, and I am honored to be a scientific advisor providing critical input and guidance into their software and algorithm strategies.”

Eliot’s research at Colorado School of Mines focuses on quantum information science, particularly routes to achieve practical and scalable quantum advantage in noisy, near-term hardware.  (View his bio). 

Here is what Eliot said about his new role: “I'm excited to have joined the scientific advisory board at Atom Computing. Neutral atom quantum computing has progressed shockingly quickly in just the past few years - and I say this as someone who's been working on superconducting qubits for the past decade, which certainly haven't been standing still.  I think Atom, in particular, has both a compelling roadmap toward large scale quantum computers, and a very capable team to make it happen.”

I am looking forward to future collaborations with Bert, Eliot, and Jun to drive large-scale quantum computing with applications that change the world.

Two (electrons) is Better than One

Kortny Rolston-Duce, Director of Marketing Communications

February 7th is National Periodic Table Day, a time to celebrate this ubiquitous chart.  

While versions of element tables sorted by various properties or masses have existed throughout the centuries, the modern Periodic Table of the Elements  emerged in the 1860s. It is arranged by increasing atomic number, which represents the number of protons in the nucleus of each atom.  

Here at Atom Computing, we are partial to alkaline earth metals, members of group two on the periodic table.  We use atoms of alkaline earth metals as qubits in our atomic array quantum computing hardware technology.

What are alkaline earth metals? What makes them well suited for our atomic array quantum computing technology? Dr. Mickey McDonald, Principal Quantum Engineer and Technical Lead, explains: 

What are alkaline earth metals?

Alkaline earth metals are all the atoms that live in the second column of the periodic table. The column number corresponds to the number of electrons that are contained in the atom’s outermost shell. Alkaline earth metals have two. There are also a few atoms in the periodic table which don’t live in the second column but still share a very similar electronic structure—the so-called “alkaline earth-like metals.”

Why does Atom Computing use alkaline earth metal atoms as qubits? 

There are two primary reasons.  The first reason is that alkaline earth metal atoms captured in optical tweezers will oscillate at very high frequencies with high stability.  For decades, researchers at places like the National Institute of Standards and Technology (NIST) have pioneered the use of these long-lived “metastable” states to create extremely fast clocks that harness quantum effects to measure time to incredible precision.  We leverage the techniques used for timekeeping and apply them to our quantum computers.

The second reason is the structure of an alkaline earth atom guarantees that certain pairs of energy levels called “nuclear spin states” will be insensitive to external perturbations. In atoms with only a single outer-shell electron, that electron’s “spin” can couple strongly to external magnetic fields and the various angular momenta of the rest of the atom’s structure.  That coupling leads to drifts that can interfere with the qubit’s coherence. The spins of the two electrons present in alkaline earth metals “cancel out” in a way that makes certain energy levels much less sensitive to ambient fields.

What are the benefits of nuclear spin qubits?

Structure-wise, the extra outer-shell electron in alkaline earths is the big difference and is what allows us to create these very environmentally insensitive nuclear spin state qubits.  Once we encode quantum information in the nuclear spin states, the information remains coherent for tens of seconds because the nuclear spin states are so well protected from the environment.

Another benefit is a little more esoteric, but very exciting to me as an engineer and a physicist. We use a technique called “laser cooling” to prepare samples of atoms at temperatures only a few millionths of a degree above absolute zero, which involves shining carefully tuned lasers at atoms and extracting a bit of energy from them one photon at a time. Alkaline earth atoms have a structure complex enough to allow several different colors of laser to be used for this cooling. We combine those different colors to rapidly produce samples of atoms at microkelvin temperatures, whereas other atomic species require more complicated cooling schemes and take longer to reach such low temperatures.

What else makes alkaline earth metal atoms attractive for quantum computing?

We can use the atomic level structure kind of like a Swiss army knife.  Certain states are good for cooling, others are good for storing quantum information, others are good for driving entangling operations.  These unique characteristics make neutral atom qubits and Atom Computing possible!

Atom Computing Honors Lesser-Known Researchers and their Contributions to Quantum Computing    

Kortny Rolston-Duce, Director of Marketing Communications

Walk through any office building and the conference rooms likely are numbered or named according to a theme such as geographic landmarks, fictional characters, or notable figures in a field.

We initially planned to take a similar approach at our newly opened research and development facility in Boulder, Colorado by recognizing well-known physicists whose research laid the groundwork for the rapidly evolving field of quantum computing.  People such as Niels Bohr, whose work on the structure of atoms earned him the 1922 Nobel Prize for Physics, or potentially Richard Feynman, one of the first to put forth the idea of quantum computers.

Then Rob Hays, our CEO, challenged us to go beyond the usual names and consider lesser-known and more diverse researchers.  We gladly accepted the challenge and naively believed such information was a few Google searches away.  It wasn’t.  Finding names and information from verified sources was a struggle – until we discovered the Center for the History of Physics at the American Institute of Physics.

We contacted the center and, to our delight, Joanna Behrman, an assistant public historian, researched a list of scientists who made significant contributions to quantum physics in the early 20th century.  Each of our conference rooms now proudly displays a sign outside the door commemorating one of these heroes of physics and computation, with a brief description of their scientific breakthroughs.

The room names have sparked conversations amongst Atom Computing staff and with customers and other visitors to our Boulder facility.  A few of our physicists didn’t know some of these researchers despite having studied and worked in the field for years.      

Sadly, many of these pioneering scientists struggled to establish careers in science once they earned their doctorates because of the limited opportunities available.

“Most of the major research institutions or universities at that time did not hire female or African-American professors,” Behrman said. “Once they graduated, it was hard for them to find positions in which they could continue their research.  Most ended up teaching at historically black or female universities or colleges and even there, some faced challenges.”

Unfortunately, that is not unusual.

“They faced overwhelming odds during the time at which they were alive to make these contributions. People like Emmy Noether made amazing contributions anyway. Her ideas about universal symmetries and conservation principles are still fundamental to physics today,” Behrman said. “If you don’t get much credit during your lifetime, it’s unlikely that you will get it after you die and people will continue to say ‘Well, I haven’t heard of her so therefore she didn’t make very many contributions.’”    

Behrman, her colleagues at the American Institute of Physics, and other organizations are trying to change that.  A team at the Niels Bohr Library and Archives produces Initial Conditions, a podcast that focuses on understudied stories in the history of physics, while Behrman and others develop lesson plans for teachers.

“In recent decades, many of these people’s names have been wonderfully resurrected from the depths of history. People are writing books and having conferences,” she said. “There are definitely movements within the physics and the history of science communities to make sure that contributions from under-recognized physicists get their due.”

At Atom Computing, we are happy to play a part in this effort and support diversity, equity, and inclusion.   We are now in the process of renaming the conference rooms at our headquarters in Berkeley, California along a similar theme.  There are many more unsung physics heroes who should be celebrated.

Stay tuned.

Tech Predictions: Wrapping Up 2022 and Looking Ahead at 2023

Rob Hays, CEO

At the beginning of last year, I published my 2022 Quantum Computing Tech Predictions.  I expected big advancements in technology development and the breadth of involvement and thought it would be interesting to look back and see how good my crystal ball was.  Here were my top six quantum computing predictions for 2022 and my comments on how it played out.

Newer quantum computing modalities would achieve eye-opening breakthroughs.

I predicted we would see an acceleration of technical demonstrations and product readiness in neutral atom technology, which in turn would build further interest in the modality and elevate awareness to be on par with the earlier technical approaches.

Neutral atom quantum computing technologies gained a lot of traction in 2022.  Various outlets published articles about the promise of neutral atoms, some of which referred to this approach as a “dark horse” in the quantum computing race or noted that it had recently “quickened pulses in the quantum community.”  I like these descriptions because neutral atoms are a younger technology than superconducting or trapped ion systems yet are progressing rapidly.

In May, we published a paper in Nature Communications demonstrating record coherence times of 40 seconds.  In September, researchers from various institutions won the Breakthrough Prize Foundation’s New Horizons in Physics Prize for their work advancing optical tweezers, which are a fundamental component of the neutral atom systems being developed by Atom Computing.  In November, QuEra became the first neutral atom quantum computing company to make their analog quantum computers available on the cloud via Amazon Braket.

Momentum is building in neutral atom-based quantum computing now that the technology stack has been proven to work by multiple academic and industry organizations.  I believe 2023 will be a banner year for demonstrating the scalability of the technology, which is one of the value propositions that is attracting so much attention.

Increased attention on NISQ use-cases at various scales

While hardware providers are playing the long game of scaling the number of qubits to reach fault-tolerance, I predicted we would see application developers find novel ways of using Noisy Intermediate-Scale Quantum (NISQ) machines for a modest set of use-cases in financial services, aerospace, logistics, and perhaps pharma or chemistry. While I expected developers to figure out how to gain value out of NISQ systems with thousands of qubits, I didn’t think the hardware would quite get there in 2022.

Application developers did indeed deliver.  Quantum South partnered with D-Wave to study airplane cargo loading. Japan-based QunaSys proposed a hybrid quantum-classical algorithm for chemistry modeling.  Entropica Labs in Singapore developed an approach for finding minimum vertex cover (an optimization relevant to problems such as water networks, synthetic biology, and portfolio allocation). They also deployed OpenQAOA, an open-source software development kit to create, refine, and run the Quantum Approximate Optimization Algorithm (QAOA) on NISQ devices and simulators.

Leading Cloud Service Providers would double down on quantum computing.

Citing the major cloud service providers’ unique abilities and motivation to own quantum computing hardware and software technologies, I expected US and Chinese providers to double down on quantum computing to ensure they have viable technology options that can meet their “hyper-scale” needs and capture a significant share of the value chain in the future. I predicted some will invest in multiple, competing technologies in the form of organic R&D, acquisitions, and partnerships or all the above in many cases.

This might be the hardest prediction to accurately judge the results of because much of what the cloud service providers do is shrouded in secrecy for years until they are ready to unveil new products and services.  Furthermore, the underlying infrastructure that supports their AI-driven services is often proprietary and never fully disclosed publicly. 

Nevertheless, we can see the importance of quantum computing to these companies based on their significant investments in quantum computing people, facilities, and R&D revealed in public announcements.  The top three U.S.-based cloud service providers each have internal teams investing in quantum computing research to develop platforms of their own.  At the same time, they have public marketplaces to host third-party quantum computing platforms, giving users a best-of-breed choice of hardware on which to run their QIS software applications.  While we didn’t see any major acquisitions of pure-play quantum computing companies by the cloud service providers in 2022, I think it’s only a matter of time until we do.

Investments in Quantum would continue to break records.

Coming off a record 2021, in which venture capital investment in quantum computing surpassed $1.7B according to McKinsey, some were talking of a quantum bubble and a looming quantum winter.  Given that much more investment is required to deliver large-scale quantum computing and the enormous forecasted market value, I expected the amount of money invested to continue rising, with even more companies entering the race.  With one company publicly listed in Q4 2021 and two more companies signaling an intent to IPO, I said I wouldn’t be surprised to see at least five major acquisitions or IPO announcements in 2022.

There’s no doubt that the overall public and private markets took a macro-economic downturn in 2022, with the S&P 500 down more than 19 percent and the nascent publicly traded quantum computing stocks down even further.  The two quantum companies that had announced their intent to IPO indeed did so.  There were rumors of other SPAC deals that were scuttled as the market demand for early-stage IPOs waned.  While I said I wouldn’t have been surprised to see five or more deals, I’m not surprised that we didn’t.  It’s becoming clear that, in the current environment, public markets are not as accepting of pre-profit companies without predictable revenue growth.

Despite the public market turmoil, venture capital investments continued in quantum computing start-ups in 2022, including a $60M Series B by Atom Computing, which we are using to build our next-generation commercial quantum computing systems.  Governments around the world also continued to increase their investments in QIS research programs to gain economic and national security advantages, as described in this World Economic Forum report about global investments in quantum computing.  What’s more, these government investments are increasingly being targeted at technology diversification now that newer quantum technologies have advanced to promising stages.  I expect we will see several public announcements of more investments in these technologies around the world throughout the year.

Diversity and inclusion would be a bigger focus in Quantum Information Science

Like the broader tech industry, the QIS workforce does not represent the diverse population of our society.  I predicted we would see more cooperation in attracting talent to the ecosystem and career development support for women and minorities. As the technology and businesses mature, a broader set of job roles becomes available, expanding beyond the physics labs of our universities. With additional job roles, an expanded talent pool, and the exciting opportunity in quantum computing, I believed we could move the needle in diversity and inclusion.

We did see a lot of focus on diversity and inclusion programs across the industry and some extraordinary efforts.  I applaud the energy and know that we can do more as an industry to help various affinity groups collaborate to advance a common agenda.

We commissioned a private study of gender diversity among the quantum computing hardware companies from publicly-available data gathered from LinkedIn as an indicator of where we stand as an industry.  According to the data set, women make up less than a quarter of the workforce in 13 quantum computing hardware companies evaluated.  Because increasing diversity has been a business priority for our company, I wasn’t surprised to see Atom Computing among the leaders with the greatest mix of female employees at 26 percent.  I was, however, surprised to see that 26 percent is the best.  That’s not nearly good enough - not by a factor of 2!  We can collectively do much better to attract, retain, and promote diverse talent if we work together among competitors and co-travellers to cooperate on supporting diversity in the QIS industry.

Regional Quantum Centers of Excellence would enable tighter collaboration.

It takes a village to build an integrated system of complex hardware, software, and services that interoperate together to deliver a superior user experience. I predicted that collaborations would emerge among universities, government research labs, and private companies to provide environments of innovation in their regions to gain an advantage over other regions in the quantum computing race.

We are seeing quantum ecosystems emerge and/or strengthen across the country.  Atom, for example, joined the University of Colorado’s CUbit Quantum Initiative, the Chicago Quantum Exchange, and the Pistoia Alliance. We also partnered on National Science Foundation proposals to expand regional workforce development programs and bolster the quantum ecosystem in the intermountain West.

The National Science Foundation and other government agencies have been instrumental in the effort to build regional quantum computing hubs that enable researchers from academia, industry, and national labs and other federal entities to connect and collaborate.

Looking ahead in 2023, I think this trend will continue, and we will see many of these organizations, especially those funded by federal agencies, include and/or expand their focus on neutral atom quantum computing technologies and partner with companies like Atom Computing.

While my crystal ball wasn’t perfect in predicting 2022, I would say that the trends I discussed are largely in line with the results achieved as Atom Computing and the quantum computing industry made big strides in bringing scalable quantum computing technology to the hands of researchers and users who are looking to achieve unprecedented computational results.  I think that progress will only accelerate as we head into 2023 and beyond.

What are Optical Tweezer Arrays and How are They Used in Quantum Computing? Atom Computing’s Remy Notermans Explains.

In recent months, researchers from different institutions won major physics awards for advancing optical tweezer arrays and their use in quantum information sciences.

These announcements drew broader attention to optical tweezer arrays, even in the physics community.  At Atom Computing, however, they are always top-of-mind – optical tweezers are critical to our atomic array quantum computing technology. 

What are optical tweezer arrays and how and why do we use them in our quantum computers? Dr. Remy Notermans, who helped develop the optical tweezer array for Phoenix, our prototype system, answers these questions and more.

What are optical tweezer arrays?
A single optical tweezer is a beam of light used to capture atoms, molecules, cells, or nanoparticles, hold them in place or move them as needed. 

This is possible because light can attract or repel a particle depending on the color (wavelength) of the light and the absorption properties (electronic energy level structure) of the particle.  By choosing the right wavelength, a particle will be drawn to the region with the highest intensity of light, trapped in what is known as a potential well (an energy landscape in which the atom settles toward the lowest point).
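
For readers who want the underlying physics, the standard textbook expression for the optical dipole potential (our addition, not from the article) makes the attraction explicit:

```latex
% Standard optical dipole potential (textbook result, not from the article):
% I(\mathbf{r}) is the laser intensity profile and \alpha the atomic
% polarizability at the laser wavelength.
U_{\mathrm{dip}}(\mathbf{r}) = -\,\frac{\operatorname{Re}(\alpha)}{2\,\epsilon_0 c}\, I(\mathbf{r})
% When the wavelength is chosen so that \operatorname{Re}(\alpha) > 0
% (red detuning), the potential is lowest where the intensity is highest,
% so the atom is pulled toward the focus of the tweezer.
```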

An optical tweezer is created when a laser beam is focused through a microscope objective lens. As the laser beam gets focused, it forms a “tweezer” capable of holding minuscule objects and manipulating them at its focal point. Think of the tractor beam from Star Trek.

To create an optical tweezer array, the laser beam is manipulated before it is focused through the microscope objective lens, producing a custom-made array of optical tweezers that can be tailored to specific needs – topology, dimensions, and orientation.

Are optical tweezer arrays a new technology?
Optical tweezers have been used by researchers in the fields of medicine, genetics, and chemistry for decades. In fact, Arthur Ashkin, “the father of optical tweezers,” was awarded the Nobel Prize in Physics in 2018. Ashkin’s work dates to 1970, when he first detected optical scattering forces and their effect on micron-sized particles.  He and some of his colleagues later observed a focused beam of light holding tiny particles in place – or optical tweezers.

More recent scientific work has expanded to actual arrays of optical tweezers, allowing for studying many particles simultaneously, biophysics research, and of course quantum information processing.

How does Atom Computing use optical tweezer arrays? What are the benefits?
Optical tweezers are critical to our atomic array quantum computing technology, which uses neutral atoms as qubits.  We reflect a laser beam off a spatial light modulator to create an array of many optical tweezers that each “trap” an individual qubit.  For example, Phoenix, our 100-qubit prototype quantum computer, has more than 200 optical tweezers created from a single laser. Each tweezer can be individually calibrated and optimized to ensure precise control. 

Optical tweezer arrays enable us to fit many qubits in a very small amount of space, which means that scaling the number of qubits by orders of magnitude does not significantly change the size of our quantum processing unit.  By integrating clever optical designs, we foresee a sustainable path toward atomic arrays that are large enough for fault-tolerant quantum computing.

In fact, optical tweezers inspired the Atom Computing logo.  If you turn our “A” logo upside down, it is a visual representation of an optical tweezer holding an atom in a potential well.

Are optical tweezer arrays used for other purposes?
Yes, optical tweezer arrays have been used extensively by researchers in other scientific fields. They have been used by scientists to trap living cells, viruses, bacteria, molecules, and even DNA strands so they can be studied.

Has the work of the New Horizons Physics Prize winners influenced Atom Computing’s approach? If so, how? 
We understand this is a fundamental part of the academic-industrial ecosystem, which is why Atom Computing is involved with many partnerships and funds academic research efforts that potentially help us propel our technology forward.  Combined with the knowledge and experience of our world-class engineering teams, we take these breakthroughs to the next level in terms of scalability, robustness, and systems integration.


What Developers Need to Know about our Atomic Array Quantum Computing Technology

Justin Ging, Chief Product Officer

If you are a developer working in the quantum computing space, you are likely familiar with superconducting or trapped ion quantum computers, or have run a circuit on one.

These two technologies were the early pioneers of the quantum hardware landscape and small versions of each have been available commercially for years.  A major challenge with these approaches is how to scale them to thousands or millions of qubits with error correction.

More recently, an alternative quantum computing technology with the potential to scale much more quickly and easily has emerged: systems based on atomic arrays of neutral atoms.  These systems have inherent advantages, which have led multiple teams to develop them.

But just as there is more than one way to cook an egg, there are different approaches to building quantum computers from atomic arrays.

At Atom Computing, we are pioneering an approach to deliver highly scalable gate-based quantum computing systems with large numbers of qubits, long coherence times, and high fidelities. 

Here are some key advantages of our atomic array quantum computing technology:

  1. Long coherence times.  Most quantum hardware companies measure coherence in units of milliseconds.  We measure it in seconds. The Atom team recently set a record for the longest coherence time in a quantum computer with Phoenix, our first-generation 100-qubit system. Phoenix demonstrated qubit coherence times of 21 seconds.  The longer qubits maintain their quantum state, the better: developers can run deeper circuits for more complex calculations, and there is more time to detect and correct errors during computation.  How do we create such long-lived qubits? We use alkaline earth atoms for our qubits. These atoms do not have an electrical charge, thus they are “neutral.”  Each atom is identical, which helps with quality control, and neutral atoms are highly immune to environmental noise.
  2. Flexible, gate-based architecture.  Atom Computing is focused on developing a flexible and agile platform for quantum computing by supporting a universal quantum gate set that can be programmed using standard industry quantum development platforms.  This gate-based approach allows developers to create a wide range of quantum algorithms for many use cases.  Our qubit connectivity uses Rydberg interactions, in which laser pulses excite atoms to a highly energized level, causing their outer electrons to orbit the nucleus at a much greater distance than in the ground state so they can interact with nearby atoms.
  3. Designed to scale.  Neutral atoms can be tightly packed into a computational array of qubits, making the quantum processor core just fractions of a cubic millimeter.  Lasers hold the atomic qubits in position in this tight array and manipulate their quantum states wirelessly with pulses of light to perform computations. This arrangement of individually trapped atoms, spaced only microns apart, allows for massive scalability, as it is possible to expand the qubit array size without substantially changing the overall footprint of the system.  For example, at a 4-micron pitch between each atom and arranged in a 3D array, a million qubits could fit in less than 1/10th of a cubic millimeter of volume (see the quick check after this list).
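
A quick back-of-the-envelope check of that volume figure (our arithmetic, not a spec from Atom Computing):

```python
# Back-of-the-envelope check of the scaling claim above: one million atoms
# arranged on a 4-micron-pitch 3D grid (100 x 100 x 100).
pitch_mm = 4e-3                      # 4 microns, expressed in millimeters
atoms_per_side = 100                 # 100^3 = 1,000,000 atoms
side_mm = atoms_per_side * pitch_mm
volume_mm3 = side_mm ** 3
print(side_mm, "mm per side ->", volume_mm3, "cubic mm")   # 0.4 mm -> 0.064 mm^3
```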

Developers looking for gate-based quantum computers with large numbers of qubits and long coherence times should consider partnering with Atom Computing.  We are working with private beta partners to facilitate their research on our platforms. Have questions about partnering? Contact us.