Atom Computing Demonstrates Key Milestone on Path to Fault Tolerance

Rob Hays, CEO

Today, researchers at Atom Computing released a pre-print publication on arXiv, demonstrating the ability to perform mid-circuit measurement on arbitrary qubits without disrupting others in an atomic array. The team applied mid-circuit measurement to detect the loss of qubits from the array (a well-known, anticipated error) and successfully replaced the lost qubits from a nearby reservoir.

Path to Fault Tolerance

At Atom Computing, we believe the true potential of quantum computing will be achieved when devices are capable of fault-tolerant computation.  Our company’s focus is on leading that race to unlock the enormous potential of quantum computing applications for industrial and scientific uses. 

Dr. John Preskill, famed theoretical physics professor at California Institute of Technology, who coined the phrase Noisy Intermediate Scale Quantum (NISQ) to describe the current stage of quantum computing, said it best in 2018: “We must not lose sight of the essential longer-term goal: hastening the onset of the fault tolerant era.”

It will likely require hundreds of thousands or even millions of physical qubits to achieve fault-tolerant systems that can operate continuously and deliver accurate results, overcoming any errors that occur during computation just as classical computing systems do.  Mid-circuit measurement is one of several key building blocks required to achieve fault-tolerant systems.

Mid-circuit measurement has been demonstrated on superconducting and trapped-ion quantum technologies.  Atom Computing, however, is the first company to do so on a commercial atomic array system.  Recent achievements in the neutral atom research community have shown that atomic arrays are emerging from their “dark horse” status to become a preferred architecture with intriguing potential.

Importance of Mid-Circuit Measurement

In quantum computing, circuits act as instructions that tell a quantum computer how to perform a calculation.  Circuits define how the programmer intends for the qubits to interact, which gates they need to complete, and in what order they need to be performed.

Mid-circuit measurement involves probing the quantum state of certain qubits, known as ancillas, without disrupting nearby data qubits that perform calculations.  The ability to measure or read out specific qubits during computation without disrupting the rest is essential for quantum developers.  It enables them to glimpse inside a calculation and use conditional branching to determine which action to take based on the results of the measurement, similar to IF/THEN statements in classical computing. With this capability, errors can be detected, identified, and corrected in real time.
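The measure-then-branch pattern can be illustrated with a toy classical model (this is a sketch of the control flow only, not Atom Computing's hardware or error-correction scheme): a three-bit repetition code in which parity "ancillas" are checked mid-circuit, and a correction is applied only when the syndrome calls for it.

```python
import random

def run_repetition_code(error_prob=0.1, trials=1000, seed=0):
    """Toy classical model of mid-circuit syndrome measurement:
    a logical bit is stored in three physical bits, ancilla-style
    parity checks are read out mid-circuit, and a correction is
    applied conditionally (IF/THEN) based on the syndrome."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        data = [0, 0, 0]                       # encode logical 0
        for i in range(3):                     # independent bit-flip noise
            if rng.random() < error_prob:
                data[i] ^= 1
        # "mid-circuit measurement" of two parity ancillas
        s1 = data[0] ^ data[1]
        s2 = data[1] ^ data[2]
        # conditional branching on the measurement results
        if s1 and not s2:
            data[0] ^= 1                       # syndrome points at bit 0
        elif s1 and s2:
            data[1] ^= 1                       # syndrome points at bit 1
        elif s2 and not s1:
            data[2] ^= 1                       # syndrome points at bit 2
        logical = max(set(data), key=data.count)   # majority-vote readout
        failures += (logical != 0)
    return failures / trials

print(run_repetition_code())  # logical failure rate, well below error_prob
```

A single flipped bit is always caught and undone; only two or more simultaneous flips cause a logical failure, so the failure rate scales roughly with the square of the physical error rate. That suppression is the essence of why real-time syndrome measurement matters.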

Dr. Ben Bloom, Atom Computing Founder and Chief Technology Officer, called this demonstration an important step for the company’s technology, which uses lasers to hold neutral atom qubits in a two-dimensional array to perform computations.

“This is further proof that atomic array quantum computers are rapidly gaining ground in the race to build large-scale, fault-tolerant quantum computers,” Bloom said. “Mid-circuit measurement enables us to understand what is happening during a computation and make decisions based on the information we are seeing.”

Doing this is tricky. Qubits, whether they are in an atomic array, an ion trap, or on a chip, sit microscopically close together. Qubits are finicky, fragile, and sensitive.  A stray photon from a laser or a stray electric field can cause the wrong qubit to decohere and lose its quantum state.

The Atom Computing team demonstrated a technique to “hide” data qubits and shield them from the laser used to measure ancillas, without losing any of the quantum information stored in the data qubits. They also showed a competitive SPAM (state preparation and measurement) fidelity, a metric that quantifies how reliably a qubit can be prepared and read out. This work demonstrates an important pathway to continuous circuit processing.

What’s Next

Atom Computing is building quantum computers from arrays of neutral atoms because of the potential to significantly scale qubit numbers with each generation.  We previously demonstrated record coherence times on our 100-qubit prototype system and are now working on larger scale production systems to offer as a commercial cloud service.  Our demonstration of mid-circuit measurement, error detection, and correction was performed on these next-generation systems.

Our journey toward fault tolerance continues. We are working to achieve all the necessary “ingredients” on our current systems and on future machines with the Defense Advanced Research Projects Agency.  DARPA selected Atom Computing to explore how atomic arrays of neutral atoms can accelerate the path to fault-tolerant quantum computing.


Atom Computing Welcomes New Scientific Advisors

Dr. Jonathan King, Chief Scientist and Co-Founder

I am pleased to welcome two renowned researchers in the field of quantum information science to Atom Computing as independent scientific advisors. 

Dr. Bert de Jong, Senior Scientist and Deputy Director of the Quantum Systems Accelerator at Lawrence Berkeley National Laboratory, and Dr. Eliot Kapit, Associate Professor of Physics and Director of Quantum Engineering at Colorado School of Mines, join our long-time advisor Dr. Jun Ye, Professor of Physics at University of Colorado-Boulder and Fellow at JILA and the National Institute of Standards and Technology.  Together, these scientists and academic leaders will help us advance the state of the art in quantum computing by providing deep technical expertise and guidance to our team.

Since its inception in 2018, Atom Computing has been building quantum computers from atomic arrays of optically trapped neutral atoms with the goal of delivering large-scale fault-tolerant quantum computing for a range of use cases and applications.  Building high-quality quantum computers is hard work that requires tight collaboration across our team of theoretical and experimental physicists and engineers of multiple disciplines. 

We also frequently consult with researchers from other organizations to solve difficult technical challenges.  In addition to our scientific advisors, we have active collaborations with experts from Sandia National Laboratories, DARPA, University of Washington, University of Colorado-Boulder, and others to support R&D of technologies required for our future roadmap.

We are at the point where we are focusing not only on developing our hardware, but also on what users can do with it to solve commercial problems at scale.  Bert and Eliot are experts in quantum computing use cases and algorithms.  Their expertise will help our quantum applications team and customers learn how to get the most value out of our hardware platforms.

Bert leads Lawrence Berkeley National Laboratory’s Computational Sciences department and its Applied Computing for Scientific Discovery Group. His group’s research encompasses exascale computing, quantum computing and AI for scientific discovery.

When asked about the role, Bert stated, "Atom Computing's path to universal quantum computing with atom arrays is exciting, and I am honored to be a scientific advisor providing critical input and guidance into their software and algorithm strategies.”

Eliot’s research at Colorado School of Mines focuses on quantum information science, particularly routes to achieve practical and scalable quantum advantage in noisy, near-term hardware.

Here is what Eliot said about his new role: “I'm excited to have joined the scientific advisory board at Atom Computing. Neutral atom quantum computing has progressed shockingly quickly in just the past few years - and I say this as someone who's been working on superconducting qubits for the past decade, which certainly haven't been standing still.  I think Atom, in particular, has both a compelling roadmap toward large-scale quantum computers and a very capable team to make it happen.”

I am looking forward to future collaborations with Bert, Eliot, and Jun to drive large-scale quantum computing with applications that change the world.


Atom Computing Selected by DARPA to Accelerate Scalable Quantum Computing with Atomic Arrays of Neutral Atoms

January 31, 2023 — Berkeley, CA – Atom Computing has been selected by the Defense Advanced Research Projects Agency (DARPA) to explore how atomic arrays of neutral atoms could accelerate the path to fault-tolerant quantum computing. 

The company received a project award and funding to develop a next-generation system as part of DARPA’s Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program.  According to DARPA, the primary goal of the US2QC program is to determine if an underexplored approach to quantum computing is capable of achieving utility-scale operation much sooner than conventional predictions.

For this project, Atom Computing will focus on the scalability of atomic array-based quantum computing and the company’s capability to produce systems based on the technology.  Atom Computing has already provided early demonstrations of its speed and the scalability of its technology, developing a 100-qubit prototype faster than any other company and demonstrating record coherence times.

The DARPA-sponsored project will explore new ways to scale qubit count for larger systems, additional layers of entanglement connectivity for faster performance, and a broader set of quantum error correction algorithms for fault tolerance.

“In order to realize the scaling advantages of our quantum computing technology, there are a number of engineering challenges that need to be overcome.  With DARPA’s support, we will be able to accelerate our development timeframe,” said Rob Hays, CEO of Atom Computing.  “We are honored to be selected for such an important program to advance Atom Computing and the United States toward utility-scale quantum computing.”

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. Learn more at atom-computing.com, and follow us on LinkedIn and Twitter.

What are Optical Tweezer Arrays and How are They Used in Quantum Computing? Atom Computing’s Remy Notermans Explains.

In recent months, researchers from different institutions won major physics awards for advancing optical tweezer arrays and their use in quantum information sciences.

These announcements drew broader attention to optical tweezer arrays, even in the physics community.  At Atom Computing, however, they are always top-of-mind – optical tweezers are critical to our atomic array quantum computing technology. 

What are optical tweezer arrays and how and why do we use them in our quantum computers? Dr. Remy Notermans, who helped develop the optical tweezer array for Phoenix, our prototype system, answers these questions and more.

What are optical tweezer arrays?
A single optical tweezer is a beam of light used to capture atoms, molecules, cells, or nanoparticles and hold them in place or move them as needed.

This is possible because light can attract or repel a particle depending on the color (wavelength) of the light and the absorption properties (electronic energy level structure) of the particle.  By choosing the right wavelength, a particle will be drawn to the region with the highest intensity of light, trapped in what is known as a potential well (an energy landscape in which the atom settles toward the lowest point).

An optical tweezer is created when a laser beam is focused through a microscope objective lens. As the laser beam is focused, it forms a “tweezer” capable of holding minuscule objects and manipulating them at its focal point. Think of the tractor beam from Star Trek.

To create an optical tweezer array, the laser beam is manipulated before it is focused through the microscope objective lens, producing a custom-made array of optical tweezers that can be tailored to specific needs – topology, dimensions, and orientation.
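As a rough sketch of the underlying physics (the beam parameters below are illustrative assumptions, not Atom Computing's specifications), the intensity profile of a focused Gaussian beam shows why the focal point forms a trap: intensity peaks there, and for an appropriately chosen wavelength the trapping potential is deepest where intensity is highest.

```python
import math

def gaussian_beam_intensity(r, z, power=1e-3, waist=1e-6, wavelength=0.8e-6):
    """Intensity (W/m^2) of a focused Gaussian beam at radial distance r
    and axial distance z from the focus. Parameter values (1 mW power,
    1 micron waist, 800 nm light) are hypothetical, for illustration only."""
    z_r = math.pi * waist**2 / wavelength       # Rayleigh range
    w = waist * math.sqrt(1 + (z / z_r)**2)     # beam radius at distance z
    peak = 2 * power / (math.pi * w**2)         # on-axis intensity at z
    return peak * math.exp(-2 * r**2 / w**2)

# For a trapping (red-detuned) wavelength the dipole potential is
# proportional to -I, so the atom is pulled toward peak intensity:
I_focus = gaussian_beam_intensity(0, 0)
I_off_axis = gaussian_beam_intensity(1e-6, 0)
I_off_focus = gaussian_beam_intensity(0, 5e-6)
print(I_focus > I_off_axis and I_focus > I_off_focus)  # True
```

Intensity falls off in every direction away from the focus, which is exactly the "potential well" described above: the atom sits at the bottom, at the focal point.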

Are optical tweezer arrays a new technology?
Optical tweezers have been used by researchers in the fields of medicine, genetics, and chemistry for decades. In fact, Arthur Ashkin, “the father of optical tweezers,” was awarded the Nobel Prize in Physics in 2018. Ashkin’s work dates to 1970, when he first detected the effect of optical scattering forces on micron-sized particles.  He and some of his colleagues later observed a focused beam of light holding tiny particles in place – the first optical tweezers.

More recent scientific work has expanded to actual arrays of optical tweezers, allowing for studying many particles simultaneously, biophysics research, and of course quantum information processing.

How does Atom Computing use optical tweezer arrays? What are the benefits?
Optical tweezers are critical to our atomic array quantum computing technology, which uses neutral atoms as qubits.  We reflect a laser beam off a spatial light modulator to create an array of many optical tweezers that each “trap” an individual qubit.  For example, Phoenix, our 100-qubit prototype quantum computer, has more than 200 optical tweezers created from a single laser. Each tweezer can be individually calibrated and optimized to ensure precise control. 

Optical tweezer arrays enable us to fit many qubits in a very small amount of space, which means that scaling the number of qubits by orders of magnitude does not significantly change the size of our quantum processing unit.  By integrating clever optical designs, we foresee a sustainable path toward atomic arrays that are large enough for fault-tolerant quantum computing.

In fact, optical tweezers inspired the Atom Computing logo.  If you turn our “A” logo upside down, it is a visual representation of an optical tweezer holding an atom in a potential well.

Are optical tweezer arrays used for other purposes?
Yes, optical tweezer arrays have been used extensively by researchers in other scientific fields. They have been used by scientists to trap living cells, viruses, bacteria, molecules, and even DNA strands so they can be studied.

Has the work of the New Horizons Physics Prize winners influenced Atom Computing’s approach? If so, how? 
Academic breakthroughs like these are a fundamental part of the academic-industrial ecosystem, which is why Atom Computing is involved in many partnerships and funds academic research efforts that can help propel our technology forward.  Combined with the knowledge and experience of our world-class engineering teams, we take these breakthroughs to the next level in terms of scalability, robustness, and systems integration.


What Developers Need to Know about our Atomic Array Quantum Computing Technology

Justin Ging, Chief Product Officer

If you are a developer working in the quantum computing space, you are likely familiar with superconducting or trapped-ion quantum computers, and you may have run a circuit on one.

These two technologies were the early pioneers of the quantum hardware landscape and small versions of each have been available commercially for years.  A major challenge with these approaches is how to scale them to thousands or millions of qubits with error correction.

More recently, an alternative quantum computing technology with the potential to scale much more quickly and easily has emerged: systems based on atomic arrays of neutral atoms.  These systems have inherent advantages, which have led multiple teams to develop them.

But just as there is more than one way to cook an egg, there are different approaches to building quantum computers from atomic arrays.

At Atom Computing, we are pioneering an approach to deliver highly scalable gate-based quantum computing systems with large numbers of qubits, long coherence times, and high fidelities. 

Here are some key advantages of our atomic array quantum computing technology:

  1. Long coherence times.  Most quantum hardware companies measure coherence in units of milliseconds.  We measure it in seconds. The Atom team recently set a record for the longest coherence time in a quantum computer with Phoenix, our first-generation 100-qubit system. Phoenix demonstrated qubit coherence times of 21 seconds.  The longer qubits maintain their quantum state, the better: developers can run deeper circuits for more complex calculations, and there is more time to detect and correct errors during computation.  How do we create such long-lived qubits? We use alkaline earth atoms for our qubits. These atoms do not have an electrical charge, which is why they are called “neutral.”  Each atom is identical, which helps with quality control, and they are highly immune to environmental noise.
  2. Flexible, gate-based architecture.  Atom Computing is focused on developing a flexible and agile platform for quantum computing by supporting a universal quantum gate set that can be programmed using standard industry quantum development platforms.  This gate-based approach allows developers to create a wide range of quantum algorithms for many use cases.  Our qubit connectivity relies on Rydberg interactions: laser pulses excite atoms to a highly energized level in which their electrons orbit the nucleus at a greater distance than in the ground state, allowing them to interact with nearby atoms.
  3. Designed to scale.  Neutral atoms can be tightly packed into a computational array of qubits, making the quantum processor core just fractions of a cubic millimeter.  Lasers hold the atomic qubits in position in this tight array and manipulate their quantum states wirelessly with pulses of light to perform computations. This arrangement of individually trapped atoms, spaced only microns apart, allows for massive scalability, as it is possible to expand the qubit array size without substantially changing the overall footprint of the system.  For example, at a 4-micron pitch between each atom and arranged in a 3D array, a million qubits could fit in less than 1/10th of a cubic millimeter volume.
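The back-of-the-envelope arithmetic behind that last figure can be checked directly (a sanity check using only the numbers quoted above):

```python
# One million qubits at a 4-micron pitch in a cubic 3D array:
pitch_um = 4.0                          # spacing between trapped atoms
n_per_side = 100                        # 100 x 100 x 100 = 1,000,000 qubits
side_mm = n_per_side * pitch_um * 1e-3  # 0.4 mm per side
volume_mm3 = side_mm ** 3
print(volume_mm3)                       # ~0.064 mm^3, under 1/10 mm^3
```

A 0.4 mm cube holding a million qubits is the scaling argument in miniature: qubit count grows with the cube of the array side, while the processor core stays smaller than a grain of sand.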

Developers looking for gate-based quantum computers with large numbers of qubits and long coherence times should consider partnering with Atom Computing.  We are working with private beta partners to facilitate their research on our platforms. Have questions about partnering? Contact us.

Silicon Valley Up-Start, Atom Computing, Chooses Colorado to Build Next-Generation Quantum Computers

September 28, 2022 — Boulder, CO — Atom Computing today announced the opening of its new research and development facility in Boulder during a ceremony attended by industry and academic partners, officials from federal, state, and local government, and representatives from Colorado’s Congressional delegation.

The new facility is Atom’s largest to date and will house future generations of its highly scalable quantum computers, which use atomic arrays of optically-trapped neutral atoms. The company opened its first office, which also serves as its global headquarters, in Berkeley, California in 2018.

Governor Jared Polis called the Boulder facility a significant and important investment in Colorado and evidence the state is emerging as the preeminent hub for quantum computing innovation in the U.S. and globally.

“We are excited to welcome Atom Computing to Boulder, which is already one of the world’s most booming centers for the quantum computing sector,” Polis said. “The addition of Atom Computing helps further position Colorado as an economic leader for the next big wave of technology development and will create more good-paying jobs for Coloradans.”

The Boulder facility represents an important milestone for Atom Computing, which raised $60 million through a Series B earlier this year to build its second-generation systems. The company’s 100-qubit prototype system, Phoenix, is housed in Berkeley and recently set an industry record for coherence time.

“Leading researchers and companies are choosing to partner with Atom Computing to develop quantum-enabled solutions because our atomic arrays have the potential to scale larger and faster than other qubit technologies,” said Rob Hays, CEO of Atom Computing.

Hays said the company chose Colorado because of the quantum expertise and top talent in the area and plans to expand its presence in the state. 

“We expect to invest $100 million in Colorado over the next three years as we develop our roadmap and hire more employees to support those efforts,” he said.

Ben Bloom, Atom Computing’s founder and CTO, said the company’s strong ties to Colorado also contributed to its decision to build a facility in Boulder.

“Many of our team members, myself included, have connections with local universities,” said Bloom, who earned a Ph.D. from University of Colorado-Boulder where he helped renowned physicist, Dr. Jun Ye, build one of the world’s most accurate atomic clocks. “We are committed to Colorado.”

Jun Ye, who currently serves as Atom’s Scientific Advisor, called the new facility an important addition to the quantum ecosystem.

"It is extremely gratifying to see our recent CU graduates emerge as the early trailblazers of the rapidly growing quantum industry,” said Ye, a physics professor at CU Boulder. “This creates a powerful ecosystem for the best science and technology to develop side-by-side, providing outstanding opportunities for Colorado students to lead the next wave of innovations in quantum research and the market.”

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically-trapped neutral atoms, empowering researchers and companies to achieve unprecedented breakthroughs. Learn more at atom-computing.com, and follow our journey on LinkedIn and Twitter.

Atom Computing Sets a World-Record Coherence Time for Neutral Atom Qubits

Following a number of accomplishments in 2021, Atom Computing last week announced it had set a qubit coherence time record longer than that of any other commercial quantum platform.