Atom Computing Demonstrates Key Milestone on Path to Fault Tolerance

Rob Hays, CEO

Today, researchers at Atom Computing released a pre-print publication on arXiv demonstrating the ability to perform mid-circuit measurement on arbitrary qubits without disrupting others in an atomic array. The team applied mid-circuit measurement to detect the loss of qubits from the array (a well-known, anticipated error) and successfully replaced the lost qubits from a nearby reservoir.

Path to Fault Tolerance

At Atom Computing, we believe the true potential of quantum computing will be achieved when devices are capable of fault-tolerant computation.  Our company’s focus is on leading that race to unlock the enormous potential of quantum computing applications for industrial and scientific uses. 

Dr. John Preskill, the renowned theoretical physicist at the California Institute of Technology who coined the phrase Noisy Intermediate-Scale Quantum (NISQ) to describe the current stage of quantum computing, said it best in 2018: “We must not lose sight of the essential longer-term goal: hastening the onset of the fault tolerant era.”

It will likely require hundreds of thousands to millions of physical qubits to achieve fault-tolerant systems that can operate continuously and deliver accurate results, overcoming any errors that occur during computation just as classical computing systems do. Mid-circuit measurement is one of several key building blocks required to achieve fault-tolerant systems.

Mid-circuit measurement has been demonstrated on superconducting and trapped-ion quantum technologies. Atom Computing, however, is the first company to do so on a commercial atomic array system. Recent achievements in the neutral atom research community show that atomic arrays are shedding their “dark horse” status and emerging as a preferred architecture with intriguing potential.

Importance of Mid-Circuit Measurement

In quantum computing, circuits act as instructions that tell a quantum computer how to perform a calculation.  Circuits define how the programmer intends for the qubits to interact, which gates they need to complete, and in what order they need to be performed.

Mid-circuit measurement involves probing the quantum state of certain qubits, known as ancillas, without disrupting nearby data qubits that perform calculations.  The ability to measure or read out specific qubits during computation without disrupting the rest is essential for quantum developers.  It enables them to glimpse inside a calculation and use conditional branching to determine which action to take based on the results of the measurement, similar to IF/THEN statements used in classical computing. With this capability, errors can be detected, identified, and corrected in real time.
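As an illustration only (a minimal sketch written against Qiskit, a widely used open-source SDK, not Atom Computing's own toolchain), here is how a developer might combine a mid-circuit measurement of an ancilla with a conditional branch: the ancilla is read out partway through the circuit, and a correction is applied only IF it reports 1.

```python
# Minimal sketch (illustrative, using Qiskit; not Atom Computing's own toolchain):
# an ancilla is measured mid-circuit and a correction is applied conditionally.
from qiskit import QuantumRegister, ClassicalRegister, QuantumCircuit

data = QuantumRegister(2, "data")        # qubits that carry the computation
ancilla = QuantumRegister(1, "anc")      # helper qubit probed mid-circuit
syndrome = ClassicalRegister(1, "syn")   # classical bit holding the result
qc = QuantumCircuit(data, ancilla, syndrome)

qc.h(data[0])
qc.cx(data[0], data[1])                  # prepare an entangled data state

qc.cx(data[0], ancilla[0])               # map the data qubits' parity ...
qc.cx(data[1], ancilla[0])               # ... onto the ancilla
qc.measure(ancilla[0], syndrome[0])      # mid-circuit measurement: ancilla only

with qc.if_test((syndrome, 1)):          # IF the parity flipped THEN correct
    qc.x(data[1])
```

Note that the data qubits are never measured here, which is exactly why the ability to read out an ancilla without disturbing its neighbors matters.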

Dr. Ben Bloom, Atom Computing Founder and Chief Technology Officer, called this demonstration an important step for the company’s technology, which uses lasers to hold neutral atom qubits in a two-dimensional array to perform computations.

“This is further proof that atomic array quantum computers are rapidly gaining ground in the race to build large-scale, fault-tolerant quantum computers,” Bloom said. “Mid-circuit measurement enables us to understand what is happening during a computation and make decisions based on the information we are seeing.”

Doing this is tricky. Qubits, whether they are in an atomic array, an ion trap, or on a chip, sit microscopically close together, and they are finicky, fragile, and sensitive.  A stray photon from a laser or a stray electric field can cause the wrong qubit to decohere and lose its quantum state.

The Atom Computing team demonstrated a technique to “hide” data qubits and shield them from the laser used to measure ancillas, without losing any of the quantum information stored in the data qubits. They also showed a competitive SPAM (state preparation and measurement) fidelity, a metric that indicates how reliably a qubit can be prepared and read out. This work demonstrates an important pathway to continuous circuit processing.

What’s Next

Atom Computing is building quantum computers from arrays of neutral atoms because of the potential to significantly scale qubit numbers with each generation.  We previously demonstrated record coherence times on our 100-qubit prototype system and are now working on larger scale production systems to offer as a commercial cloud service.  Our demonstration of mid-circuit measurement, error detection, and correction was performed on these next-generation systems.

Our journey toward fault tolerance continues. We are working to achieve all the necessary “ingredients” described above on our current systems and on future machines with the Defense Advanced Research Projects Agency (DARPA), which selected Atom Computing to explore how atomic arrays of neutral atoms can accelerate the path to fault-tolerant quantum computing.


Atom Computing Welcomes New Scientific Advisors

Dr. Jonathan King, Chief Scientist and Co-Founder

I am pleased to welcome two renowned researchers in the field of quantum information science to Atom Computing as independent scientific advisors. 

Dr. Bert de Jong, Senior Scientist and Deputy Director of the Quantum Systems Accelerator at Lawrence Berkeley National Laboratory, and Dr. Eliot Kapit, Associate Professor of Physics and Director of Quantum Engineering at Colorado School of Mines, join our long-time advisor Dr. Jun Ye, Professor of Physics at the University of Colorado-Boulder and Fellow at JILA and the National Institute of Standards and Technology.  Together, these scientists and academic leaders will help us advance the state of the art in quantum computing by providing deep technical expertise and guidance to our team.

Since its inception in 2018, Atom Computing has been building quantum computers from atomic arrays of optically trapped neutral atoms with the goal of delivering large-scale fault-tolerant quantum computing for a range of use cases and applications.  Building high-quality quantum computers is hard work that requires tight collaboration across our team of theoretical and experimental physicists and engineers of multiple disciplines. 

We also frequently consult with researchers from other organizations to solve difficult technical challenges.  In addition to our scientific advisors, we have active collaborations with experts from Sandia National Laboratories, DARPA, the University of Washington, the University of Colorado-Boulder, and others to support R&D of technologies required for our future roadmap.

We are at the point where we are focusing not only on developing our hardware, but also on what users can do with it to solve commercial problems at scale.  Bert and Eliot are experts in quantum computing use cases and algorithms.  Their expertise will help our quantum applications team and customers learn how to get the most value out of our hardware platforms.

Bert leads Lawrence Berkeley National Laboratory’s Computational Sciences department and its Applied Computing for Scientific Discovery Group. His group’s research encompasses exascale computing, quantum computing, and AI for scientific discovery.  (View his bio.)

When asked about the role, Bert stated, "Atom Computing's path to universal quantum computing with atom arrays is exciting, and I am honored to be a scientific advisor providing critical input and guidance into their software and algorithm strategies.”

Eliot’s research at Colorado School of Mines focuses on quantum information science, particularly routes to achieve practical and scalable quantum advantage in noisy, near-term hardware.  (View his bio). 

Here is what Eliot said about his new role: “I'm excited to have joined the scientific advisory board at Atom Computing. Neutral atom quantum computing has progressed shockingly quickly in just the past few years - and I say this as someone who's been working on superconducting qubits for the past decade, which certainly haven't been standing still.  I think Atom, in particular, has both a compelling roadmap toward large scale quantum computers, and a very capable team to make it happen.”

I am looking forward to future collaborations with Bert, Eliot, and Jun to drive large-scale quantum computing with applications that change the world.

Two (electrons) is better than one

Kortny Rolston-Duce, Director of Marketing Communications

February 7th is National Periodic Table Day, a time to celebrate this ubiquitous chart.  

While versions of element tables sorted by various properties or masses have existed for centuries, the modern Periodic Table of the Elements emerged in the 1860s. It is arranged by increasing atomic number, which represents the number of protons in the nucleus of each atom.

Here at Atom Computing, we are partial to alkaline earth metals, members of group two on the periodic table.  We use atoms of alkaline earth metals as qubits in our atomic array quantum computing hardware technology.

What are alkaline earth metals? What makes them well suited for our atomic array quantum computing technology? Dr. Mickey McDonald, Principal Quantum Engineer and Technical Lead, explains: 

What are alkaline earth metals?

Alkaline earth metals are all the atoms that live in the second column of the periodic table. The column number corresponds to the number of electrons that are contained in the atom’s outermost shell. Alkaline earth metals have two. There are also a few atoms in the periodic table which don’t live in the second column but still share a very similar electronic structure—the so-called “alkaline earth-like metals.”

Why does Atom Computing use alkaline earth metal atoms as qubits? 

There are two primary reasons.  The first is that alkaline earth metal atoms captured in optical tweezers will oscillate at very high frequencies with high stability.  For decades, researchers at places like the National Institute of Standards and Technology (NIST) have pioneered the use of these long-lived “metastable” states to create extremely fast clocks that harness quantum effects to measure time with incredible precision.  We leverage the techniques used for timekeeping and apply them to our quantum computers.

The second reason is that the structure of an alkaline earth atom guarantees that certain pairs of energy levels called “nuclear spin states” will be insensitive to external perturbations. In atoms with only a single outer-shell electron, that electron’s “spin” can couple strongly to external magnetic fields and the various angular momenta of the rest of the atom’s structure.  That coupling leads to drifts that can interfere with the qubit’s coherence. The spins of the two electrons present in alkaline earth metals “cancel out” in a way that makes certain energy levels much less sensitive to ambient fields.

What are the benefits of nuclear spin qubits?

Structure-wise, the second outer-shell electron in alkaline earths is the big difference and is what allows us to create these very environmentally insensitive nuclear spin state qubits.  Once we encode quantum information in the nuclear spin states, the information remains coherent for tens of seconds because the nuclear spin states are so well-protected from the environment.

Another benefit is a little more esoteric, but very exciting to me as an engineer and a physicist. We use a technique called “laser cooling” to prepare samples of atoms at temperatures only a few millionths of a degree above absolute zero, which involves shining carefully tuned lasers at atoms and extracting a bit of energy from them one photon at a time. Alkaline earth atoms have a structure complex enough to allow several different colors of laser to be used for this cooling. We combine those different colors to rapidly produce samples of atoms at microkelvin temperatures, whereas other atomic species require more complicated cooling schemes and take longer to reach such low temperatures.

What else makes alkaline earth metal atoms attractive for quantum computing?

We can use the atomic level structure kind of like a Swiss army knife.  Certain states are good for cooling, others are good for storing quantum information, others are good for driving entangling operations.  These unique characteristics make neutral atom qubits and Atom Computing possible!

Atom Computing Honors Lesser-Known Researchers and their Contributions to Quantum Computing    

Kortny Rolston-Duce, Director of Marketing Communications

Walk through any office building and the conference rooms likely are numbered or named according to a theme such as geographic landmarks, fictional characters, or notable figures in a field.

We initially planned to take a similar approach at our newly opened research and development facility in Boulder, Colorado by recognizing well-known physicists whose research laid the groundwork for the rapidly evolving field of quantum computing.  People such as Niels Bohr, whose work on the structure of atoms earned him the 1922 Nobel Prize for Physics, or potentially Richard Feynman, one of the first to put forth the idea of quantum computers.

Then Rob Hays, our CEO, challenged us to go beyond the usual names and consider lesser-known and more diverse researchers.  We gladly accepted the challenge and naively believed such information was a few Google searches away.  It wasn’t.  Finding names and information from verified sources was a struggle – until we discovered the Center for the History of Physics at the American Institute of Physics.

We contacted the center and, to our delight, Joanna Behrman, an assistant public historian, researched a list of scientists who made significant contributions to quantum physics in the early 20th century.  Each of our conference rooms now proudly displays a sign outside the door commemorating one of these heroes of physics and computation, with a brief description of their scientific breakthroughs.


The room names have sparked conversations amongst Atom Computing staff and with customers and other visitors to our Boulder facility.  A few of our physicists didn’t know some of these researchers despite having studied and worked in the field for years.      

Sadly, many of these pioneering scientists struggled to establish careers in science once they earned their doctorates because of the limited opportunities available.

“Most of the major research institutions or universities at that time did not hire female or African-American professors,” Behrman said. “Once they graduated, it was hard for them to find positions in which they could continue their research.  Most ended up teaching at historically black or female universities or colleges and even there, some faced challenges.”

Unfortunately, that is not unusual.

“They faced overwhelming odds during the time at which they were alive to make these contributions. People like Emmy Noether made amazing contributions anyway. Her ideas about universal symmetries and conservation principles are still fundamental to physics today,” Behrman said. “If you don’t get much credit during your lifetime, it’s unlikely that you will get it after you die and people will continue to say ‘Well, I haven’t heard of her so therefore she didn’t make very many contributions.’”    

Behrman, her colleagues at the American Institute of Physics, and other organizations are trying to change that.  A team at the Niels Bohr Library and Archives produces Initial Conditions, a podcast that focuses on understudied stories in the history of physics, while Behrman and others develop lesson plans for teachers.

“In recent decades, many of these people’s names have been wonderfully resurrected from the depths of history. People are writing books and having conferences,” she said. “There are definitely movements within the physics and the history of science communities to make sure that contributions from under-recognized physicists get their due.”

At Atom Computing, we are happy to play a part in this effort and support diversity, equity, and inclusion.   We are now in the process of renaming the conference rooms at our headquarters in Berkeley, California along a similar theme.  There are many more unsung physics heroes who should be celebrated.

Stay tuned.

Tech Predictions: Wrapping Up 2022 and Looking Ahead at 2023

Rob Hays, CEO

At the beginning of last year, I published my 2022 Quantum Computing Tech Predictions.  I expected big advancements in technology development and the breadth of involvement, and I thought it would be interesting to look back and see how good my crystal ball was.  Here are my top six quantum computing predictions for 2022 and my comments on how each played out.

Newer quantum computing modalities would achieve eye-opening breakthroughs.

I predicted we would see an acceleration of technical demonstrations and product readiness in neutral atom technology, which in turn would build further interest in the modality and elevate awareness to be on par with the earlier technical approaches.

Neutral atom quantum computing technologies gained a lot of traction in 2022.  Various outlets published articles about the promise of neutral atoms, some of which referred to this approach as a “dark horse” in the quantum computing race or noted that it had recently “quickened pulses in the quantum community.”  I like these descriptions because neutral atoms are a younger technology than superconducting or trapped-ion systems yet are progressing rapidly.

In May, we published a paper in Nature Communications demonstrating record coherence times of 40 seconds.  In September, researchers from various institutions won the Breakthrough Prize Foundation’s New Horizons in Physics Prize for their work advancing optical tweezers, which are a fundamental component of the neutral atom systems being developed by Atom Computing.  In November, QuEra became the first neutral atom quantum computing company to make its analog quantum computers available on the cloud via Amazon Braket.

Momentum is building in neutral atom-based quantum computing now that the technology stack has been proven to work by multiple academic and industry organizations.  I believe 2023 will be a banner year for demonstrating the scalability of the technology, which is one of the value propositions attracting so much attention.

Increased attention on NISQ use-cases at various scales

While hardware providers are playing the long game of scaling the number of qubits to reach fault-tolerance, I predicted we would see application developers find novel ways of using Noisy Intermediate-Scale Quantum (NISQ) machines for a modest set of use-cases in financial services, aerospace, logistics, and perhaps pharma or chemistry. While I expected developers to figure out how to gain value out of NISQ systems with thousands of qubits, I didn’t think the hardware would quite get there in 2022.

Application developers did indeed deliver.  Quantum South partnered with D-Wave to study airplane cargo loading. Japan-based QunaSys proposed a hybrid quantum-classical algorithm for chemistry modeling.  Entropica Labs in Singapore developed an approach for finding minimum vertex cover (an optimization relevant to problems such as water networks, synthetic biology, and portfolio allocation). They also deployed OpenQAOA, an open-source software development kit to create, refine, and run the Quantum Approximate Optimization Algorithm (QAOA) on NISQ devices and simulators.

Leading Cloud Service Providers would double down on quantum computing.

Citing the major cloud service providers’ unique abilities and motivation to own quantum computing hardware and software technologies, I expected US and Chinese providers to double down on quantum computing to ensure they have viable technology options that can meet their “hyper-scale” needs and capture a significant share of the value chain in the future. I predicted that some would invest in multiple, competing technologies in the form of organic R&D, acquisitions, partnerships, or, in many cases, all of the above.

This might be the hardest prediction to judge because much of what the cloud service providers do is shrouded in secrecy for years until they are ready to unveil new products and services.  Furthermore, the underlying infrastructure that supports their AI-driven services is often proprietary and never fully disclosed publicly.

Nevertheless, we can see the importance of quantum computing to these companies based on their significant investments in quantum computing people, facilities, and R&D revealed in public announcements.  The top three U.S.-based cloud service providers each have internal teams investing in quantum computing research to develop platforms of their own.  At the same time, they have public marketplaces to host third-party quantum computing platforms, giving users a best-of-breed choice of hardware on which to run their QIS software applications.  While we didn’t see any major acquisitions of pure-play quantum computing companies by the cloud service providers in 2022, I think it’s only a matter of time until we do.

Investments in Quantum would continue to break records.

Coming off a record 2021, in which venture capital investment in quantum computing surpassed $1.7B according to McKinsey, some were talking of a quantum bubble and a looming quantum winter.  Given how much more investment is required to deliver large-scale quantum computing and the enormous forecasted market value, I expected the amount of money invested to continue rising, with even more companies entering the race.  With one company publicly listed in Q4 2021 and two more signaling an intent to IPO, I said I wouldn’t be surprised to see at least five major acquisitions or IPO announcements in 2022.

There’s no doubt that the overall public and private markets took a macro-economic downturn in 2022, with the S&P 500 down more than 19 percent and the nascent publicly traded quantum computing stocks down even further.  The two quantum companies that had announced their intent to IPO indeed did so.  There were rumors of other SPAC deals that were scuttled as market demand for early-stage IPOs waned.  While I said I wouldn’t have been surprised to see five or more deals, I’m not surprised that we didn’t.  It’s becoming clear that, in the current environment, public markets are not as accepting of pre-profit companies without predictable revenue growth.

Despite the public market turmoil, venture capital investments continued in quantum computing start-ups in 2022, including a $60M Series B by Atom Computing, which we are using to build our next-generation commercial quantum computing systems.  Governments around the world also continued to increase their investments in QIS research programs to gain economic and national security advantages, as described in this World Economic Forum report about global investments in quantum computing.  What’s more, these government investments are increasingly being targeted at technology diversification now that newer quantum technologies have advanced to promising stages.  I expect we will see several public announcements of more investments in these technologies around the world throughout the year.

Diversity and inclusion would be a bigger focus in Quantum Information Science

Like the broader tech industry, the QIS workforce does not represent the diverse population of our society.  I predicted we would see more cooperation in attracting talent to the ecosystem and career development support for women and minorities. As the technology and businesses mature, a broader set of job roles become available expanding beyond the physics labs of our universities. With additional job roles, an expanded talent pool, and the exciting opportunity in quantum computing, I believed we could move the needle in diversity and inclusion.

We did see a lot of focus on diversity and inclusion programs across the industry and some extraordinary efforts.  I applaud the energy and know that we can do more as an industry to help various affinity groups collaborate to advance a common agenda.

We commissioned a private study of gender diversity among quantum computing hardware companies, based on publicly available data gathered from LinkedIn, as an indicator of where we stand as an industry.  According to the data set, women make up less than a quarter of the workforce across the 13 quantum computing hardware companies evaluated.  Because increasing diversity has been a business priority for our company, I wasn’t surprised to see Atom Computing among the leaders with the greatest share of female employees at 26 percent.  I was, however, surprised to see that 26 percent is the best.  That’s not nearly good enough - not by a factor of 2!  We can collectively do much better to attract, retain, and promote diverse talent if competitors and fellow travelers cooperate on supporting diversity in the QIS industry.

Regional Quantum Centers of Excellence would enable tighter collaboration.

It takes a village to build an integrated system of complex hardware, software, and services that interoperate together to deliver a superior user experience. I predicted that collaborations would emerge among universities, government research labs, and private companies to provide environments of innovation in their regions to gain an advantage over other regions in the quantum computing race.

We are seeing quantum ecosystems emerge and/or strengthen across the country.  Atom Computing, for example, joined the University of Colorado’s CUbit Quantum Initiative, the Chicago Quantum Exchange, and the Pistoia Alliance. We also partnered on National Science Foundation proposals to expand regional workforce development programs and bolster the quantum ecosystem in the intermountain West.

The National Science Foundation and other government agencies have been instrumental in the effort to build regional quantum computing hubs that enable researchers from academia, industry, and national labs and other federal entities to connect and collaborate.

Looking ahead to 2023, I think this trend will continue, and we will see many of these organizations, especially those funded by federal agencies, include and/or expand their focus on neutral atom quantum computing technologies and partner with companies like Atom Computing.

While my crystal ball wasn’t perfect in predicting 2022, the trends I discussed largely played out: Atom Computing and the quantum computing industry made big strides in bringing scalable quantum computing technology to the researchers and users who are looking to achieve unprecedented computational results.  I think that progress will only accelerate as we head into 2023 and beyond.

What are Optical Tweezer Arrays and How are They Used in Quantum Computing? Atom Computing’s Remy Notermans Explains.

In recent months, researchers from different institutions won major physics awards for advancing optical tweezer arrays and their use in quantum information sciences.

These announcements drew broader attention to optical tweezer arrays, even in the physics community.  At Atom Computing, however, they are always top-of-mind – optical tweezers are critical to our atomic array quantum computing technology. 

What are optical tweezer arrays and how and why do we use them in our quantum computers? Dr. Remy Notermans, who helped develop the optical tweezer array for Phoenix, our prototype system, answers these questions and more.

What are optical tweezer arrays?
A single optical tweezer is a beam of light used to capture atoms, molecules, cells, or nanoparticles and hold them in place or move them as needed.

This is possible because light can attract or repel a particle depending on the color (wavelength) of the light and the absorption properties (electronic energy level structure) of the particle.  By choosing the right wavelength, a particle will be drawn or attracted to the region with the highest intensity of light, trapped in what is known as a potential well (an energy landscape in which the atom settles at the lowest point).
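For readers who want the attract-or-repel statement made quantitative, the standard far-detuned light-shift result from atomic physics (general background, not taken from this post) is sketched below; the sign of the laser detuning decides whether the brightest spot acts as a well or a barrier.

```latex
% Standard light-shift (AC Stark) result for a two-level atom in the
% rotating-wave approximation; background material, not from the original post.
% \Omega(\mathbf{r}) \propto \sqrt{I(\mathbf{r})} is the Rabi frequency set by the
% local intensity, and \Delta = \omega_{\mathrm{laser}} - \omega_{0} is the detuning.
U_{\mathrm{dip}}(\mathbf{r}) \;\simeq\; \frac{\hbar\,\Omega^{2}(\mathbf{r})}{4\Delta},
\qquad
\Delta < 0 \ (\text{red-detuned}) \;\Rightarrow\; U_{\mathrm{dip}} < 0 :
\ \text{atoms are pulled toward the intensity maximum.}
```

Blue-detuned light (positive detuning) does the opposite, repelling atoms from bright regions; the attraction toward the brightest point described above corresponds to the red-detuned case.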

An optical tweezer is created when a laser beam is focused through a microscope objective lens. As the laser beam is focused, it forms a "tweezer" capable of holding minuscule objects and manipulating them at its focal point. Think of the tractor beam from Star Trek.

To create an optical tweezer array, the laser beam is manipulated before it is focused through the microscope objective lens, producing a custom-made array of optical tweezers that can be tailored to specific needs – topology, dimensions, and orientation.

Are optical tweezer arrays a new technology?
Optical tweezers have been used by researchers in the fields of medicine, genetics, and chemistry for decades. In fact, Arthur Ashkin, “the father of optical tweezers,” was awarded the Nobel Prize in Physics in 2018. Ashkin’s work dates to 1970, when he first detected optical scattering and the effect of different levels of force on micron-sized particles.  He and some of his colleagues later observed a focused beam of light holding tiny particles in place – the first optical tweezers.

More recent scientific work has expanded to actual arrays of optical tweezers, allowing for studying many particles simultaneously, biophysics research, and of course quantum information processing.

How does Atom Computing use optical tweezer arrays? What are the benefits?
Optical tweezers are critical to our atomic array quantum computing technology, which uses neutral atoms as qubits.  We reflect a laser beam off a spatial light modulator to create an array of many optical tweezers that each “trap” an individual qubit.  For example, Phoenix, our 100-qubit prototype quantum computer, has more than 200 optical tweezers created from a single laser. Each tweezer can be individually calibrated and optimized to ensure precise control. 

Optical tweezer arrays enable us to fit many qubits in a very small amount of space, which means that scaling the number of qubits by orders of magnitude does not significantly change the size of our quantum processing unit.  By integrating clever optical designs, we foresee a sustainable path toward atomic arrays that are large enough for fault-tolerant quantum computing.

In fact, optical tweezers inspired the Atom Computing logo.  If you turn our “A” logo upside down, it is a visual representation of an optical tweezer holding an atom in a potential well.

Are optical tweezer arrays used for other purposes?
Yes. Optical tweezer arrays have been used extensively by researchers in other scientific fields to trap living cells, viruses, bacteria, molecules, and even DNA strands so they can be studied.

Has the work of the New Horizons Physics Prize winners influenced Atom Computing’s approach? If so, how? 
Academic breakthroughs like these are a fundamental part of the academic-industrial ecosystem, which is why Atom Computing is involved in many partnerships and funds academic research efforts that can help propel our technology forward.  Combined with the knowledge and experience of our world-class engineering teams, we take these breakthroughs to the next level in terms of scalability, robustness, and systems integration.


What Developers Need to Know about our Atomic Array Quantum Computing Technology

Justin Ging, Chief Product Officer

If you are a developer working in the quantum computing space, you are likely familiar with, or have run a circuit on, a superconducting or trapped-ion quantum computer.

These two technologies were the early pioneers of the quantum hardware landscape and small versions of each have been available commercially for years.  A major challenge with these approaches is how to scale them to thousands or millions of qubits with error correction.

More recently, an alternative quantum computing technology with the potential to scale much more quickly and easily has emerged: systems based on atomic arrays of neutral atoms.  These systems have inherent advantages, which have led multiple teams to develop them.

But just as there is more than one way to cook an egg, there are different approaches to building quantum computers from atomic arrays.

At Atom Computing, we are pioneering an approach to deliver highly scalable gate-based quantum computing systems with large numbers of qubits, long coherence times, and high fidelities. 

Here are some key advantages of our atomic array quantum computing technology:

  1. Long coherence times.  Most quantum hardware companies measure coherence in units of milliseconds.  We measure it in seconds. The Atom Computing team recently set a record for the longest coherence time in a quantum computer with Phoenix, our first-generation 100-qubit system, which demonstrated qubit coherence times of 21 seconds.  The longer qubits maintain their quantum state, the better: developers can run deeper circuits for more complex calculations, and there is more time to detect and correct errors during computation.  How do we create such long-lived qubits? We use alkaline earth atoms for our qubits. These atoms do not have an electrical charge, thus they are “neutral.”  Each atom is identical, which helps with quality control, and is highly immune to environmental noise.
  2. Flexible, gate-based architecture.  Atom Computing is focused on developing a flexible and agile platform for quantum computing by supporting a universal quantum gate set that can be programmed using standard industry quantum development platforms.  This gate-based approach allows developers to create a wide range of quantum algorithms for many use cases.  Our qubit connectivity uses Rydberg interactions, in which laser pulses excite atoms to a highly energized state whose outer electron orbits the nucleus at a much greater distance than in the ground state, allowing the atom to interact with nearby atoms.
  3. Designed to scale.  Neutral atoms can be tightly packed into a computational array of qubits, making the quantum processor core just fractions of a cubic millimeter.  Lasers hold the atomic qubits in position in this tight array and manipulate their quantum states wirelessly with pulses of light to perform computations. This arrangement of individually trapped atoms, spaced only microns apart, allows for massive scalability, as it is possible to expand the qubit array size without substantially changing the overall footprint of the system.  For example, at a 4-micron pitch between each atom and arranged in a 3D array, a million qubits could fit in less than 1/10th of a cubic millimeter volume.
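As a quick sanity check of the figure in item 3, here is a back-of-the-envelope sketch. It assumes a simple 100 x 100 x 100 cubic arrangement purely for illustration; it is not a statement about our actual array geometry.

```python
# Back-of-the-envelope volume check for one million atoms on a 4-micron pitch.
# Assumes a simple 100 x 100 x 100 cubic arrangement for illustration only.
pitch_um = 4.0                        # spacing between neighboring atoms (microns)
atoms_per_side = 100                  # 100^3 = 1,000,000 sites
side_um = pitch_um * atoms_per_side   # edge length of the cube (microns)
volume_mm3 = (side_um * 1e-3) ** 3    # convert microns to millimeters, then cube

print(f"{atoms_per_side ** 3:,} sites fit in roughly {volume_mm3:.3f} mm^3")
# -> 1,000,000 sites fit in roughly 0.064 mm^3, i.e. under 1/10 of a cubic millimeter
```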

Developers looking for gate-based quantum computers with large numbers of qubits and long coherence times should consider partnering with Atom Computing.  We are working with private beta partners to facilitate their research on our platforms. Have questions about partnering? Contact us.

Establishing World-Record Coherence Times on Nuclear Spin Qubits Made from Neutral Atoms

Mickey McDonald, Senior Quantum Engineer

Coherence Times Matter

When building a quantum computer, the need to isolate qubits from environmental effects must be balanced against the need to engineer site-specific, controllable interactions with external fields. In our paper recently published in Nature Communications, we show results from our first-generation quantum computing system, called Phoenix, which successfully navigates these competing requirements while demonstrating the capability to load more than 100 qubits. The most notable achievement we describe in this paper is the long coherence time of our qubits. Coherence describes how long a qubit maintains its quantum state, or encoded information. It’s important because longer coherence times mean fewer limitations on running deep circuits, and error-correction schemes have more time to detect and correct errors through mid-circuit measurements. On Phoenix, we set a new high-water mark for coherence time.

Ramsey-echo measurements performed on an array of 21 qubits exhibit high contrast across tens of seconds, indicating a dephasing time of T2echo = 40 +/- 7 seconds - the longest coherence time ever demonstrated on a commercial platform.
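For readers unfamiliar with how a number like this is extracted, here is a minimal, hypothetical sketch (synthetic data, not our measurement pipeline) of fitting Ramsey-echo fringe contrast versus hold time to an exponential decay to estimate T2echo:

```python
# Hypothetical sketch: estimate T2-echo by fitting fringe contrast vs. hold time
# to an exponential decay. The data below are synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def contrast_model(t, t2_echo, c0):
    """Ramsey-echo fringe contrast decaying exponentially with hold time t (seconds)."""
    return c0 * np.exp(-t / t2_echo)

hold_times_s = np.array([1.0, 5.0, 10.0, 20.0, 30.0, 40.0])   # hold times (s)
contrast = np.array([0.97, 0.88, 0.78, 0.60, 0.47, 0.37])     # synthetic contrast

popt, pcov = curve_fit(contrast_model, hold_times_s, contrast, p0=[30.0, 1.0])
t2_echo, c0 = popt
t2_err = np.sqrt(np.diag(pcov))[0]
print(f"T2-echo = {t2_echo:.0f} +/- {t2_err:.0f} s")   # ~40 s for this synthetic set
```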

Achieving long coherence times requires that a qubit interact minimally with its environment, but this requirement often comes with a drawback: it is usually the case that the more weakly a qubit interacts with its environment, the more difficult it is to couple that qubit to whatever control fields are being used to drive interactions required to perform quantum computation. We manage these competing requirements by using a clever choice of neutral atom-based qubit, and by performing single-qubit control using software-configurable dynamic lasers which can be steered and actuated with sub-micron spatial accuracy and sub-microsecond timing precision.

Simultaneous Control of Qubits 

Our software-configurable optical control scheme allows Phoenix to simultaneously drive arbitrary single-qubit gate operations on all qubits within a single column or row in parallel, while at the same time maintaining coherence times longer than any yet demonstrated on a commercial platform, with a measured T2echo = 40 +/- 7 seconds.

A high-level drawing of Phoenix. Qubits are trapped within a vacuum cell and controlled using software-configurable dynamic lasers projected through a high-numerical aperture microscope objective. Readout is performed by collecting scattered light through the same microscope objective onto a camera.

Our system encodes quantum information, i.e. the qubit states |0> and |1>, in two of the nuclear spin states of a single, uncharged strontium atom. This kind of qubit encoding has two key advantages. First, because both qubit states exist in the electronic ground state, the time it takes for one state to spontaneously decay to the other (AKA the spin-relaxation time “T1”) is effectively infinite. We demonstrate this on Phoenix by making spin relaxation measurements out to several seconds and confirming that the relative populations in each state remain unchanged to the limits of measurement precision.

The spin relaxation time (known as “T1”) describes how long it takes for a qubit initially prepared in one qubit state to decay to the other. A short T1 would manifest as a line that starts near 1 and drifts downward. That our measured population can be fit with a horizontal line with no apparent downward drift indicates that our spin relaxation time is far longer than our longest measurement time.

The second key advantage of nuclear spin qubits is that because the qubit states have such similar energies, they are nearly identically affected by external fields. This means that perturbations, such as those induced by externally applied trapping light, will affect both qubit states in the same way. Because these perturbations are common-mode, they do not impact the system’s overall coherence - a feature which fundamentally enables our world-record coherence times.

Our Path Forward

This paper describing Phoenix demonstrates several key technological innovations necessary for the construction of a large-scale, commercial quantum computer: long coherence times, the ability to drive arbitrary single qubit operations across large portions of the array in parallel, and the ability to trap 100+ qubits (and far beyond in the future). As we develop our second-generation quantum computers, we will build on the proven architecture and successes demonstrated on Phoenix to scale up to systems with fidelities and qubit numbers high enough to solve problems that cannot be solved with classical computers. Stay tuned and sign up for our Tech Perspectives blog series to learn more!

Quantum Computing Value as it Scales to Thousands of Qubits

Rob Hays, CEO
 
Rob Hays discusses Atom Computing’s architecture of nuclear-spin qubits made from neutral atoms. He shares how the technology will enable large-scale quantum computers and the necessity for coherent, error-corrected systems -- exploring applications and use-cases that quantum computing can help solve as the system scales to 1,000s of qubits and beyond.

Building Scalable Quantum Computers from Arrays of Neutral Atoms

Krish Kotru, Quantum Engineering Manager

Arrays of neutral atoms have emerged as a promising platform for scalable quantum computation due to rapid advancements in optically trapping and controlling individual atoms. Krish discusses the technology enabling the assembly of such arrays and recent results demonstrating parallel coherent control and world-record coherence times. He also surveys ongoing work demonstrating interactions between atoms in Rydberg states that enable high-quality two-qubit gates.