Establishing World-Record Coherence Times on Nuclear Spin Qubits Made from Neutral Atoms
Mickey McDonald, Senior Quantum Engineer
Coherence Times Matter
When building a quantum computer, the need to isolate qubits from environmental effects must be balanced against the need to engineer site-specific, controllable interactions with external fields. In our paper recently published in Nature Communications, we show results from our first-generation quantum computing system, called Phoenix, which successfully navigates these competing requirements while demonstrating the capability to load more than 100 qubits.

The most notable achievement we describe in this paper is the long coherence time of each of our qubits. Coherence time describes how long a qubit maintains its quantum state, or encoded information. It matters because longer coherence times mean fewer limitations on running deep circuits, and error-correction schemes have more time to detect and correct errors through mid-circuit measurements. On Phoenix, we set a new high-water mark for coherence time.
Achieving long coherence times requires that a qubit interact minimally with its environment, but this requirement often comes with a drawback: it is usually the case that the more weakly a qubit interacts with its environment, the more difficult it is to couple that qubit to whatever control fields are being used to drive interactions required to perform quantum computation. We manage these competing requirements by using a clever choice of neutral atom-based qubit, and by performing single-qubit control using software-configurable dynamic lasers which can be steered and actuated with sub-micron spatial accuracy and sub-microsecond timing precision.
Simultaneous Control of Qubits
Our software-configurable optical control scheme allows Phoenix to simultaneously drive arbitrary single-qubit gate operations on all qubits within a single column or row in parallel, while at the same time maintaining coherence times longer than any yet demonstrated on a commercial platform, with a measured echo coherence time of T2 = 40 ± 7 seconds.
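As an illustration of how a coherence time like this is typically extracted (a generic sketch with synthetic data, not Atom Computing's actual analysis): the echo contrast is measured at a series of increasing delays and fitted to an exponential decay C(t) = exp(-t/T2), with T2 read off from the fit.

```python
import math
import random

def fit_t2(delays, contrasts):
    """Least-squares fit of ln(C) = -t/T2, a line through the origin
    in (t, ln C) space; returns the estimated T2 in seconds."""
    num = sum(t * math.log(c) for t, c in zip(delays, contrasts))
    den = sum(t * t for t in delays)
    slope = num / den  # slope = -1/T2
    return -1.0 / slope

# Synthetic echo data: true T2 of 40 s plus small measurement noise.
random.seed(0)
t2_true = 40.0
delays = [2.0 * k for k in range(1, 16)]  # delays from 2 s to 30 s
contrasts = [math.exp(-t / t2_true) * (1 + random.gauss(0, 0.01))
             for t in delays]

t2_est = fit_t2(delays, contrasts)
print(f"fitted T2 ~ {t2_est:.1f} s")
```

In a real experiment the fit would also account for contrast offsets and error bars on each point; the log-linear fit above is the bare minimum that recovers T2 from clean data.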
Our system encodes quantum information, i.e. the qubit states |0> and |1>, in two of the nuclear spin states of a single, uncharged strontium atom. This kind of qubit encoding has two key advantages. First, because both qubit states exist in the electronic ground state, the time it takes for one state to spontaneously decay to the other (also known as the spin-relaxation time, T1) is effectively infinite. We demonstrate this on Phoenix by making spin-relaxation measurements out to several seconds and confirming that the relative populations in each state remain unchanged to the limits of measurement precision.
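To see why a ground-state encoding makes T1 effectively infinite on experimental timescales, consider the standard exponential relaxation model P(t) = exp(-t/T1). The numbers below are illustrative back-of-the-envelope values, not figures from the paper:

```python
import math

def population_remaining(t, t1):
    """Fraction of population still in the initial spin state after
    time t, assuming simple exponential relaxation with constant T1."""
    return math.exp(-t / t1)

hold_time = 5.0  # seconds, comparable to the measurements described above
# A short-lived optically excited state might relax in microseconds,
# while a ground-state nuclear spin can have an effective T1 of hours.
for t1 in (1e-6, 1.0, 3600.0):
    frac = population_remaining(hold_time, t1)
    print(f"T1 = {t1:.0e} s -> population left after {hold_time} s: {frac:.6f}")
```

With an hours-scale T1, the population loss over a several-second measurement is well below a tenth of a percent, consistent with "unchanged to the limits of measurement precision."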
The second key advantage of nuclear spin qubits is that because the qubit states have such similar energies, they are nearly identically affected by external fields. This means that perturbations, such as those induced by externally applied trapping light, will affect both qubit states in the same way. Because these perturbations are common-mode, they do not impact the system's overall coherence, a feature which fundamentally enables our world-record coherence times.
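The common-mode argument can be made concrete with a toy calculation (the magnitudes below are purely illustrative): the qubit frequency depends only on the energy difference between |1> and |0>, so a shift applied equally to both states cancels exactly, while any residual differential shift sets a dephasing timescale.

```python
def qubit_frequency_shift(shift_state0_hz, shift_state1_hz):
    """The qubit precession frequency is set by the energy *difference*
    between |1> and |0>, so only the differential shift matters."""
    return shift_state1_hz - shift_state0_hz

# A trapping-light perturbation that moves both levels by the same
# (illustrative) amount leaves the qubit frequency untouched...
common_mode = qubit_frequency_shift(1e5, 1e5)

# ...while even a tiny differential component dephases the qubit on a
# timescale of roughly 1 / (differential shift).
differential = qubit_frequency_shift(1e5, 1e5 + 0.01)

print("common-mode effect on qubit frequency:", common_mode, "Hz")
print("differential shift:", differential, "Hz")
```

In this toy picture, a 100 kHz common-mode shift contributes nothing to dephasing, while a 0.01 Hz differential shift would limit coherence to roughly a hundred seconds, which is why suppressing the differential part is what matters.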
Our Path Forward
This paper describing Phoenix demonstrates several key technological innovations necessary for the construction of a large-scale, commercial quantum computer: long coherence times, the ability to drive arbitrary single qubit operations across large portions of the array in parallel, and the ability to trap 100+ qubits (and far beyond in the future). As we develop our second-generation quantum computers, we will build on the proven architecture and successes demonstrated on Phoenix to scale up to systems with fidelities and qubit numbers high enough to solve problems that cannot be solved with classical computers. Stay tuned and sign up for our Tech Perspectives blog series to learn more!
Moor Insights & Strategy | Insider Podcast | with Chief Analyst Patrick Moorhead and CEO Rob Hays
An insightful conversation between Chief Analyst Patrick Moorhead and our CEO Rob Hays on the Insider Podcast. Patrick and Rob dive into what's on the horizon as quantum computing evolves: quantum scalability, roadmaps, QaaS (quantum as-a-service), on-prem vs. off-prem, investments, market consolidation, quantum ecosystem, and more.
The Truth About Scaling Quantum Computing
Rob Hays, CEO, Atom Computing
Last week, quantum computing industry insiders were talking about and debating the current state of the technologies, system requirements for various algorithms, and when quantum advantage will be achieved. This flurry of discussion was stirred up by a report from short-selling activist firm Scorpion Capital asking, "Is quantum computing real?" While sensational in many respects, the report raises important questions: whether quantum computing systems actually exist, how useful they are in solving computational problems, and what is hype versus reality. These questions compelled me to share my observations on where the technology stands today and how hard-working scientists and engineers are blazing the path to commercial value at scale.
Quantum computers are real.
At a conference hosted by MIT in May 1981, the Nobel Prize-winning physicist Richard Feynman urged the world to build what he called a quantum computer, stating, "If you want to make a simulation of nature, you'd better make it quantum mechanical." He laid down a grand challenge for scientists to develop a quantum computer. Forty years later, we have working, albeit small, quantum computers, and quantum computing development is accelerating around the globe. In fact, the Quantum Computing Report is tracking over 144 companies and institutions developing at least 9 different quantum computing architectures.
Today, users can program and run quantum circuits using a standard gate set supported by multiple hardware providers, using popular software development kits from a range of software vendors. For decades, hardware providers have been building better qubits, eliminating noise to improve fidelities, extending coherence times, and improving gate speeds. This is the daily toil of quantum engineers, continuously improving their craft. However, the technology is still in its infancy. Today's quantum computers have dozens to a few hundred qubits. That's large enough for researchers and users to do early pathfinding for quantum algorithms and gain valuable experience in programming the systems, but not large enough for meaningful commercial value. It's estimated that we need thousands to a million+ qubits to yield error-corrected systems that can run a broad range of commercial applications.
The challenge is to scale.
Building reliable quantum computers with a million+ qubits along with the software and algorithms to provide valuable computation is the hard work ahead of us. The quest for large-scale quantum computers is what drives us at Atom Computing. Our sole focus is on building quantum computers that scale as quickly as possible using neutral atoms. Other prevalent quantum computing architectures, namely superconductors and trapped ions, have helped to advance the industry by proving that systems could be built with quality qubits, but so far this has only been achieved at small numbers of qubits.
The neutral atom approach uses optical tweezers to hold an array of atoms in a palm-sized vacuum chamber. Alkaline earth element atoms are trapped and cooled by lasers and then arranged a few microns apart from each other. They are made into qubits by precisely controlling the spin of each atom's nucleus and the entanglement between their electron clouds using pulses of light. The atoms are so close to each other that millions of them can feasibly be held in the same small vacuum chamber without significant growth of the overall system footprint. Experiments have demonstrated that more qubits can be packed into the chamber by arranging atoms in three dimensions and that a 3D arrangement has advantages for error correction. It's straightforward to imagine a 3D array of 100 x 100 x 100 = 1 million qubits. At four microns apart, a million qubits would be controlled wirelessly by lasers in a volume of less than 1/10th of a cubic millimeter. Building powerful quantum computers requires scientific know-how and a lot of engineering effort across multiple disciplines. This takes time, and we are making steady progress toward our goal.
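The geometry claim above is easy to check with a few lines of arithmetic, using the 4-micron spacing and 100-per-side array quoted in the text:

```python
SPACING_M = 4e-6      # 4 microns between neighboring atoms
SITES_PER_SIDE = 100  # 100 x 100 x 100 sites

n_qubits = SITES_PER_SIDE ** 3
side_m = (SITES_PER_SIDE - 1) * SPACING_M  # edge length of the cubic array
volume_mm3 = (side_m * 1e3) ** 3           # convert meters -> mm, then cube

print(f"{n_qubits:,} qubits fit in {volume_mm3:.3f} cubic millimeters")
```

The edge of such an array is only about 0.4 mm, so a million sites indeed occupy well under a tenth of a cubic millimeter.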
We need to be realistic about where we are.
As progress is made, it’s important to recognize where the industry is on this journey and for achievements to be realistically represented in appropriate context. At Atom Computing, we believe that honesty and transparency are critical to the success of our company, our partners, and the quantum industry at large. You can count on us to do what we say.
If you want to hear more about my perspectives on the quantum computing industry and Atom Computing’s progress, watch my Insider Podcast discussion with tech industry analyst, Pat Moorhead, and/or join me at Inside Quantum Technology in San Diego where I will give a keynote presentation on May 12. We hope you will continue to follow us as we rise to meet the challenge of building large-scale quantum systems and that you see the promise of neutral atom technology to deliver on Richard Feynman’s quantum computing vision.
Atom Computing’s CEO Believes Neutral Atoms Offer Fastest Path to Scalable Quantum Computing | The Quantum Insider
Atom Computing Joins Forces with University of Colorado Boulder’s CUbit Quantum Initiative to Accelerate Quantum Innovation
Rob Hays, CEO, Atom Computing
Large-scale quantum computing will provide orders of magnitude greater computational performance compared to today's fastest computers. The Boston Consulting Group estimates it could create $5B-$10B of value in the next three to five years (the early days) and grow to $450B-$850B in the next 15 to 30 years. It's obvious why governments, academics, and enterprises of all sizes are gearing up to implement quantum computing practices. The role Atom Computing plays within the ecosystem, as a quantum computing hardware platform provider, is critical. I would argue we are one of the primary gates to unlocking this economic value. That is an enviable and somewhat uncomfortable position to be in. All eyes are on us and other hardware platform providers to scale up quickly so the quantum programmers can get to work writing productive applications.
It takes a village!
Accelerating quantum computing requires an engaged ecosystem of users, software partners, research labs, and talent who can add value up and down the computing stack. That's why we've teamed up with the University of Colorado Boulder's CUbit Quantum Initiative: to drive R&D and talent development while supporting the ascendance of Colorado as a Center of Excellence for Quantum Information Science in the U.S. and globally.
Our visions are aligned:
Atom Computing’s sole mission is to build the most scalable and reliable quantum computers, helping researchers and scientists reach their next big breakthrough.
CUbit's focus is on connecting the ecosystem, advancing fundamental science, developing talent, and building the foundation for rapid dissemination, application, and commercialization of quantum technologies.
Collaboration Accelerates R&D and Workforce Development
Within the scope of this collaboration, Atom Computing’s engineers will work closely with CUbit’s research partners to accelerate innovation on our next-generation quantum computers.
I joined the CUbit Advisory Board to help spur fresh ideas and investment in professional development and networking to engage talent across a multitude of disciplines. Our goal is to build a pipeline of top talent with the skills required to drive this incredible paradigm shift in computing performance and engage industry partners who can deliver the economic benefits that QIS will enable.
Quantum Tech Hub
Colorado is shaping up to be one of the most important quantum computing hubs in the U.S. This partnership is a natural fit for us as our company has benefited greatly from Colorado’s remarkable talent and institutions.
Several of Atom Computing's team members are graduates of the University of Colorado Boulder, including our Founder and CTO, Ben Bloom, who completed his PhD in Physics at CU Boulder as part of JILA. During his PhD, Ben worked closely with Atom's Scientific Advisor, Dr. Jun Ye, helping to build one of the world's most accurate atomic clocks. That work won Jun the 2022 Breakthrough Prize in Fundamental Physics and was the catalyst for our approach of using optically trapped neutral atoms to build nuclear-spin qubits.
Our connections to CU Boulder run deep, and this is just the beginning of our long-term partnership. I'm looking forward to working together with the CUbit Quantum Initiative, CU Boulder, and the QIS ecosystem in Colorado to accelerate quantum computing. Click here to view the press release.
Atom Computing Plans To Build A Bigger And Better High-Tech Quantum Computer With Its Latest $60 Million Series B Funding | FORBES
Atom Computing, a quantum computing company headquartered in Berkeley, California, seems to be on the fast track for funding.