Assembly and Coherent Control of a Register of Nuclear Spin Qubits

The generation of a register of highly coherent, but independent, qubits is a prerequisite to performing universal quantum computation. Here we introduce a qubit encoded in two nuclear spin states of a single 87Sr atom and demonstrate coherence approaching the minute scale within an assembled register of individually controlled qubits. While other systems have shown impressive coherence times through some combination of shielding, careful trapping, global operations, and dynamical decoupling, we achieve comparable coherence times while individually driving multiple qubits in parallel. We highlight that even with simultaneous manipulation of multiple qubits within the register, we observe coherence in excess of 10^5 times the current length of the operations, with T2echo = (40 ± 7) seconds.

Establishing World-Record Coherence Times on Nuclear Spin Qubits Made from Neutral Atoms

Mickey McDonald, Senior Quantum Engineer

Coherence Times Matter

When building a quantum computer, the need to isolate qubits from environmental effects must be balanced against the need to engineer site-specific, controllable interactions with external fields. In our paper recently published in Nature Communications, we show results from our first-generation quantum computing system called Phoenix, which successfully navigates these competing requirements while demonstrating the capability to load more than 100 qubits. The most notable achievement we describe in this paper is the long coherence times of each of our qubits. Coherence is a term used to describe how long a qubit maintains its quantum state or encoded information. It’s important because longer coherence times mean fewer limitations on running deep circuits, and error-correction schemes have more time to detect and correct errors through mid-circuit measurements. On Phoenix, we set a new high water mark for coherence time.

Ramsey-echo measurements performed on an array of 21 qubits exhibit high contrast across tens of seconds, indicating a dephasing time of T2echo = (40 ± 7) seconds - the longest coherence time ever demonstrated on a commercial platform.
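For readers curious how a number like this is extracted, the sketch below shows the standard approach of fitting echo fringe contrast versus hold time to an exponential decay. The data values and the use of SciPy here are purely illustrative assumptions, not the analysis pipeline used in the paper.

```python
# Minimal sketch (illustrative data, not the paper's analysis): fit Ramsey-echo
# fringe contrast to an exponential decay C(t) = C0 * exp(-t / T2).
import numpy as np
from scipy.optimize import curve_fit

def echo_contrast(t, c0, t2):
    """Exponential decay model for echo fringe contrast."""
    return c0 * np.exp(-t / t2)

# Made-up example data: hold times (s) and measured fringe contrast.
hold_times = np.array([1.0, 5.0, 10.0, 20.0, 30.0, 40.0])
contrast = np.array([0.97, 0.86, 0.77, 0.60, 0.47, 0.36])

popt, pcov = curve_fit(echo_contrast, hold_times, contrast, p0=[1.0, 30.0])
c0_fit, t2_fit = popt
t2_err = np.sqrt(np.diag(pcov))[1]
print(f"T2_echo = {t2_fit:.1f} +/- {t2_err:.1f} s")  # ~40 s for this toy data
```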

Achieving long coherence times requires that a qubit interact minimally with its environment, but this requirement often comes with a drawback: it is usually the case that the more weakly a qubit interacts with its environment, the more difficult it is to couple that qubit to whatever control fields are being used to drive interactions required to perform quantum computation. We manage these competing requirements by using a clever choice of neutral atom-based qubit, and by performing single-qubit control using software-configurable dynamic lasers which can be steered and actuated with sub-micron spatial accuracy and sub-microsecond timing precision.

Simultaneous Control of Qubits 

Our software-configurable optical control scheme allows Phoenix to simultaneously drive arbitrary single-qubit gate operations on all qubits within a single column or row in parallel, while maintaining coherence times longer than any yet demonstrated on a commercial platform, with a measured T2echo = (40 ± 7) seconds.

A high-level drawing of Phoenix. Qubits are trapped within a vacuum cell and controlled using software-configurable dynamic lasers projected through a high-numerical aperture microscope objective. Readout is performed by collecting scattered light through the same microscope objective onto a camera.

Our system encodes quantum information, i.e. the qubit states |0> and |1>, in two of the nuclear spin states of a single, uncharged strontium atom. This kind of qubit encoding has two key advantages. First, because both qubit states exist in the electronic ground state, the time it takes for one state to spontaneously decay to the other (AKA the spin-relaxation time “T1”) is effectively infinite. We demonstrate this on Phoenix by making spin relaxation measurements out to several seconds and confirming that the relative populations in each state remain unchanged to the limits of measurement precision.

The spin relaxation time (known as “T1”) describes how long it takes for a qubit initially prepared in one qubit state to decay to another. A short T1 would manifest as a line which starts near 1 and drifts downward. That our measured population can be fit with a horizontal line with no apparent downward drift indicates that our spin relaxation time is far longer than our longest measurement time.
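As a rough illustration of how "no visible decay" translates into a bound on T1, the snippet below uses made-up, flat population data and the small-decay approximation P(t) ≈ 1 - t/T1; the numbers are hypothetical and are not the measured Phoenix data.

```python
# Illustrative sketch (hypothetical data, not the actual measurement): if the
# retained population stays flat out to the longest hold time, the data only
# set a *lower bound* on T1 via the small-decay approximation P(t) ~ 1 - t/T1.
import numpy as np

hold_times = np.array([0.5, 1.0, 2.0, 4.0, 6.0])            # seconds (made up)
population = np.array([0.992, 0.990, 0.991, 0.989, 0.991])  # stays near 1 (made up)

t_max = hold_times.max()
sigma = population.std()            # measurement scatter
# Any decay smaller than the scatter is invisible, so t_max / T1 < sigma,
# which rearranges to T1 > t_max / sigma.
t1_lower_bound = t_max / max(sigma, 1e-6)
print(f"T1 > ~{t1_lower_bound:.0f} s (lower bound from flat data)")
```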

The second key advantage of nuclear spin qubits is that because the qubit states have such similar energies, they are nearly identically affected by external fields. This means that perturbations, such as those induced by externally applied trapping light, will affect both qubit states in the same way. Because these perturbations are common-mode, they do not impact the system’s overall coherence - a feature which fundamentally enables our world-record coherence times.
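A toy calculation makes the common-mode argument concrete. The splitting and shift values below are hypothetical placeholders chosen only to show that the qubit frequency, being a difference of two state energies, is unchanged by a shift applied equally to both states.

```python
# Toy illustration with hypothetical numbers: a light shift that is common to
# both nuclear spin states cancels out of the qubit frequency.
f0_hz = 0.0                 # assumed energy of |0> (arbitrary reference)
f1_hz = 500.0               # assumed |0>-|1> splitting (illustrative only)
common_shift_hz = 50_000.0  # assumed trap-induced shift applied to BOTH states

qubit_freq_bare = f1_hz - f0_hz
qubit_freq_shifted = (f1_hz + common_shift_hz) - (f0_hz + common_shift_hz)
assert qubit_freq_shifted == qubit_freq_bare  # common-mode shift drops out
```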

Our Path Forward

This paper describing Phoenix demonstrates several key technological innovations necessary for the construction of a large-scale, commercial quantum computer: long coherence times, the ability to drive arbitrary single qubit operations across large portions of the array in parallel, and the ability to trap 100+ qubits (and far beyond in the future). As we develop our second-generation quantum computers, we will build on the proven architecture and successes demonstrated on Phoenix to scale up to systems with fidelities and qubit numbers high enough to solve problems that cannot be solved with classical computers. Stay tuned and sign up for our Tech Perspectives blog series to learn more!

Quantum Computing Value as it Scales to Thousands of Qubits

Rob Hays, CEO
 
Rob Hays discusses Atom Computing's architecture of nuclear-spin qubits made from neutral atoms. He shares how the technology will enable large-scale quantum computers and why coherent, error-corrected systems are necessary -- exploring applications and use-cases that quantum computing can address as systems scale to thousands of qubits and beyond.

Building Scalable Quantum Computers from Arrays of Neutral Atoms

Krish Kotru, Quantum Engineering Manager

Arrays of neutral atoms have emerged as a promising platform for scalable quantum computation due to rapid advances in the optical trapping and control of individual atoms. Krish discusses the technology enabling the assembly of such arrays and recent results demonstrating parallel coherent control and world-record coherence times. He also surveys ongoing work demonstrating interactions between atoms in Rydberg states that enable high-quality two-qubit gates.

Highly Coherent Qubits and the Path to Error Correction

Jonathan King, Co-Founder & Chief Scientist

Realizing the full value of quantum computing will require error correction to suppress the errors intrinsic to quantum systems. Practically, this requires high-quality qubits in a sufficiently large quantity. Jonathan discusses the application of error correction in quantum computing and describes a path to error correction with neutral atoms.

How Boutique Quantum Specialist Firms are Addressing Industry Applications with Atom Computing

Denise Ruffner, Chief Business Officer, Atom Computing
Markus Braun, Founder and Managing Director, JoS Quantum
Rafael Sotelo, Co-Founder, Quantum South
Tennin Yan, QunaSys

Quantum computing has the potential to solve challenging computational problems across a variety of industries. While there is growing consensus on the types of problems addressable with quantum computing (optimization, machine learning acceleration, and simulation), the development of solutions for individual industries and organizations will require focused attention and development by domain experts. In this panel, we will discuss how boutique quantum specialist firms are addressing real-world problems and how they will leverage the latest quantum hardware advances to bring value to organizations.

The Truth About Scaling Quantum Computing

Rob Hays, CEO, Atom Computing

Last week, quantum computing industry insiders were talking about and debating the current state of the technologies, system requirements for various algorithms, and when quantum advantage will be achieved. This flurry of discussion was stirred up by a report from short-selling activists at Scorpion Capital asking, "Is quantum computing real?" While sensational in many respects, the report raises important questions about whether quantum computing systems actually exist, how useful they are in solving computation problems, and what is hype versus reality. These questions compelled me to share my observations on where the technology stands today and how hard-working scientists and engineers are blazing the path to commercial value at scale.

Quantum computers are real.

At a conference hosted by MIT in May 1981, the Nobel Prize-winning physicist, Richard Feynman, urged the world to build what he called a quantum computer, stating “If you want to make a simulation of nature, you’d better make it quantum mechanical.”  He laid down a grand challenge for scientists to develop a quantum computer.  40 years later, we have working, albeit small, quantum computers, and quantum computing development is accelerating around the globe. In fact, the Quantum Computing Report is tracking over 144 companies and institutions developing at least 9 different quantum computing architectures.

Today, users can program and run quantum circuits using a standard gate set supported by multiple hardware providers using popular software developer kits from multiple software providers. For decades, hardware providers have been building better qubits, eliminating noise to improve fidelities, extending coherence times, and improving gate speeds. This is the daily toil of quantum engineers, continuously improving their craft. However, the technology is still in its infancy. Today’s quantum computers have dozens to a few hundred qubits. That’s large enough for researchers and users to do early pathfinding for quantum algorithms and gain valuable experience in programming the systems but not large enough for meaningful commercial value. It’s estimated that we need thousands to a million+ qubits to yield error-corrected systems that can run a broad range of commercial applications.

The challenge is to scale.

Building reliable quantum computers with a million+ qubits along with the software and algorithms to provide valuable computation is the hard work ahead of us. The quest for large-scale quantum computers is what drives us at Atom Computing. Our sole focus is on building quantum computers that scale as quickly as possible using neutral atoms. Other prevalent quantum computing architectures, namely superconductors and trapped ions, have helped to advance the industry by proving that systems could be built with quality qubits, but so far this has only been achieved at small numbers of qubits.

The neutral atom approach uses optical tweezers to hold an array of atoms in a palm-sized vacuum chamber. Alkaline earth element atoms are trapped and cooled by lasers and then arranged a few microns apart from each other. They are made into qubits by precisely controlling the spin of each atom’s nucleus and entanglement between their electron clouds using pulses of light. The atoms are so close to each other that millions of them can feasibly be held in the same small vacuum chamber without significant growth of the overall system footprint. Experiments have demonstrated that more qubits can be packed into the chamber by arranging atoms in three dimensions and that a 3D arrangement has advantages for error correction. It’s straightforward to imagine a 3D array of 100 x 100 x 100 = 1 million qubits. At four microns apart, a million qubits would be controlled wirelessly by lasers in a volumetric space of less than 1/10th of a cubic millimeter. Building powerful quantum computers requires scientific know-how and a lot of engineering effort across multiple disciplines. This takes time and we are making steady progress toward our goal.
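A quick back-of-the-envelope check of that geometry, using the four-micron spacing quoted above (everything else follows from arithmetic):

```python
# Rough check of the scaling arithmetic: a 100 x 100 x 100 array at 4 um spacing.
side_sites = 100
spacing_um = 4.0

side_length_um = (side_sites - 1) * spacing_um   # ~396 um per side
volume_mm3 = (side_length_um * 1e-3) ** 3        # convert um -> mm, then cube
n_qubits = side_sites ** 3

print(f"{n_qubits:,} qubits in ~{volume_mm3:.3f} mm^3")  # ~1,000,000 qubits in ~0.06 mm^3
```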

We need to be realistic about where we are.

As progress is made, it’s important to recognize where the industry is on this journey and for achievements to be realistically represented in appropriate context. At Atom Computing, we believe that honesty and transparency are critical to the success of our company, our partners, and the quantum industry at large. You can count on us to do what we say.

If you want to hear more about my perspectives on the quantum computing industry and Atom Computing’s progress, watch my Insider Podcast discussion with tech industry analyst, Pat Moorhead, and/or join me at Inside Quantum Technology in San Diego where I will give a keynote presentation on May 12. We hope you will continue to follow us as we rise to meet the challenge of building large-scale quantum systems and that you see the promise of neutral atom technology to deliver on Richard Feynman’s quantum computing vision.

Atom Computing Joins Forces with University of Colorado Boulder’s CUbit Quantum Initiative to Accelerate Quantum Innovation

Rob Hays, CEO, Atom Computing

Large-scale Quantum Computing will provide orders of magnitude greater computational performance compared to today's fastest computers. The Boston Consulting Group estimates it could create $5B-$10B of value in the next three to five years (the early days) and grow to $450B-$850B in the next 15 to 30 years. It's obvious why governments, academics, and enterprises of all sizes are gearing up to implement quantum computing practices. The role Atom Computing plays, as a quantum computing hardware platform provider, within the ecosystem is critical. I would argue we are one of the primary gates to unlocking this economic value. That is both an enviable and a somewhat uncomfortable position to be in. All eyes are on us and other hardware platform providers to scale up quickly so that quantum programmers can get to work writing productive applications.

It takes a village!

Building a strong ecosystem focused on accelerating quantum computing requires an engaged community of users, software partners, research labs, and talent who can add value up and down the computing stack. That's why we've teamed up with the University of Colorado Boulder's CUbit Quantum Initiative: to drive R&D and talent development while supporting the ascendance of Colorado as a Center of Excellence for Quantum Information Science in the U.S. and globally.

Our visions are aligned.

Collaboration Accelerates R&D and Workforce Development

Within the scope of this collaboration, Atom Computing’s engineers will work closely with CUbit’s research partners to accelerate innovation on our next-generation quantum computers.

I joined the CUbit Advisory Board to help spur fresh new ideas and investment in professional development and networking to engage talent across a multitude of disciplines. Our goal is to build a pipeline of top talent with the skills required to drive this incredible paradigm shift in computing performance and engage industry partners who can deliver the economic benefits that QIS will enable.

Quantum Tech Hub

Colorado is shaping up to be one of the most important quantum computing hubs in the U.S. This partnership is a natural fit for us as our company has benefited greatly from Colorado’s remarkable talent and institutions.

Several of Atom Computing's team members are graduates of the University of Colorado Boulder, including our Founder and CTO, Ben Bloom, who completed his PhD in Physics at CU Boulder as part of JILA. During his PhD, Ben worked closely with Atom's Scientific Advisor, Dr. Jun Ye, helping to build one of the world's most accurate atomic clocks – work that won Jun the 2022 Breakthrough Prize in Fundamental Physics and was the catalyst for our approach of using optically-trapped neutral atoms to build nuclear-spin qubits.

Our connections to CU Boulder are deep and it’s just the beginning of our long-term partnership. I’m looking forward to working together with the CUbit Quantum Initiative, CU Boulder, and the QIS ecosystem in Colorado to accelerate quantum computing. Click here to view the press release.

2022 Quantum Computing Tech Predictions

Quantum computing gained a lot of attention and headlines in 2021 with companies, like ours, making big bets on this paradigm-shifting technology. While quantum computing may not be a household term just yet, we are going to see big advancements in 2022 on technology development and breadth of involvement. As I’m out talking to customers and partners, I see quantum computing continuing to grow mindshare every day.

Here are my top six quantum computing predictions for 2022:

1. Newer quantum computing modalities will achieve eye-opening breakthroughs

Early adopters are building teams and investing in proof-of-concept use-cases, learning how to take advantage of quantum computing to gain a competitive edge in the market. These customers are eager to try different quantum computing platforms to see what works best for their applications. Over the past couple of years, increased R&D in newer quantum computing modalities has delivered competitive systems that show the potential to scale at a faster pace than the earlier modalities.

In 2022, we are going to see an acceleration of technical demonstrations and product readiness in neutral atom technology – building further interest and credibility in the approach. By the end of the year, I predict that neutral atom-based quantum computing will be on equal footing in terms of customer awareness compared to the earlier technologies – superconductors and trapped ions. We may also see demonstrations of photonics and other new technologies not yet on most people's radar screens.

2. Increased attention in NISQ use-cases at various scales

Hardware providers are pushing to scale the number of qubits per system while potential users work with partners to experiment on these initial, smaller systems. Today's quantum computers only have dozens or, in some cases, a hundred or so qubits – which isn't a sufficient scale to solve meaningful commercial problems. Today's systems are prone to error and do not have enough physical qubits to perform error-correction algorithms to overcome the errors. Thus, we have NISQ machines in search of use-cases.

I predict we will see novel ways of using NISQ machines for a modest set of interesting use-cases in financial services, aerospace, logistics, and perhaps pharma or chemistry. This year, I believe that applications developers will figure out how to gain more commercial value out of NISQ systems with thousands of qubits, but the hardware won’t quite get there in 2022. Additionally, I expect to see demonstrations of mid-circuit measurement and error correction on a number of modalities this year pointing toward the end of the NISQ era and a more productive quantum computing era on the horizon.

3. Leading US and Chinese Cloud Service Providers will double down on quantum computing

Cloud Service Providers (CSPs) should benefit greatly from quantum computing given their breadth of pervasive services, the deep integration of artificial intelligence (AI) into their workloads, and the opportunity to vastly improve AI prediction models based on the highly-parallel, statistical nature of quantum computing. They also control their own infrastructure, applications, and customer interfaces, and have deep technical expertise in semiconductors, system design, and software development. Building their own vertically-integrated solution stack saves cost by eliminating vendor margin stacking and provides better services through hardware and software co-optimizations. All of this gives the CSPs a unique ability and the motivation to own proprietary quantum computing hardware and software technologies.

I believe the CSPs can profit from quantum computing faster and at a much larger magnitude than nearly all other sectors, at least initially. This year, I expect to see them double down to ensure they have viable technology options that can meet their “hyper-scale” needs and capture a significant share of the value chain in the future. Some will invest in multiple, competing technologies in order to diversify their risk and box out their competitors by taking viable options off the table. It will come in the form of organic R&D, acquisitions, and partnerships. All of the above in many cases. The amount of money invested will seem crazy to some but, from the point-of-view of the leading US and Chinese cloud service providers, the investment will pale in comparison to the long-term value to be gained from a leading quantum computing solution.

4. Investments in Quantum will continue to break records

2021 saw 2X year-over-year growth in venture capital investment in quantum computing start-ups, surpassing $1.7B, according to McKinsey. We also saw the first pure-play quantum computing hardware company go public and the first large vertical merger. On top of this, governments and multinational corporations have invested billions of dollars collectively. Some have said it's too much. We've even heard talk of a quantum bubble and a quantum winter.

While the size and growth rate of investment is large, the expected size of the quantum computing market is much larger. We are at a point where quantum computing is turning a corner from scientific research and proofs-of-concept to hardware and software engineers building products with commercial promise in the next few years. More investment is required to deliver large-scale quantum computing that benefits a broad set of industry verticals and use-cases. I expect the amount of money invested in quantum computing to continue rising in 2022+ with more companies entering the race. At least two companies have signaled intent to IPO soon. I wouldn’t be surprised to see at least 5 major acquisitions or IPO announcements before the end of the year.

5. Diversity and inclusion will be a bigger focus in Quantum Information Science (QIS)

The research is clear. Teams who build and celebrate racial, gender, and cultural diversity benefit from a broader range of experiences and ideas, which leads to better products and increased profitability. Like the broader tech industry as a whole, the QIS workforce does not currently represent the diverse population of our society. This is not a new issue, but it's an important one that we need to address. Now is the best time to shape a more diverse workforce, while the industry is young, before the unconscious biases of "good ol' boys" networks have a chance to form. We need to continue to encourage students to enter STEM programs, with a particular focus on sponsoring underrepresented minorities and women in quantum physics and engineering programs to train them for future jobs in QIS.

While there is strong competition for scarce talent among the industry players, I think we will see more cooperation in attracting talent to the ecosystem and career development support for women and minorities. As the technology matures and investments in commercialization increase, we are starting to see a much broader set of job roles available and required to grow the market. As a result, the available talent pool in 2022 is going to expand far beyond the physics labs of our universities. With additional job roles, an expanded talent pool, and the exciting opportunity in quantum computing, there is no excuse for not being able to attract a more diverse workforce who can help shape the future of the industry and benefit from the value created. It’s a long game, but we can move the needle this year.

6. Regional Quantum Centers of Excellence will enable tighter collaboration

It feels like there is a quantum computing startup or research lab popping up in every corner of the planet. That's a good thing. Many of these companies and institutions are doing important R&D and carving out unique parts of the solution stack to make quantum computing a reality. It takes a village to build an integrated system of complex hardware, software, and services that interoperate to deliver a superior user experience.

In 2022, constellations of collaborators, organized by national and regional interests, will emerge among university and government research labs and private companies in their regions. Component suppliers, systems hardware companies, and software platform companies will begin working more closely together, forming commercial alliances. They will share specs, IP, and customers, and will integrate and test their products together to deliver turn-key solutions (well, at least solutions that don't require a physicist to operate them). These collaborative partnerships will provide environments of innovation to accelerate technology and market development in their regions in an effort to gain an advantage over other regions in the quantum computing race.

2022 is going to be an exciting year on the journey toward scalable quantum computing. We have a lot of work to do. If you are interested in connecting or talking more about these predictions, you can reach me via LinkedIn. Here’s to an innovative quantum New Year!

The Building Blocks for Quantum Advantage

By: Jonathan King, Co-Founder & Chief Scientist, Atom Computing

Imagine a world where researchers could discover a new drug or cure that's not even possible today due to computing limitations, or where companies dealing with complex logistical challenges could predict, model, and simulate new delivery routes in a matter of minutes. How might this be possible one day? Fast forward to quantum computing.

Quantum computing is more than a buzzword these days; it's a technology that is moving into the mainstream consciousness. Stated simply, quantum computing makes use of the properties of quantum mechanics to perform calculations. For certain classes of problems, quantum computing provides a path to solutions that are not feasible to solve with classical computers of any scale, present or future. Quantum algorithms promise to deliver value across a variety of applications. Early adopters of quantum computing span chemistry, pharmaceuticals, logistics, financial services, and more. For example, quantum optimization algorithms can discover more efficient solutions to routing and scheduling problems, while molecular simulations will aid in chemical design and drug discovery.

So how do we take advantage of quantum computing?

In order to enable these applications, a sufficiently high number of quantum bits (qubits) is required. Qubits that are both high-quality (stable, with long coherence times) and high-fidelity (with low error rates) are the key to realizing quantum advantage. Many vendors today are producing qubits leveraging a variety of different technologies, but a path to error correction is needed to maximize quantum advantage in the future.

Atom Computing's First-Generation Quantum Computer, Phoenix - Berkeley, California
First-Generation Quantum Computer, "Phoenix", Berkeley, California - screen showing an array of qubits

The Pathway to Error-Correction

At Atom Computing, our singular mission is to build systems with high-fidelity, scalable qubits capable of performing fault-tolerant, error-corrected quantum computing. Quantum error-correction is the process by which quantum information is redundantly encoded, and errors are detected and then corrected by a classical control system. While significant, and indeed worthwhile, effort is being expended to develop uses for so-called Noisy Intermediate Scale Quantum (NISQ) computers, the truly transformative applications of quantum computing, such as large-scale exact simulations of molecular properties, are expected to require fault-tolerant, error-corrected quantum computers. This outlook underlies our motivation to deliver scalable, high-quality neutral atom qubits.

In order to achieve fault-tolerant, error-corrected quantum computing, a large number of qubits is required. For example, it is expected that each error-corrected “logical” qubit will need to be encoded in thousands of physical qubits in order to enable beyond-NISQ algorithms. These qubits must be highly coherent and will require a scheme to individually control the atoms in order to scale to a large array of qubits. It’s these fundamental requirements that motivated the development of our nuclear spin qubits in optically-trapped neutral atoms.
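To put that overhead in rough numbers, here is a back-of-the-envelope sketch; the one-thousand-to-one encoding ratio is an assumed, order-of-magnitude figure (real codes and error rates vary widely), not a specification of our system.

```python
# Back-of-the-envelope overhead estimate with an assumed encoding ratio.
physical_qubits = 1_000_000
physical_per_logical = 1_000   # assumed overhead; depends on code and error rate

logical_qubits = physical_qubits // physical_per_logical
print(f"~{logical_qubits} logical qubits from {physical_qubits:,} physical qubits")
```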

Nature’s Perfect Qubit

Our qubits are based on strontium atoms trapped in optical tweezers, which are highly focused beams of laser light that attract atoms to the region of highest intensity. In Phoenix, our first-generation quantum system, we can assemble arrays of more than 100 individual strontium atoms by projecting optical tweezers into a vacuum chamber where they trap optically-cooled atoms. Using proprietary radiofrequency-controlled optical systems, we are able to individually control the qubits to perform quantum logic gates. Given the wireless nature of the optical manipulation and the relatively small size of the atom arrays (the spacing between atoms is a few microns), we believe our neutral atoms provide a path to larger-scale quantum computers.

The heart of Phoenix, showcasing where the Atom Computing’s qubits entangle. (First-Generation Quantum Computer, Phoenix - Berkeley, California)

Nature’s Building Blocks for Error-Correction

Coherent qubits are a necessity for high-fidelity qubit operations. An error-corrected quantum computer requires operations beyond quantum logic gates. Mid-circuit measurement, error syndrome extraction and decoding, and active feedback all require time during which the qubits must preserve the quantum information.

By encoding a qubit in the angular momentum state of the strontium-87 nucleus, we produce qubits that are intrinsically insensitive to environmental fluctuations. The qubit resides in the electronic ground state of the atom, avoiding lifetime limitations due to spontaneous decay. The qubit state is essentially unperturbed by the optical tweezers, enabling remarkable isolation which results in coherence times of 42 seconds*.

Analogous to classical computers, at the most basic level a quantum algorithm is a series of gates, which are discrete operations that manipulate the state of the qubits. Logical gate fidelity is limited by the ratio of the time it takes to perform a gate to the coherence time. With coherence times approaching a minute, optimizing this ratio is readily achievable, and qubit lifetime will not be a limiting factor in achieving fidelities above the threshold required by error-correction codes.
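As a rough illustration of that ratio, the snippet below compares the measured coherence time to an assumed single-qubit operation duration; the gate time is an order-of-magnitude placeholder, not a measured Phoenix specification.

```python
# Illustrative ratio of coherence time to gate duration (gate time is assumed).
t2_echo_s = 40.0       # measured echo coherence time reported in the paper
gate_time_s = 1e-4     # assumed ~100 us single-qubit operation (placeholder)

ratio = t2_echo_s / gate_time_s
coherence_limited_error = gate_time_s / t2_echo_s  # crude dephasing error per gate
print(f"coherence / gate time ~ {ratio:.0e}")               # ~4e+05
print(f"coherence-limited error per gate ~ {coherence_limited_error:.1e}")
```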

With neutral atoms as the building blocks for large-scale quantum computers, Atom Computing and its customers will jointly develop solutions to problems that have previously been intractable.

To learn more about Atom Computing's first-generation system, Phoenix, visit our website. For more information on our qubits, read our technical paper on the arXiv.

*https://arxiv.org/abs/2108.04790