Atom Computing adds key leaders to accelerate quantum computing momentum with the U.S. government

August 30, 2023 — Berkeley, CA – Atom Computing announced it has appointed Ken Braithwaite, former Secretary of the Navy, to its Board of Directors and that Greg Muhlner has joined the company as Vice President of Public Sector to lead engagement with the U.S. government.

CEO Rob Hays said the addition of Braithwaite and Muhlner reflects the important role of the U.S. government in the advancement and adoption of quantum computing, noting Atom Computing’s collaborations with the U.S. Department of Defense, U.S. Department of Energy, and the National Science Foundation.

“The United States has a vibrant quantum ecosystem thanks, in part, to investments the federal government has made in quantum computing research and development, workforce initiatives, and procurement,” he said.  “Public-private partnership with our company will help to advance the technology and ensure U.S. leadership in this area of strategic importance. Ken (Braithwaite) and Greg (Muhlner) have extensive federal government experience that will help position Atom Computing as the premier partner to the U.S. in winning the race to large-scale, fault-tolerant quantum computing.”

Braithwaite was sworn in as the Secretary of the Navy in 2020 and previously served as a U.S. ambassador to Norway. He graduated from the U.S. Naval Academy in 1984 and was commissioned as an ensign in the U.S. Navy.  Braithwaite left active duty in 1993 but continued his service in the Navy Reserve while holding several executive leadership positions in private industry.

Muhlner has 15 years of experience in business development and sales to the federal government. Before joining Atom Computing, he was Vice President of Sales for Rebellion Defense and led Navy and U.S. Marine Corps sales at Amazon Web Services.  Muhlner served as a Naval Special Warfare (SEAL) Officer, participating in Operation Enduring Freedom in Afghanistan and Operation Iraqi Freedom. 

“Quantum computing is a disruptive technology that will redefine computing and the complexity of problems that we can solve,” Braithwaite said. “For our national security and to fuel our economy, it is imperative the United States and its allies win the quantum computing race.  I am proud to serve on Atom Computing’s board of directors to help the company achieve its mission in this critical new domain.”

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically trapped neutral atoms. We collaborate with researchers, organizations, governments, and companies to develop world-changing tools and solutions; and support the growing global ecosystem. Learn more at atom-computing.com and follow us on LinkedIn.

Atom Computing and National Renewable Energy Laboratory exploring electric grid optimization using quantum computing

July 20, 2023 — Boulder, CO – Atom Computing and the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) today announced a collaboration to explore how quantum computing can help optimize electric grid operations.

During this week’s IEEE Power and Energy Society general meeting, NREL researchers demonstrated how they incorporated Atom Computing’s atomic array quantum computing technologies into the lab’s Advanced Research on Integrated Energy Systems (ARIES) research platform and its hardware-in-the-loop testing to create a first-of-a-kind “quantum-in-the-loop” capability that can run certain types of optimization problems on a quantum computer.

Dr. Rob Hovsapian, a research advisor at NREL, called the new capability an important step toward understanding how quantum computers can better balance energy loads across an electric grid. 

“Electric grids are increasingly complex as we add new power generation resources such as wind and solar, electric vehicle charging, sensors and other devices,” he said.  “We are reaching the point where electric grids have more inputs and outputs than what our classical computing models can handle. By incorporating quantum computing into our testing platform, we can begin exploring how this technology could help solve certain problems.”

Optimization problems such as managing supply chains, devising more efficient transportation routes, and improving electric grid and telecommunications networks are considered “killer applications” for quantum computing. These are large-scale problems with numerous factors and variables involved, which makes them well suited for quantum computers and the way in which they run calculations. 

Keeping power flowing across an electric grid is a good example of an optimization problem. Power plants, wind turbines, and solar farms must generate enough electricity to meet demand, which can fluctuate depending on the time of day and weather conditions.  This electricity is then routed across miles and miles of transmission lines and delivered to homes, businesses, hospitals, and other facilities in real time.

Initially, NREL and Atom Computing are exploring how quantum computing can improve decision making on the re-routing of power between feeder lines that carry electricity from a substation to a local or regional service area in the event of switch or line downtime. 
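As a toy illustration of why feeder re-routing is an optimization problem, the sketch below brute-forces the best tie-switch configuration for a hypothetical outage. All numbers and names are made up for illustration; this is not NREL's model or data, and real grid optimization involves far more variables.

```python
from itertools import product

# Hypothetical toy model: three tie switches can re-route power between
# feeder lines after a line outage. Each on/off pattern leaves some demand
# unserved (kW); the values below are purely illustrative.
unserved_load = {
    (0, 0, 0): 120.0, (0, 0, 1): 80.0, (0, 1, 0): 95.0, (0, 1, 1): 40.0,
    (1, 0, 0): 70.0,  (1, 0, 1): 25.0, (1, 1, 0): 55.0, (1, 1, 1): 30.0,
}
switch_cost = 5.0  # penalty per closed switch, to discourage needless switching

def total_cost(config):
    """Objective: unserved load plus a small penalty per switching action."""
    return unserved_load[config] + switch_cost * sum(config)

# Classical brute force works only while the number of switches is small:
# the search space doubles with every added switch, which is exactly the
# combinatorial growth that quantum optimization approaches aim to tame.
best = min(product([0, 1], repeat=3), key=total_cost)
print(best, total_cost(best))
```

With three switches there are only eight configurations to check; with a few hundred, exhaustive search becomes infeasible for classical solvers, which is why problems of this shape are candidates for quantum approaches.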

“Right now, operators primarily rely on their own experience to make this decision,” Hovsapian said. “This works but it doesn’t necessarily result in an optimal solution.  We are evaluating how a quantum computer can provide better data to make these decisions.”

Atom Computing CEO Rob Hays called the project an important example of how private industry and national laboratories can collaborate on quantum computing technology and valuable use case development. 

“Collaborations like this are extremely important for advancing quantum computing and scientific research,” Hays said.  “NREL is a global leader in renewable energy and electric grids.  We are proud to partner with them to advance their research.”

To learn more about Atom Computing visit: https://atom-computing.com.

###


Meet Qualition: Winners of the Atom-sponsored Visualization Challenge

Kortny Rolston-Duce, Director of Marketing Communications

In February, quantum researchers, developers and enthusiasts gathered online to listen to speakers, complete coding challenges, and compete in hackathons as part of QHack 2023.  

More than 2,800 people from 105 countries participated in this year’s event, which was organized by Xanadu and was part expo, part hackathon, and part scientific conference.  Since its inception in 2019, the multi-day QHack has become one of the largest online gatherings for the quantum computing community.

As part of its ongoing support for the growing quantum ecosystem, Atom Computing sponsored the visualization hackathon for QHack 2023.  A team from Qualition won the challenge after presenting a prototype for a quantum image processing framework.

We met with Qualition CEO Alish Shamskhoozani, Chief Technology Officer Amirali Malekani Nezhad, and other members of the team to learn more about their organization, their winning project, and why they believe image processing is a killer application for quantum computing.

First off, tell us about Qualition.  Can you please describe what Qualition is and does?

Qualition was launched in 2022 by a group of researchers and developers located around the world with the common goal of advancing the nascent quantum computing industry and turning hype around this technology into reality.  Qualition, which is derived from “Quantum Coalition,” has expertise in a wide variety of quantum computing applications.

We take great pride in being mentored by Womanium, an organization dedicated to promoting diversity and inclusion in the field of quantum technology.  Moreover, we are honored to have technical ambassadors from some of the top quantum technology companies in the industry, including Quantinuum, qBraid, QWORLD, and CQTech. These partnerships provide us with the expertise and guidance of some of the most respected names in the field and enable us to collaborate on projects that have the potential to make a significant impact.

Whether through our research initiatives, our participation in quantum hackathons, or our partnerships with industry leaders, Qualition is committed to making a meaningful contribution to the advancement of quantum technology.

What project did your team complete for the QHack 2023 challenge?

Qualition presented a prototype for a quantum image processing framework built on efficient quantum information encoding algorithms: a distributed amplitude encoder and a linearly scalable FRQI encoder inspired by the paper “Quantum pixel representations and compression for N-dimensional images” by Mercy G. Amankwah, Daan Camps, E. Wes Bethel, Roel Van Beeumen, and Talita Perciano. The framework is capable of tackling high-dimensional data.  We demonstrated its features and performance through two branches of applications: Recognition and Alteration use cases.

Recognition encompasses all classes of classification tasks, whereas Alteration encompasses all classes of filters, and provides a practical, fully-quantum framework for enhancing or processing images using techniques with similarities to traditional image processing. The two methods taken together can accommodate all traditional image processing algorithms in a fully quantum mechanical framework.

We presented a comprehensive analysis of a variety of pipeline options and assessed overall performance as a function of the scale of the model, the fidelity of the encoder, and the accuracy of the trained model. This work serves as the foundation for Qualition’s Quantum Image Processing framework.
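To make the encoding idea concrete, here is a minimal amplitude-encoding sketch in NumPy. It is a generic textbook illustration, not Qualition's actual encoder: an N-pixel image is flattened, zero-padded to a power of two, and normalized so its pixel values become the amplitudes of an n-qubit state.

```python
import numpy as np

def amplitude_encode(image):
    """Encode a grayscale image's pixels as amplitudes of a quantum state.

    The 2^n amplitudes of an n-qubit state can hold 2^n pixel values,
    so an N-pixel image needs only ceil(log2(N)) qubits.
    """
    pixels = np.asarray(image, dtype=float).ravel()
    n_qubits = int(np.ceil(np.log2(len(pixels))))
    padded = np.zeros(2 ** n_qubits)
    padded[: len(pixels)] = pixels
    state = padded / np.linalg.norm(padded)  # normalize so sum |a_i|^2 = 1
    return state, n_qubits

# A 4x4 image (16 pixels) fits in just 4 qubits; a 1024x1024 image in 20.
state, n = amplitude_encode(np.arange(16).reshape(4, 4))
print(n, np.isclose(np.sum(state ** 2), 1.0))
```

The exponential compression (2^n pixels in n qubits) is what makes such encoders attractive for the high-dimensional data discussed above, though loading the state efficiently on real hardware remains a research problem in its own right.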

What were the team’s findings? Was there an advantage to using quantum computing?

Image processing is founded on the mathematics of linear algebra: images are mathematically represented as vectors (similar to quantum states), and the operations applied to images, also known as filters, are mathematically represented as matrices (similar to quantum operators).

Given the mathematical parallels between image processing and quantum mechanics, we can treat image processing as a quantum evolution, and thus anticipate a significant advantage when performing image processing tasks on quantum hardware. Quantum parallelism and the presence of quantum interference allow us to achieve a speedup when performing these tasks by modeling images as states and filters as unitary operations. Assuming the near-term arrival of more fault-tolerant quantum hardware with higher quantum volumes, we can expect to tackle large classification tasks much faster than with the equivalent classical paradigms.

The Recognition model is perhaps the most advantageous application of a quantum-enabled image processing model, courtesy of the low dimensionality of the output, which is usually a label and therefore easily extracted with a low number of shots. The Alteration model, in contrast, faces an inherent challenge: it requires the user to read out a large state vector, which in turn requires a considerably higher number of shots for an adequately accurate demonstration.  As quantum computing hardware improves, however, so will the potential advantage of employing the quantum filters characteristic of Alteration use cases to carry out quantum image processing. Furthermore, our current research paves the way for the development of feasible generative quantum AI models, something we are striving for in the year to come!

In conclusion, we are proud to demonstrate initial evidence that quantum image processing is a promising avenue for performing classification with large models. The current body of work at Qualition can be further extended to a variety of modalities and generalized into a first-of-its-kind “High Dimensional QML” feasible on current hardware. These pursuits will be integral to demonstrating quantum hardware’s advantage for real-life industrial use cases across a vast array of fields, and certain aspects of the framework can be extended to other areas of quantum computing, e.g., large-scale QAOA, more complex quantum drug simulation, and even faster quantum key distribution.

Why is image processing a strong application for quantum computing?

Image processing is an inherently resource-intensive task due to its high-dimensional nature. Classical paradigms have performed quite well, as can be seen in a variety of papers and products covering the classification, alteration, and generation aspects of AI and machine learning. However, the computational cost of these models and the time taken to train them grow as we scale them up, and may impact the environment due to the increased demands on classical computing hardware.

Given the mathematical similarities between image processing and quantum mechanics, we can demonstrate a significant advantage when performing the classification tasks using quantum hardware. Assuming we can feasibly implement the models on current hardware, we can expect significant speedups in training and inference time as compared to the equivalent classical models.

That said, image processing is perhaps the perfect use case for demonstrating the quantum advantage in industrial use-cases for the NISQ era hardware, for both scientific as well as artistic endeavors.

Why did Qualition select the visualization challenge?

Our Quantum Image Processing framework provides a user-friendly illustration for understanding the effect of quantum evolution operators on a given state, similar to that of a Bloch Circle, but generalized to N-dimensional states.

Given the mathematical correspondence, we can infer that real-valued quantum states can be represented as images, and thus we can visually observe the effect of applying any real-valued quantum operation to the state through the change in the state’s image representation. 

This visualization can be expanded into a procedural generation illustrating the gradual change in the quantum state as an operation is applied incrementally, e.g., applying an RY gate while gradually increasing the rotation angle.
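As a rough sketch of that idea (illustrative only, not Qualition's framework), the following NumPy snippet sweeps the RY rotation angle and prints the real-valued amplitudes that each frame of such a procedural visualization would render as an image:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued, so the state stays real)."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

state = np.array([1.0, 0.0])  # start in |0>
for theta in np.linspace(0, np.pi, 5):
    rotated = ry(theta) @ state
    # Each frame of the visualization would render these real amplitudes
    # (here just two "pixels") as an image.
    print(f"theta={theta:.2f}  amplitudes={np.round(rotated, 3)}")
```

Because RY keeps the state real-valued, every intermediate frame has a faithful image representation, which is the property the interview describes.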

What are some areas/industries in which quantum-enabled image processing could be used?

Quantum image processing can be applied to virtually any image processing task present in industry. Some examples that would benefit considerably from such a quantum-enabled model are medical imaging (e.g., cancer detection, disease classification, and segmentation for neurological diseases), satellite imaging (e.g., object detection, land cover and land use detection, and change detection), and real-time unmanned vehicle routing (e.g., autonomous vehicles, unmanned drones, and robotics).

Quantum-enabled image processing obviates the need to remove certain features for faster inference, a necessary step in classical deep learning models. Therefore, using quantum techniques, we can capture and retain more information. In the case of medical imaging, the ability to retain key information may be the difference between finding a small tumor or not.

We look forward to seeing what the Qualition team does next!

Atom Computing Demonstrates Key Milestone on Path to Fault Tolerance

Rob Hays, CEO

Today, researchers at Atom Computing released a pre-print publication on arXiv, demonstrating the ability to perform mid-circuit measurement on arbitrary qubits without disrupting others in an atomic array. The team applied mid-circuit measurement to detect the loss of qubits from the array (a well-known, anticipated error) and successfully replaced the lost qubits from a nearby reservoir.

Path to Fault Tolerance

At Atom Computing, we believe the true potential of quantum computing will be achieved when devices are capable of fault-tolerant computation.  Our company’s focus is on leading that race to unlock the enormous potential of quantum computing applications for industrial and scientific uses. 

Dr. John Preskill, the famed theoretical physics professor at the California Institute of Technology who coined the phrase Noisy Intermediate-Scale Quantum (NISQ) to describe the current stage of quantum computing, said it best in 2018: “We must not lose sight of the essential longer-term goal: hastening the onset of the fault tolerant era.”

It will likely require hundreds of thousands or millions of physical qubits to achieve fault-tolerant systems that can operate continuously and deliver accurate results, overcoming any errors that occur during computation just as classical computing systems do.  Mid-circuit measurement is one of several key building blocks required to achieve fault-tolerant systems.

Mid-circuit measurement has been demonstrated on superconducting and trapped-ion quantum technologies.  Atom Computing, however, is the first company to do so on a commercial atomic array system.  Recent achievements in the neutral atom research community have shown that atomic arrays are emerging from “dark horse” status to become a preferred architecture with intriguing potential.

Importance of Mid-Circuit Measurement

In quantum computing, circuits act as instructions that tell a quantum computer how to perform a calculation.  Circuits define how the programmer intends for the qubits to interact, which gates they need to complete, and in what order they need to be performed.

Mid-circuit measurement involves probing the quantum state of certain qubits, known as ancillas, without disrupting the nearby data qubits that perform calculations.  The ability to measure or read out specific qubits during computation without disrupting the rest is essential for quantum developers.  It enables them to glimpse inside a calculation and use conditional branching to determine which action to take based on the results of the measurement, similar to IF/THEN statements used in classical computing. With this capability, errors can be detected, identified, and corrected in real time.
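The concept can be illustrated with a small state-vector simulation (this is a generic two-qubit toy model, not Atom Computing's hardware protocol): an ancilla is measured mid-circuit, the state collapses consistently with the outcome while the data qubit is untouched, and a classical IF/THEN branch applies a corrective gate.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two-qubit state |data, ancilla> as a length-4 vector over the basis
# |00>, |01>, |10>, |11>. Prepare the data qubit in (|0>+|1>)/sqrt(2) and
# the ancilla in |1>, as if the ancilla had flagged an error.
state = np.array([0.0, 1.0, 0.0, 1.0]) / np.sqrt(2)

# --- Mid-circuit measurement of the ancilla (second qubit) ---
p1 = state[1] ** 2 + state[3] ** 2           # probability ancilla reads 1
outcome = int(rng.random() < p1)
keep = [1, 3] if outcome == 1 else [0, 2]    # amplitudes consistent with outcome
collapsed = np.zeros(4)
collapsed[keep] = state[keep]
collapsed /= np.linalg.norm(collapsed)       # data qubit is left undisturbed

# --- Conditional branch (classical IF/THEN on the measurement result) ---
if outcome == 1:
    # e.g., apply a corrective X to the ancilla to reset it to |0>
    X_on_ancilla = np.kron(np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]]))
    collapsed = X_on_ancilla @ collapsed

print("ancilla outcome:", outcome)
print("state after branch:", np.round(collapsed, 3))
```

After the branch, the ancilla is back in |0> while the data qubit still holds its superposition, which is exactly the "measure some qubits without disturbing the rest, then act on the result" capability described above.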

Dr. Ben Bloom, Atom Computing Founder and Chief Technology Officer, called this demonstration an important step for the company’s technology, which uses lasers to hold neutral atom qubits in a two-dimensional array to perform computations.

“This is further proof that atomic array quantum computers are rapidly gaining ground in the race to build large-scale, fault-tolerant quantum computers,” Bloom said. “Mid-circuit measurement enables us to understand what is happening during a computation and make decisions based on the information we are seeing.”

Doing this is tricky. Qubits, whether they are in an atomic array, in an ion trap, or on a chip, sit microscopically close together. Qubits are finicky, fragile, and sensitive.  A stray photon from a laser or a stray electric field can cause the wrong qubit to decohere and lose its quantum state.

The Atom Computing team demonstrated a technique to “hide” data qubits and shield them from the laser used to measure ancillas, without losing any of the quantum information stored in the data qubits. They also showed competitive SPAM (state preparation and measurement) fidelity, a metric for how well a qubit can be prepared and read out. This work demonstrates an important pathway to continuous circuit processing.

What’s Next

Atom Computing is building quantum computers from arrays of neutral atoms because of the potential to significantly scale qubit numbers with each generation.  We previously demonstrated record coherence times on our 100-qubit prototype system and are now working on larger scale production systems to offer as a commercial cloud service.  Our demonstration of mid-circuit measurement, error detection, and correction was performed on these next-generation systems.

Our journey toward fault tolerance continues. We are working to achieve all the necessary “ingredients” described above on our current systems and on future machines with the Defense Advanced Research Projects Agency.  DARPA selected Atom Computing to explore how atomic arrays of neutral atoms can accelerate the path to fault-tolerant quantum computing.


Atom Computing Welcomes New Scientific Advisors

Dr. Jonathan King, Chief Scientist and Co-Founder

I am pleased to welcome two renowned researchers in the field of quantum information science to Atom Computing as independent scientific advisors. 

Dr. Bert de Jong, Senior Scientist and Deputy Director of the Quantum Systems Accelerator at Lawrence Berkeley National Laboratory, and Dr. Eliot Kapit, Associate Professor of Physics and Director of Quantum Engineering at Colorado School of Mines, join our long-time advisor Dr. Jun Ye, Professor of Physics at University of Colorado-Boulder and Fellow at JILA and the National Institute of Standards and Technology.  Together, these scientists and academic leaders will help us advance the state of the art in quantum computing by providing deep technical expertise and guidance to our team.

Since its inception in 2018, Atom Computing has been building quantum computers from atomic arrays of optically trapped neutral atoms with the goal of delivering large-scale fault-tolerant quantum computing for a range of use cases and applications.  Building high-quality quantum computers is hard work that requires tight collaboration across our team of theoretical and experimental physicists and engineers of multiple disciplines. 

We also frequently consult with researchers from other organizations to solve difficult technical challenges.  In addition to our scientific advisors, we have active collaborations with experts from Sandia National Laboratories, DARPA, the University of Washington, the University of Colorado-Boulder, and others to support R&D of the technologies required for our future roadmap.

We are at the point where we are focusing not only on developing our hardware, but also what users can do with it to solve commercial problems at scale.  Bert and Eliot are experts in quantum computing use cases and algorithms.  Their expertise will help our quantum applications team and customers learn how to get the most value out of our hardware platforms.

Bert leads Lawrence Berkeley National Laboratory’s Computational Sciences department and its Applied Computing for Scientific Discovery Group. His group’s research encompasses exascale computing, quantum computing, and AI for scientific discovery.  (View his bio.)

When asked about the role, Bert stated, "Atom Computing's path to universal quantum computing with atom arrays is exciting, and I am honored to be a scientific advisor providing critical input and guidance into their software and algorithm strategies.”

Eliot’s research at Colorado School of Mines focuses on quantum information science, particularly routes to achieve practical and scalable quantum advantage in noisy, near-term hardware.  (View his bio.)

Here is what Eliot said about his new role: “I'm excited to have joined the scientific advisory board at Atom Computing. Neutral atom quantum computing has progressed shockingly quickly in just the past few years, and I say this as someone who's been working on superconducting qubits for the past decade, which certainly haven't been standing still.  I think Atom, in particular, has both a compelling roadmap toward large scale quantum computers and a very capable team to make it happen.”

I am looking forward to future collaborations with Bert, Eliot, and Jun to drive large-scale quantum computing with applications that change the world.


Atom Computing selected by DARPA to accelerate scalable quantum computing with atomic arrays of neutral atoms

January 31, 2023 — Berkeley, CA – Atom Computing has been selected by the Defense Advanced Research Projects Agency (DARPA) to explore how atomic arrays of neutral atoms could accelerate the path to fault-tolerant quantum computing. 

The company received a project award and funding to develop a next-generation system as part of DARPA’s Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program.  According to DARPA, the primary goal of the US2QC program is to determine if an underexplored approach to quantum computing is capable of achieving utility-scale operation much sooner than conventional predictions.

For this project, Atom Computing will focus on the scalability of atomic array-based quantum computing and the company’s capability to produce systems based on the technology.  Atom Computing has demonstrated the speed and scalability of its technology, being the fastest company to develop a 100-qubit prototype and setting record coherence times.

The DARPA-sponsored project will explore new ways to scale qubit count for larger systems, additional layers of entanglement connectivity for faster performance, and a broader set of quantum error correction algorithms for fault tolerance.

“In order to realize the scaling advantages of our quantum computing technology, there are a number of engineering challenges that need to be overcome.  With DARPA’s support, we will be able to accelerate our development timeframe,” said Rob Hays, CEO of Atom Computing.  “We are honored to be selected for such an important program to advance Atom Computing and the United States toward utility-scale quantum computing.”

To learn more about Atom Computing visit: https://atom-computing.com.

###


What are Optical Tweezer Arrays and How are They Used in Quantum Computing? Atom Computing’s Remy Notermans Explains.

In recent months, researchers from different institutions won major physics awards for advancing optical tweezer arrays and their use in quantum information sciences.

These announcements drew broader attention to optical tweezer arrays, even within the physics community.  At Atom Computing, however, they are always top-of-mind: optical tweezers are critical to our atomic array quantum computing technology.

What are optical tweezer arrays and how and why do we use them in our quantum computers? Dr. Remy Notermans, who helped develop the optical tweezer array for Phoenix, our prototype system, answers these questions and more.

What are optical tweezer arrays?
A single optical tweezer is a beam of light used to capture atoms, molecules, cells, or nanoparticles, hold them in place or move them as needed. 

This is possible because light can attract or repel a particle depending on the color (wavelength) of the light and the absorption properties (electronic energy level structure) of the particle.  By choosing the right wavelength, a particle will be drawn to the region with the highest intensity of light, trapped in what is known as a potential well (an energy landscape in which the atom settles to the lowest point).
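As a back-of-the-envelope illustration of that potential well (the numbers below are made up for illustration, not Atom Computing's actual trap parameters), the dipole potential of a red-detuned tweezer is proportional to the negative of the focused Gaussian beam's intensity profile, so the atom is pulled toward the bright center of the beam:

```python
import numpy as np

# Illustrative parameters only.
P = 1e-3       # beam power, W
w0 = 1e-6      # beam waist at the focus, m (tweezer spot ~1 micron across)

def intensity(r, w0, P):
    """Intensity profile of a Gaussian beam at its focal plane."""
    return (2 * P / (np.pi * w0 ** 2)) * np.exp(-2 * r ** 2 / w0 ** 2)

# For a red-detuned tweezer the dipole potential is proportional to -I(r):
# the well is deepest where the light is brightest.
alpha = 1e-36  # proportionality constant (depends on polarizability); illustrative
r = np.linspace(-3e-6, 3e-6, 7)   # positions across the focus, m
U = -alpha * intensity(r, w0, P)
print(np.round(U / U.min(), 3))   # relative well depth; deepest (1.0) at r = 0
```

The minimum of U sits at the beam center, which is the "lowest point" of the energy landscape where the atom is trapped.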

An optical tweezer is created when a laser beam is focused through a microscope objective lens. As the laser beam is focused, it forms a "tweezer" capable of holding minuscule objects and manipulating them at its focal point. Think of the tractor beam from Star Trek.

To create an optical tweezer array, the laser beam is manipulated before it is focused through the microscope objective lens, producing a custom-made array of optical tweezers that can be tailored to specific needs: topology, dimensions, and orientation.

Are optical tweezer arrays a new technology?
Optical tweezers have been used by researchers in the fields of medicine, genetics, and chemistry for decades. In fact, Arthur Ashkin, “the father of optical tweezers,” was awarded the Nobel Prize in Physics in 2018. Ashkin’s work dates to 1970, when he first detected optical scattering and the effect of different levels of force on micron-sized particles.  He and some of his colleagues later observed a focused beam of light holding tiny particles in place – or optical tweezers.

More recent scientific work has expanded to actual arrays of optical tweezers, allowing for studying many particles simultaneously, biophysics research, and of course quantum information processing.

How does Atom Computing use optical tweezer arrays? What are the benefits?
Optical tweezers are critical to our atomic array quantum computing technology, which uses neutral atoms as qubits.  We reflect a laser beam off a spatial light modulator to create an array of many optical tweezers that each “trap” an individual qubit.  For example, Phoenix, our 100-qubit prototype quantum computer, has more than 200 optical tweezers created from a single laser. Each tweezer can be individually calibrated and optimized to ensure precise control. 

Optical tweezer arrays enable us to fit many qubits in a very small amount of space, which means that scaling the number of qubits by orders of magnitude does not significantly change the size of our quantum processing unit.  By integrating clever optical designs, we foresee a sustainable path toward atomic arrays that are large enough for fault-tolerant quantum computing.

In fact, optical tweezers inspired the Atom Computing logo.  If you turn our “A” logo upside down, it is a visual representation of an optical tweezer holding an atom in a potential well.

Are optical tweezer arrays used for other purposes?
Yes, optical tweezer arrays have been used extensively by researchers in other scientific fields. They have been used by scientists to trap living cells, viruses, bacteria, molecules, and even DNA strands so they can be studied.

Has the work of the New Horizons Physics Prize winners influenced Atom Computing’s approach? If so, how? 
We understand this is a fundamental part of the academic-industrial ecosystem, which is why Atom Computing is involved with many partnerships and funds academic research efforts that potentially help us propel our technology forward.  Combined with the knowledge and experience of our world-class engineering teams, we take these breakthroughs to the next level in terms of scalability, robustness, and systems integration.


What Developers Need to Know about our Atomic Array Quantum Computing Technology

Justin Ging, Chief Product Officer

If you are a developer working in the quantum computing space, you are likely familiar with superconducting and trapped-ion quantum computers, or have run a circuit on one.

These two technologies were the early pioneers of the quantum hardware landscape and small versions of each have been available commercially for years.  A major challenge with these approaches is how to scale them to thousands or millions of qubits with error correction.

More recently, an alternative quantum computing technology with the potential to scale much more quickly and easily has emerged: systems based on atomic arrays of neutral atoms.  These systems have inherent advantages, which have led multiple teams to develop them.

But just as there is more than one way to cook an egg, there are different approaches to building quantum computers from atomic arrays.

At Atom Computing, we are pioneering an approach to deliver highly scalable gate-based quantum computing systems with large numbers of qubits, long coherence times, and high fidelities. 

Here are some key advantages of our atomic array quantum computing technology:

  1. Long coherence times.  Most quantum hardware companies measure coherence in units of milliseconds.  We measure it in seconds. The Atom team recently set a record for the longest coherence time in a quantum computer with Phoenix, our first-generation 100-qubit system, which demonstrated qubit coherence times of 21 seconds.  The longer qubits maintain their quantum state, the better: developers can run deeper circuits for more complex calculations, and there is more time to detect and correct errors during computation.  How do we create such long-lived qubits? We use alkaline earth atoms for our qubits. These atoms do not have an electrical charge, thus they are “neutral.”  Each atom is identical, which helps with quality control, and they are highly immune to environmental noise.
  2. Flexible, gate-based architecture.  Atom Computing is building a flexible, agile quantum computing platform that supports a universal quantum gate set and can be programmed using standard industry quantum development platforms.  This gate-based approach allows developers to create a wide range of quantum algorithms for many use cases.  Our qubit connectivity relies on Rydberg interactions: laser pulses excite atoms to a highly energized state in which their electrons orbit the nucleus at a much greater distance than in the ground state, allowing the atoms to interact with their neighbors.
  3. Designed to scale.  Neutral atoms can be tightly packed into a computational array of qubits, making the quantum processor core just fractions of a cubic millimeter.  Lasers hold the atomic qubits in position in this tight array and manipulate their quantum states with pulses of light to perform computations. This arrangement of individually trapped atoms, spaced only microns apart, allows for massive scalability: the qubit array can grow without substantially changing the overall footprint of the system.  For example, with atoms arranged in a 3D array at a 4-micron pitch, a million qubits could fit in less than one-tenth of a cubic millimeter.
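To illustrate the gate-based model described in point 2, here is a minimal two-qubit statevector sketch in plain Python.  It applies a Hadamard followed by a CNOT (both drawn from a universal gate set) to prepare a Bell state.  This is a generic illustration only, not Atom Computing's SDK or any particular vendor's API.

```python
import math

# 2-qubit statevector, basis order |00>, |01>, |10>, |11> (qubit 0 = left bit)
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_q0(s):
    """Apply H to qubit 0: mixes the |0x> and |1x> amplitudes."""
    r = 1 / math.sqrt(2)
    return [r * (s[0] + s[2]), r * (s[1] + s[3]),
            r * (s[0] - s[2]), r * (s[1] - s[3])]

def cnot_q0_q1(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = cnot_q0_q1(hadamard_q0(state))
# Bell state (|00> + |11>)/sqrt(2): equal probability of measuring 00 and 11
probs = [round(a * a, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

On real gate-based hardware the same two-gate sequence would be expressed through whichever development platform the system supports; the entangled output is what enables algorithms beyond classical simulation.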

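The packing claim in point 3 can be checked with back-of-envelope arithmetic (assuming, for illustration, a 100 × 100 × 100 cubic array):

```python
# Back-of-envelope check: a 100 x 100 x 100 cubic array of neutral atoms
# (one million qubits) at a 4-micron pitch.
pitch_um = 4.0
n_per_side = 100                       # 100**3 = 1,000,000 qubits
side_um = (n_per_side - 1) * pitch_um  # 396 microns edge to edge
volume_mm3 = (side_um * 1e-3) ** 3     # microns -> mm, then cube the side
print(n_per_side ** 3, round(volume_mm3, 3))  # 1000000 0.062
```

At roughly 0.06 cubic millimeters, the array indeed fits within the stated one-tenth of a cubic millimeter.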
Developers looking for gate-based quantum computers with large numbers of qubits and long coherence times should partner with Atom Computing.  We are working with private beta partners to facilitate their research on our platforms. Have questions about partnering? Contact us.

Silicon Valley Up-Start, Atom Computing, Chooses Colorado to Build Next-Generation Quantum Computers

September 28, 2022 — Boulder, CO — Atom Computing today announced the opening of its new research and development facility in Boulder during a ceremony attended by industry and academic partners, officials from federal, state, and local government, and representatives from Colorado’s Congressional delegation.

The new facility is Atom’s largest to date and will house future generations of its highly scalable quantum computers, which use atomic arrays of optically-trapped neutral atoms. The company opened its first office, which also serves as its global headquarters, in Berkeley, California in 2018.

Governor Jared Polis called the Boulder facility a significant and important investment in Colorado and evidence the state is emerging as the preeminent hub for quantum computing innovation in the U.S. and globally.

“We are excited to welcome Atom Computing to Boulder, which is already one of the world’s most booming centers for the quantum computing sector,” Polis said. “The addition of Atom Computing helps further position Colorado as an economic leader for the next big wave of technology development and will create more good-paying jobs for Coloradans.”

The Boulder facility represents an important milestone for Atom Computing, which raised $60 million through a Series B earlier this year to build its second-generation systems. The company’s 100-qubit prototype system, Phoenix, is housed in Berkeley and recently set an industry record for coherence time.

“Leading researchers and companies are choosing to partner with Atom Computing to develop quantum-enabled solutions because our atomic arrays have the potential to scale larger and faster than other qubit technologies,” said Rob Hays, CEO of Atom Computing.

Hays said the company chose Colorado because of the quantum expertise and top talent in the area and plans to expand its presence in the state. 

“We expect to invest $100 million in Colorado over the next three years as we develop our roadmap and hire more employees to support those efforts,” he said.

Ben Bloom, Atom Computing’s founder and CTO, said the company’s strong ties to Colorado also contributed to its decision to build a facility in Boulder.

“Many of our team members, myself included, have connections with local universities,” said Bloom, who earned a Ph.D. from the University of Colorado Boulder, where he helped renowned physicist Jun Ye build one of the world’s most accurate atomic clocks. “We are committed to Colorado.”

Jun Ye, who currently serves as Atom’s Scientific Advisor, called the new facility an important addition to the quantum ecosystem.

"It is extremely gratifying to see our recent CU graduates emerge as the early trailblazers of the rapidly growing quantum industry,” said Ye, a physics professor at CU Boulder. “This creates a powerful ecosystem for the best science and technology to develop side-by-side, providing outstanding opportunities for Colorado students to lead the next wave of innovations in quantum research and the market.”

To learn more about Atom Computing visit: https://atom-computing.com.

###

About Atom Computing

Atom Computing is building scalable quantum computers with atomic arrays of optically-trapped neutral atoms, empowering researchers and companies to achieve unprecedented breakthroughs. Learn more at atom-computing.com, and follow our journey on LinkedIn and Twitter.