👋 Hi there! My name is Felix and welcome to Deep Tech Demystified! Once a month I will publish topical overviews or deep dives into Deep Tech topics that excite me. I will try to make the writing comprehensible and accessible for people from all backgrounds, whether that's business, engineering or something else entirely.
🙋🏻♂️ A Short Preface
Hi there, glad you're interested in this edition of Deep Tech Demystified on quantum computing! As quantum computing is quite a hefty topic to cover, I will split it over more than one blog post. This first one gives a surface-level perspective on the history and concepts of quantum mechanics and quantum computing, the different technology stacks, and their use cases. The second blog post will cover universal quantum computing and the underlying qubit technologies in more depth. A third blog post might cover algorithms, software and adjacent topics. Please let me know if you're interested in other areas of quantum and want them covered here!
PS: Check out Quantum Computing for Dummies by whurley and Floyd Smith, it’s a great first deep dive into the realm of quantum, even if you’re not technical.
📜 The History of Quantum Computing
Quantum computing, as an approach to solving problems beyond the reach of classical computers, is deeply rooted in discoveries in physics from the early 20th century. These discoveries started two large waves of technology disruption, as explained later on. Quantum computing promises to tackle some of the most complex problems we currently face as humanity, where classical computing falls short, like climate modelling and materials discovery.
I previously covered the broader space of (classical) computing, its current limitations and their implications. If you're completely new to computing, check out that article first to get a sense of the importance and the timing of quantum!
Quantum computing is based on quantum mechanics, the branch of physics that describes nature at the smallest scales, such as atoms and subatomic particles like photons and electrons. At this scale, our way of thinking from classical physics fails, because these particles behave fundamentally differently. Just take Albert Einstein's 1905 work on the photoelectric effect, which showed that "physical entities" at this scale possess both wave-like and particle-like characteristics. This is what makes quantum mechanics so difficult to comprehend - even for well-trained physicists and scientists.
🔬 Quantum Technology 1.0
The triggering concept behind the first wave of quantum technologies was the quantum mechanical principle of quantization, the realization that the universe - matter and energy - is "granular". Atoms, as the basic unit of matter, consist of protons (positively charged particles) and neutrons (particles without charge) in the nucleus, with electrons (negatively charged particles) orbiting around it. Max Planck proposed in 1900 that energy is emitted and absorbed in discrete packets, or quanta; Niels Bohr later built on this and showed that electrons occupy distinct energy levels, or orbitals, around the nucleus of an atom. If an electron drops to a lower (discrete) energy level, a photon carrying exactly that "quantized" energy difference is emitted. With those insights, radiation - such as heat and light - can be broken down into tiny, discrete, indivisible units, challenging the assumption of continuity from classical physics.
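If you like numbers, here is a tiny illustration in Python (my own sketch, not taken from any quantum library) using the hydrogen atom's well-known energy levels: when the electron drops from level 3 to level 2, a photon with exactly that energy difference is emitted - the familiar red line of hydrogen.

```python
# Toy sketch: energy quantization in the hydrogen atom (Bohr model).
# When an electron drops from level n=3 to n=2, a photon carrying exactly
# the energy difference is emitted (the red H-alpha line).

H_TIMES_C_EV_NM = 1239.84   # Planck constant times speed of light, in eV*nm
RYDBERG_EV = 13.6           # hydrogen ground-state binding energy in eV

def level_energy(n: int) -> float:
    """Discrete (quantized) energy of level n in the Bohr model, in eV."""
    return -RYDBERG_EV / n**2

delta_e = level_energy(3) - level_energy(2)        # energy released by the 3 -> 2 drop
wavelength_nm = H_TIMES_C_EV_NM / delta_e          # E = h*c / wavelength

print(f"Photon energy:     {delta_e:.2f} eV")      # ~1.89 eV
print(f"Photon wavelength: {wavelength_nm:.0f} nm")  # ~656 nm, visible red light
```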
Scientists and engineers made use of this quantization in the first wave of quantum technologies, working with large groups of quantum particles like photons and electrons and focusing on influencing their collective behavior rather than manipulating individual particles. This approach was chosen primarily because the sophisticated tools needed for single-particle manipulation did not exist at the time - as you can imagine, handling, say, a single photon is very difficult. (A photon, by the way, is the "quantum", or fundamental unit, of electromagnetic radiation.) The primary goal in this first phase was to ensure that these particles acted uniformly as a group. Despite these limitations, the foundational phase of Quantum Technology 1.0 led to the development of crucial tools and techniques that paved the way for the more advanced quantum technologies of the next wave. A significant invention emerging from this era is the solar cell, which generates electricity by setting electrons free when hit by electromagnetic radiation, i.e. photons. Other inventions include electron microscopes, transistors and lasers.
Another concept of quantum mechanics discovered in that era, and relevant shortly, is called uncertainty. It states that we cannot precisely know both the position and the momentum (roughly, the speed) of a fundamental particle, like an electron, at the same time. The more precisely we measure one of the properties, the more uncertain we become about the other. This doesn't mean that measurements at this scale are inaccurate; it reflects the inherently uncertain behavior of anything that is wave-like (remember, anything at this scale behaves as a particle as well as a wave!). Imagine a wave with many ups and downs. Where is the wave, exactly? It is spread out over many periods, so you can't really tell, because it is essentially everywhere. But from the many peaks you can easily read off its frequency, and thus its momentum. Conversely, with a short wave pulse you can tell the position of the wave, but it is really hard to make out its frequency. This way of thinking again differs fundamentally from classical physics, where it is easy to measure both the position and the speed of a body of dense matter.
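To put a number on it, here is a minimal sketch of Heisenberg's uncertainty relation, dx * dp >= hbar / 2 (the constant is standard; the example particle is just an illustration):

```python
# Minimal sketch of Heisenberg's uncertainty relation: dx * dp >= hbar / 2.
# The more precisely the momentum is pinned down, the less can be said about
# the position, and vice versa.

HBAR = 1.054571817e-34  # reduced Planck constant in J*s

def min_position_uncertainty(dp: float) -> float:
    """Smallest possible position uncertainty (metres) for a given
    momentum uncertainty dp (kg*m/s)."""
    return HBAR / (2 * dp)

# An electron whose momentum is known to within 1e-24 kg*m/s ...
print(min_position_uncertainty(1e-24))  # ~5.3e-11 m, roughly the size of an atom
```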
⚛ Quantum Technology 2.0
The second wave of quantum technology innovation is currently taking place and can be labeled Quantum Technology 2.0. It combines knowledge from classical computing, quantum mechanics and the technologies of the first quantum wave. Out of it emerged an interdisciplinary field called quantum information science (QIS), which essentially studies how quantum systems can be used to encode, transmit, manipulate and measure information in a fundamentally different way from classical systems. The basis of those quantum systems are the so-called qubits ("quantum bits"), the quantum computing version of bits. They can be built from different small-scale particles like photons and electrons, but also from circuits of superconducting material. The next blog post will cover the underlying qubit technologies in detail, so stay tuned!
One of the fundamental differences between qubits and bits is superposition: the ability of a qubit to be not in a single defined state but in a combination of several possible states during computation, until it is measured. In a sense, a qubit is in multiple states at once! Thus, a qubit can hold, for example, both the values 0 and 1 simultaneously, while a bit can only hold a value of either 0 or 1. The possible states of a qubit are represented on a Bloch sphere as shown below. Once the computation result is measured, each qubit collapses to either state 0 or state 1. This principle of superposition during computation gives quantum systems a massive advantage in parallel computation: a quantum computer with n qubits can represent and process 2^n states simultaneously, allowing it to explore a vast computational space with relatively few qubits.
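If you want to see this in code, here is a toy state-vector simulation in plain NumPy (no quantum hardware or SDK involved, just my own sketch) showing that n qubits in equal superposition carry 2^n amplitudes at once, and that measurement collapses them to a single classical outcome:

```python
import numpy as np

# Toy state-vector simulation: an n-qubit register is a vector of 2**n
# complex amplitudes. A Hadamard gate on every qubit puts the register into
# an equal superposition of all 2**n basis states.

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate for one qubit

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

for q in range(n):
    state = apply_single_qubit_gate(state, H, q, n)

print(np.abs(state) ** 2)    # eight equal probabilities of 1/8: all 2**3 states at once

# "Measuring" collapses the superposition to one classical bit string:
outcome = np.random.choice(2**n, p=np.abs(state) ** 2)
print(format(outcome, f"0{n}b"))
```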
Another important concept of quantum systems is called entanglement. In classical computing, bits are independent of one another: each bit is in the state 0 or 1 regardless of what any other bit is doing. Qubits, on the other hand, can be entangled with each other, meaning that the state of one qubit can depend on the state of another, and they remain linked even when separated over vast distances. This connection is instantaneous: if you measure something about one particle, you immediately know the corresponding property of the other entangled particle, even if they are on different planets! This phenomenon was famously called "spooky action at a distance" by Albert Einstein in 1947.
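A tiny toy illustration of entanglement (again plain NumPy, simulating a so-called Bell state, the simplest entangled two-qubit state): each outcome on its own looks random, yet the two qubits always agree.

```python
import numpy as np

# Toy sketch of entanglement: the Bell state (|00> + |11>) / sqrt(2).
# Each qubit on its own is perfectly random, yet the two always agree.

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # amplitudes for 00, 01, 10, 11
probs = np.abs(bell) ** 2

samples = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)   # only "00" and "11" ever appear: measure one qubit, know the other
```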
Now add entanglement to superposition: while superposition - as stated earlier - allows the simultaneous representation and processing of information across multiple states, entanglement enables coordinated, complex operations across these states, amplifying the benefits of superposition. This synergy allows quantum computers and systems to perform certain calculations much more efficiently than classical computers, potentially solving problems that are currently beyond our reach.
🧗🏼♂️ Current Challenges to be Overcome
As promising as quantum may seem to you now, there are several technological roadblocks on its way to becoming truly world-changing. Qubits, the building blocks of quantum computation, are highly sensitive devices that only work under certain constraints, are prone to errors, and are hard to scale at the device level. Imagine wanting to do computations on a single photon that wants to race around the universe at nearly 300,000 km/s. And to become computationally interesting, you need dozens of those in an entangled state. Not the easiest task at hand.
To be able to compute on qubits, they need to be kept in a state of coherence during the whole computation. Coherence essentially means being free from interaction with the environment. What causes decoherence of a qubit? Well, essentially everything, from heat to vibration and collisions. Thus, qubits usually need to be kept extremely cold, close to absolute zero, the lowest temperature possible (−273.15 °C). Strong electromagnetic fields are also often required to isolate the quantum particles from their environment. Additionally, quantum operations on qubits are quite prone to errors due to the fragile nature of qubits as well as imperfect control mechanisms. The current approach to combating this problem is to add more qubits to the system to detect and correct calculation errors - while at the same time it is not easy to add additional qubits to a quantum system with a reasonable footprint. In short, technical challenges currently make it very difficult to reliably scale quantum systems to a level where they really shine and outperform today's high-performance computers.
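To get a feel for why adding qubits helps with errors, here is a purely classical analogue - a three-bit repetition code with majority voting. Real quantum error correction is far subtler, but the redundancy trade-off is the same:

```python
import random

# Classical analogue of the simplest error-correcting idea: encode one logical
# bit redundantly into three physical bits and recover it by majority vote.
# Real quantum codes are far subtler (they must correct errors without looking
# at the data directly), but the trade-off is the same: more (qu)bits per
# logical (qu)bit buys lower logical error rates.

def encode(bit):                      # 1 logical bit -> 3 physical bits
    return [bit, bit, bit]

def noisy_channel(bits, p=0.1):       # each physical bit flips with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):                     # majority vote
    return int(sum(bits) >= 2)

trials = 100_000
failures = sum(decode(noisy_channel(encode(0))) != 0 for _ in range(trials))
print(failures / trials)              # ~2.8% logical error rate vs. 10% per physical bit
```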
⚛️ The Quantum Computing Tech Stack
Out of the progress that has been made in quantum mechanics and technology, three main pillars emerged from QIS that will be covered here shortly:
Quantum Computing: Exploiting superposition and entanglement for computation and simulation.
Quantum Sensing: Exploiting quantum mechanics to enable high-fidelity measurements.
Quantum Communications: Exploiting entanglement for communication.
🤓 Quantum Computing
As mentioned in the section about challenges in quantum computing, we are still a few steps away from realizing error-free (also called "fault-tolerant") quantum computing at scale. We are currently in the noisy intermediate-scale quantum (NISQ) era. Noisy, because qubits are still limited in the reliability of their computation results and prone to errors; as explained earlier, qubits are extremely sensitive to their environment, which can result in inaccuracies. Intermediate-scale, because current systems deploy on the order of 50-100 functioning qubits in a single computer, while we will need a lot more to reach "quantum advantage" - the milestone at which a quantum computer can solve problems that are beyond the reach of our most powerful classical computers.
But the principles of quantum computing are already useful today. Two such approaches are quantum-inspired computing and quantum annealing.
💡 Quantum-inspired Computing
Quantum-inspired approaches are, as the name suggests, algorithms and computational techniques inspired by the working principles of quantum systems. But instead of requiring qubits, they run on classical computers. They draw inspiration from, for example, superposition to solve specific problems more efficiently than traditional approaches, while running on error-free hardware, i.e. classical computers.
Quantum-inspired algorithms are particularly useful for optimization problems, where a vast space of possibilities needs to be searched to find the optimal solution (just like with superposition). These types of algorithms already provide a speedup today on classical high-performance computers. Quantum-inspired solutions are also a great way for companies new to quantum to explore the possibilities that quantum computing offers, while being easily accessible through current cloud offerings from AWS, Azure and others. But of course, the speedup they provide is nowhere near what a functioning, fault-tolerant, universal quantum computer may provide in the future. We'll come to universal quantum computing in a minute.
🧮 Quantum Annealing
Quantum annealing is a specialized quantum computing technique, closely related to adiabatic quantum computing, designed to tackle optimization problems. It operates on the principle that physical systems naturally seek to minimize their energy, a behavior mirrored in quantum systems, i.e. the qubits. Quantum annealing processors exploit this tendency to find the lowest-energy solutions to complex optimization challenges, such as the travelling salesman problem. Annealers do this by framing optimization problems as energy minimization tasks, where the goal is to identify the optimal or near-optimal combination of elements corresponding to the problem's lowest energy state. Near-optimal, because they do not always find the global (absolute) minimum but may settle in a local minimum: the solution may not be the very best one, but it is a good, near-optimal guess. Quantum annealers deploy a few thousand qubits to reach these results. But the qubits in annealers are nowhere near as capable as the qubits in a fault-tolerant, universal quantum computer - you have to consider qubit counts for annealers and universal quantum computers separately. Confusing, yes.
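To make the "optimization as energy minimization" framing concrete, here is a purely classical simulated-annealing sketch on a tiny Ising-style toy problem (the couplings are made up for illustration). A quantum annealer does something loosely analogous, but lets physics do the minimizing in hardware:

```python
import math
import random

# Classical simulated annealing on a tiny Ising-style toy problem: find the
# spin configuration (each spin is +1 or -1) with the lowest energy.
# The couplings J below are made up for illustration.

J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.8, (2, 3): -1.2, (0, 3): 0.3}

def energy(spins):
    """Ising energy: sum over coupled pairs of J_ij * s_i * s_j."""
    return sum(j * spins[a] * spins[b] for (a, b), j in J.items())

spins = [random.choice([-1, 1]) for _ in range(4)]
temperature = 2.0
while temperature > 0.01:
    i = random.randrange(4)
    before = energy(spins)
    spins[i] *= -1                                    # propose flipping one spin
    delta = energy(spins) - before
    if delta > 0 and random.random() > math.exp(-delta / temperature):
        spins[i] *= -1                                # reject most uphill moves
    temperature *= 0.99                               # slowly "cool" the system

print(spins, energy(spins))   # a low-energy (near-optimal) configuration
```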
💻 Universal Quantum Computing
Universal quantum computers are what is really desired and - if fault-tolerant - what will enable large-scale commercialization of quantum computing. The term universal simply means that the quantum computer is Turing-complete: a machine that can carry out any computable mathematical or logical operation. Just like today's classical computers (yes, they are universal computers too), just with a quantum advantage.
To perform operations on qubits, universal quantum computers use quantum logic gates. These gates manipulate the states of the qubits, enabling complex algorithms. The manipulation can, for example, be done with laser pulses (a technology from the first quantum wave!). The tricky thing is that the qubits must stay coherent while passing through those logic gates. As qubits are extremely sensitive, the coherence time is usually very short, limiting the number of logic gates that can be applied, and ultimately the computational complexity and thus usefulness. The accuracy of a qubit's operations - called fidelity - also limits computational usefulness: the accuracy of applying a logic gate depends on the gate fidelity, and if you apply several logic gates to implement an algorithm, the fidelities multiply with every gate, so errors compound. Think of a qubit with a gate fidelity of 90%. If you apply ten logic gates, the result would have an accuracy of only 0.9^10 ≈ 35%. Quite useless! Choosing a qubit technology is currently a trade-off between coherence time, fidelity of the qubit itself, and scalability of the architecture - but we will dive deeper into that in the next post!
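A quick back-of-the-envelope calculation makes the compounding effect tangible (the 99.9% figure below is just an illustrative assumption, not a claim about any specific hardware):

```python
# Back-of-the-envelope: overall circuit accuracy if every gate succeeds with a
# given fidelity and errors simply compound multiplicatively (a simplification).

def circuit_fidelity(gate_fidelity: float, n_gates: int) -> float:
    return gate_fidelity ** n_gates

print(circuit_fidelity(0.90, 10))     # ~0.35 -> a 10-gate circuit is nearly useless
print(circuit_fidelity(0.999, 10))    # ~0.99
print(circuit_fidelity(0.999, 1000))  # ~0.37 -> deep circuits need far better qubits
```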
Quantum computers hold the promise of delivering incredible value in various fields. One of the main use cases (and concerns) is their capability to crack current approaches to securing information via cryptography - we will come back to that in the chapter about quantum communication! They will also be incredibly useful for simulating complex quantum systems like materials and drugs. The finance industry will benefit tremendously from a quantum computer's ability to optimize strategies and simulate complex portfolios. Machine learning will also be hugely impacted, since machine learning is in essence an optimization problem. The opportunities seem nearly endless!
⚖️ Quantum Sensing
Quantum sensing harnesses the quantum mechanical properties of matter to perform measurements more sensitive than any classical sensor can achieve. Diamonds have proven to be good candidates for implementation, using nitrogen-vacancy centers (NVCs), which are the size of an atom. These centers are a type of defect within the diamond's crystal structure where a nitrogen atom replaces a carbon atom and an adjacent site in the lattice is left vacant. This defect creates a unique quantum system within the diamond.
These NVCs in diamonds exhibit unique optical and spin properties that enable them to absorb and emit light (fluorescence) and have their electron spin states manipulated and measured. An electron spin is a quantum mechanical property of an electron analogous to angular momentum from classical physics. When illuminated by laser light, NVCs can be excited to a higher energy state and emit photons upon returning to their ground state, a process influenced by their quantum state, allowing for optical readout. Additionally, their spin properties can be altered by applying a magnetic field, affecting their fluorescence and enabling the detection of magnetic field changes through variations in emitted light.
This combination of properties makes NVCs powerful tools for a vast spectrum of sensing applications. For example, quantum sensing enables "quantum clocks", a new type of atomic clock that is far more precise and could thus enable a quantum GPS system. Quantum temperature sensors allow highly sensitive monitoring of temperature fluctuations, which is vital in areas like materials science, where grasping thermal behavior at the nanoscale is critical to fully understanding the underlying processes. Quantum sensors are also interesting for imaging tasks, gravity sensing, and many other areas.
📡 Quantum Communication
The fundamental principles behind quantum communication are entanglement between qubits and the transfer of a qubit's state via quantum teleportation. The correlations behind this "spooky action at a distance", as Einstein coined it, appear instantaneously, though no usable information travels faster than light - teleportation still requires an accompanying classical signal. It has already been demonstrated in space: in 2017, researchers "teleported" the quantum state of a photon from the ground to a satellite. Thus, quantum communication holds the potential to offer new paradigms for transmitting information.
🔐 Quantum Key Distribution
Quantum Key Distribution (QKD) is one of the best-known applications of quantum communication. The idea behind QKD is to generate a shared, secret, and random key between two parties; this key can then be used to encrypt and decrypt messages. Specifically, QKD is the method of distributing the key between the parties. A fundamental problem with other (non-QKD) protocols for protecting such keys is that they rely on mathematical hardness: with enough computing power, you can crack the algorithm that protects the shared key used to encrypt and decrypt messages. In QKD, by contrast, the key is shared via the quantum states of qubits. If a third party wants to eavesdrop, it has to measure the quantum state of the qubits, inevitably altering that state and thus revealing that somebody unauthorized tried to "steal" the key. In short, unlike other security protocols QKD relies on physics instead of math, and thus offers the only "real" security. However, practical challenges like transmission distances, error rates, and integration with existing infrastructure currently hinder widespread deployment of QKD for security-relevant applications.
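For the curious, here is a toy classical simulation of BB84, the best-known QKD protocol, illustrating the detection idea: an intercept-and-resend eavesdropper unavoidably pushes the error rate in the shared key up to around 25%. The code fakes qubits with plain variables; a real system sends photons.

```python
import random

# Toy simulation of BB84-style key distribution with an intercept-and-resend
# eavesdropper. Qubits are faked with plain variables; a real system sends
# photons, but the statistics of the detection argument are the same.

N = 2000
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # two possible encoding bases
bob_bases   = [random.choice("+x") for _ in range(N)]
EVE_PRESENT = True

bob_bits = []
for bit, sent_basis, bob_basis in zip(alice_bits, alice_bases, bob_bases):
    if EVE_PRESENT:
        eve_basis = random.choice("+x")
        if eve_basis != sent_basis:      # wrong basis: Eve reads (and resends) a random bit
            bit = random.randint(0, 1)
        sent_basis = eve_basis           # the resent "photon" now carries Eve's basis
    if bob_basis == sent_basis:
        bob_bits.append(bit)             # matching bases: Bob reads the bit correctly
    else:
        bob_bits.append(random.randint(0, 1))

# Sifting: keep positions where Alice's and Bob's bases matched, then publicly
# compare a sample of the sifted key to estimate the error rate.
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
error_rate = sum(a != b for a, b in sifted) / len(sifted)
print(f"Error rate in sifted key: {error_rate:.1%}")   # ~0% without Eve, ~25% with Eve
```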
🔒 Post-Quantum Cryptography
Post-quantum cryptography is a different approach to increasing information security through cryptography. The motivation behind it is quantum computing itself: because quantum computers are especially suited to cracking our current (math-based) algorithms, which mostly rely on problems like prime number factorization, post-quantum cryptographic algorithms are designed to be resilient to attacks by quantum computers (and classical computers alike), making them practically safe against current and foreseeable levels of computing power. A simpler alternative is to just add more bits to a cryptographic key, thereby increasing the number to be factorized and lengthening the time needed to crack it, hopefully outrunning progress in quantum computing. The problem with that alternative is that sensitive data secured with such a key can stay relevant for a long time - think of data containing detailed plans for military weapons that matter for several decades. Attackers might simply store the encrypted data today and wait until computing power catches up and the key can be cracked, even if that takes years ("harvest now, decrypt later").
🏁 Concluding Remarks
For those of you interested in going deeper down the rabbit hole of quantum technology, check out the book Understanding Quantum Technologies by Olivier Ezratty. It's an extensive, yearly-updated reference of more than 1,000 pages, covering everything from the history of quantum mechanics and the different computing technologies to some hilarious examples of fake quantum claims.
And of course here a list of some cool companies to check out:
planqc - neutral atom quantum computer
eleQtron - trapped ion quantum computer
semiQon - silicon quantum computer
QuantumDiamonds - quantum sensors
Qnami - quantum sensors
CryptoNext Security - post-quantum cryptography
QphoX - quantum modem
And that brings us to the end of the 3rd episode of Deep Tech Demystified. I hope you enjoyed this first glance into the exciting topic of quantum computing! If you don’t want to miss the next content drop about universal quantum computing and qubit architectures, click below:
If you are a founder or a scientist building in the field of quantum computing, please reach out to me via mail (felix@playfair.vc) or LinkedIn, I would love to have a chat! Cheers and until next time!