What is Quantum Computing?
An introduction to how quantum systems work, and how they may reshape the world.
Antikythera
The Antikythera mechanism is the oldest known analog computer in human history. Estimated to be roughly 2,000 years old, it was discovered in a shipwreck in 1901. Its purpose was to simulate the movement of celestial bodies with extreme precision. It was so advanced for its time that machines of similar complexity did not reappear until roughly the 14th century, some 1,400 years later.
The mechanism is thought to have contained around 37 bronze gears, of which about 30 survive. By turning a single hand-crank, the gears would rotate at different ratios to model the positions of the Sun, the Moon, and the five planets known in that era. It could track lunar phases and predict solar and lunar eclipses, as well as planetary alignments.
The Antikythera mechanism had the ability to condense the vast and complicated movements of our solar system into a single, simply operated device.
Simulation
The Antikythera mechanism is proof that humans have always been obsessed with simulation, from drawing maps of the world, to building models of the solar system, to creating digital twins of entire cities. We have a constant desire to understand how things work by recreating them. Nearly every major technological breakthrough of the last century, from computers and video games to AI and VR, has been another step toward more advanced, accurate simulations of reality.
Quantum computing represents the next leap in this progression. Unlike classical computers, which struggle with deeply complex systems, quantum computers are designed to handle certain kinds of enormous complexity natively. Quantum simulation could revolutionize entire industries by letting us model systems (molecules, markets, weather) that are far too intricate for today's machines.
Classical computing
As humanity continued its pursuit of more accurate simulations, our tools evolved from mechanical devices like the Antikythera mechanism into machines that relied on electricity rather than gears. Classical computers were the next major step in our attempt to model, predict, and manipulate complex systems. While ancient mechanisms simulated the universe using gears and ratios, classical computers simulate logic, numbers, and information through electronic switches. This leap allowed us to move from physical models of the universe to digital models of virtually anything, setting the stage for the computational world we rely on today.
However, we're now reaching the limits of how small transistors can become. Today's transistors are only a few nanometers wide, just a handful of atoms thick. At this scale, quantum effects such as electron tunneling make chips significantly more error prone. Physically, we can't shrink transistors much further, which caps how much more powerful classical computers can get using the same approach.
Classical computing is extremely effective for structured problems, but it struggles with complexity as the systems being modeled scale up.
Bits (classical)
Classical computers operate using binary logic, so every piece of information is represented as either a 0 or a 1. These values are controlled by transistors, which act like tiny on/off switches that regulate the flow of electricity. Over the last several decades, tech companies have continually reduced the size of these transistors, allowing more of them to fit on a chip. More transistors mean more computational power.
Because they rely on strict binary values, classical computers excel at structured, logical, and arithmetic tasks such as word processing, graphics rendering, spreadsheets, web browsing, machine learning, and more. However, this binary structure also makes them inefficient for problems involving massive complexity, such as simulating chemistry, predicting global weather systems, optimizing gigantic networks, or solving certain mathematical puzzles. For these, the required computation grows exponentially with problem size, far beyond what classical machines can keep up with.
Quantum fundamentals
Quantum computers take a completely different approach. Instead of bits, they use quantum bits, or qubits, which operate according to the rules of quantum mechanics. Unlike a classical bit, which must be either a 0 or 1, a qubit can be 0, 1, or a combination of both at the same time. This ability dramatically expands the kinds of calculations a quantum computer can perform.
There are three distinctive behaviors people talk about most when explaining qubits: superposition, entanglement, and the constant fight against decoherence (losing quantum behavior to the environment). Together, they shape what quantum programs can do—and how hard they are to run in the real world.
Qubits
Qubits are not simply “either 0 or 1”: until they are measured, they can sit in weighted combinations of both. A common (if loose) picture is that a qubit is “0 and 1 at the same time,” in the sense that probability weights, not definite classical values, describe its state until a measurement forces a definite outcome.
This is part of what enables quantum algorithms to explore many possibilities in parallel for suitable problems, compared with classical search that often must try routes one after another.
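To make this concrete, here is a minimal sketch of a qubit's state in plain Python with NumPy. It is not tied to any real quantum hardware or vendor library; it just shows the bookkeeping: two complex amplitudes whose squared magnitudes are the probabilities of reading 0 or 1.

    import numpy as np

    # A qubit's state is two complex amplitudes: one for 0, one for 1.
    ket0 = np.array([1, 0], dtype=complex)   # definitely 0
    ket1 = np.array([0, 1], dtype=complex)   # definitely 1

    # A weighted combination: ~70% chance of 0, ~30% chance of 1.
    psi = np.sqrt(0.7) * ket0 + np.sqrt(0.3) * ket1

    probs = np.abs(psi) ** 2
    print(probs)        # [0.7 0.3]
    print(probs.sum())  # 1.0 -- amplitudes always stay normalized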
Measurement
In quantum computing, measurement is the step where a qubit's uncertain state is turned into classical information you can read out: a definite 0 or 1 for each measured qubit.
Before measurement, superposition describes possibilities and weights; measurement samples from those possibilities according to quantum rules. That is why algorithms must be designed carefully: some information is lost or randomized when you measure, and the timing of measurement is part of the program itself.
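Continuing the NumPy sketch from above, “measuring” amounts to sampling a definite 0 or 1 with probabilities given by the squared amplitudes (the Born rule). A single run tells you little; only repeated runs reveal the statistics.

    import numpy as np

    # The same 70/30 qubit as before.
    psi = np.array([np.sqrt(0.7), np.sqrt(0.3)], dtype=complex)
    probs = np.abs(psi) ** 2

    # Each measurement yields one definite bit; the superposition is gone
    # afterwards, so statistics require re-preparing and re-measuring.
    outcomes = np.random.choice([0, 1], size=1000, p=probs)
    print(np.bincount(outcomes) / 1000)   # roughly [0.7, 0.3]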
Superposition
Superposition is the idea that a qubit can exist in multiple states at once. While a bit is either 0 or 1, a qubit can behave as if it is carrying 0 and 1 at the same time, each with its own weight. One way to picture this is a coin: it lands heads or tails, but while it spins in the air it is in a state that could become either, and you don't yet know which. Formally, a qubit's state is described by weighted amplitudes over 0 and 1, so it can represent many possibilities simultaneously. This ability to hold and process multiple states at once is what gives quantum computers the potential to explore several candidate solutions together and, for certain problems, reach answers much faster than classical machines.
A popular illustration is the maze. A classical bit-based program essentially tries one route at a time, backing out when it hits a dead end. A quantum computer, loosely speaking, can hold all paths in superposition at once. The caveat is that simply measuring would return one random path; real quantum algorithms are engineered so that wrong paths interfere destructively and the right one stands out.
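In the same NumPy sketch, the standard way to put a qubit into an equal superposition is the Hadamard gate. Applying it twice also shows the part the coin analogy misses: the amplitudes interfere, returning the qubit to exactly 0.

    import numpy as np

    # The Hadamard gate: rotates 0 into an equal superposition.
    H = np.array([[1,  1],
                  [1, -1]]) / np.sqrt(2)

    ket0 = np.array([1.0, 0.0])
    psi = H @ ket0
    print(np.abs(psi) ** 2)      # [0.5 0.5] -- the "coin in the air"

    # A second Hadamard makes the two paths interfere: the qubit
    # returns to 0 with certainty, not 50/50 as a real coin would.
    print(np.abs(H @ psi) ** 2)  # [1. 0.]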
Entanglement
Entanglement is when two or more qubits become linked so that their states can no longer be described independently. Measuring one entangled qubit instantly tells you something about the other, no matter how far apart they are (though this cannot be used to send information faster than light). Entanglement provides two main benefits in a quantum computer: it lets qubits coordinate on a computation, and it underpins error correction.
By linking qubits, a quantum computer can work on a problem as one large quantum system rather than as isolated parts, so solutions can be reached more quickly. The general process for many quantum algorithms looks like this:
• Superposition lets the qubits represent many possibilities at once.
• Entanglement correlates those possibilities across qubits, so the machine manipulates one large joint state instead of many independent ones, letting interference amplify the patterns that matter.
• Measurement then collapses the state, ideally leaving an answer that is easy to read out and check.
Because qubit states are built on fragile probability amplitudes, errors creep in constantly, especially while qubits are held in superposition. Quantum error correction tackles this by entangling each logical qubit with several physical helper qubits: measuring the helpers reveals whether an error has occurred, without destroying the data itself, so the error can be undone. Entanglement is therefore central to making quantum computers reliable.
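As a rough illustration, the smallest entangled state (a Bell pair) can be simulated with plain linear algebra. This is a toy sketch, not a vendor API: it builds the pair from a Hadamard and a CNOT gate, then samples measurements to show the signature correlation, where each qubit alone looks random but the two always agree.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],     # flips the second qubit
                     [0, 1, 0, 0],     # whenever the first is 1
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1.0, 0, 0, 0])   # both qubits start at 0
    state = np.kron(H, I) @ state      # superpose the first qubit
    state = CNOT @ state               # entangle it with the second
    print(state)  # [0.707 0 0 0.707] -- only "00" and "11" survive

    probs = np.abs(state) ** 2
    print(np.random.choice(["00", "01", "10", "11"], size=10, p=probs))
    # every draw is "00" or "11": individually random, jointly correlated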
Decoherence
One of the biggest challenges preventing quantum computers from becoming mainstream is decoherence. Decoherence is when a qubit loses its quantum properties and collapses into a plain 0 or 1. Qubits are extremely sensitive to nearly everything, and decohere quite easily.
Today's quantum computers are large installations, often the size of a small room or bigger depending on the design. The quantum chip itself is only a few centimetres across; most of the bulk is the cryogenic refrigerator around it. In superconducting machines, qubits must be shielded from light and held near absolute zero at all times to remain stable. They can also decohere simply by being “observed,” and observation does not just mean being seen by a human: any interaction with thermal noise (heat) or stray photons (light) counts. This is why quantum computers must be kept under such carefully controlled conditions.
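A toy model makes decoherence less abstract. The sketch below (with made-up numbers for the noise rate and duration) simulates dephasing: a qubit is put in superposition, the environment randomly kicks its phase, and the interference that should return it to 0 steadily washes out toward a 50/50 coin flip.

    import numpy as np

    p, steps, trials = 0.02, 60, 20000   # illustrative, not real hardware
    rng = np.random.default_rng(0)

    # At each time step the environment flips the qubit's phase with
    # probability p. An even number of flips leaves the phase intact.
    flips = rng.random((trials, steps)) < p
    intact = flips.sum(axis=1) % 2 == 0

    # With its phase intact, the qubit interferes back to 0 perfectly
    # (as in the double-Hadamard example); a flipped phase sends it to 1.
    print(intact.mean())                   # simulated survival: ~0.54
    print((1 + (1 - 2 * p) ** steps) / 2)  # theory agrees: coherence is
                                           # nearly gone after 60 steps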
Quantum systems
Quantum supremacy is the point at which a quantum computer performs a task, in a reasonable amount of time, that is practically impossible for any classical computer. While this milestone may sound extremely significant, it says less than it seems, because there are no rules about what the task has to be. The ‘task’ that demonstrates supremacy can be completely pointless in practice, and often it is designed solely to prove the point.
The first computer claimed to achieve quantum supremacy was Google’s Sycamore in 2019, though the claim is debated, precisely because the task it completed faster was artificial and chosen to prove the point. Google reported performing a particular “random circuit sampling” task in ~200 seconds and estimated the world’s fastest classical supercomputer would need ~10,000 years. The problem Sycamore solved was not a practical one like encryption-breaking, optimization, or simulating chemistry; it was a specially designed benchmark, chosen because it is known to be extremely difficult for classical machines. Soon after Google claimed the milestone, researchers found ways to complete the exact same task on classical computers in days or weeks, rather than millennia.
Since then, attention has shifted to a different benchmark, quantum advantage: a quantum computer completing a genuinely useful task faster or better than any classical computer practically could. In 2025, both D-Wave and Google claimed to have achieved it, though such claims tend to be contested as well.
Types of hardware
There are at least six types of quantum computers that companies are building, each with its own pros and cons. The three biggest are superconducting, trapped-ion, and photonic.
The most common type today is the superconducting quantum computer, used by Google, IBM, and SpinQ. These machines use small electrical circuits made from materials that become superconducting near absolute zero, letting current flow without resistance and helping shield the qubits from noise. The advantage is that this technology works right now and operates very fast; the downside is that the qubits still decohere easily, so the machines need huge cryogenic refrigerators to stay functional.
Another major approach is trapped-ion quantum computers, used by companies like IonQ and Quantinuum. Instead of circuits, they trap individual charged atoms (ions) in electromagnetic fields and control them with laser beams. Each atom acts as a near-perfect qubit because nature already made it stable and identical to every other. These systems are extremely accurate, but they tend to be slower and are hard to scale into larger computers.
A third approach is the photonic quantum computer, which uses photons (light particles) as qubits. These systems guide light through tiny optical chips, making them strong candidates for a future quantum internet. Photons travel easily over long distances and the machines can operate at room temperature, which makes them more practical. However, getting photons to interact, and therefore to entangle, is difficult, which complicates one of the most important ingredients of quantum computing.
Beyond these top three, there are other designs such as topological qubits, neutral-atom machines, and quantum annealers. These are less widespread, but research continues into their viability.
Encryption
One of the biggest concerns around quantum computing and its impact on the world is encryption. Because quantum computers can tackle certain mathematical problems that classical machines cannot, a sufficiently large one would be able to break much of the encryption protecting our data today.
Much of today's public-key encryption is based on RSA, a system built on multiplying two large prime numbers. The two primes stay private, while their product is published as part of the public key. Decrypting the data requires knowledge derived from those two primes, and recovering them from the public product would take a classical computer millions of years. A large, error-corrected quantum computer running Shor's algorithm could factor the same number in a practical amount of time.
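A toy version of RSA shows what is at stake. The sketch below uses tiny textbook primes (real keys use primes hundreds of digits long); everything rests on the fact that recovering p and q from n is classically infeasible at scale, which is exactly the step Shor's algorithm would make easy.

    # Toy RSA key generation, encryption, and decryption (Python 3.8+).
    p, q = 61, 53             # the private primes (kept secret)
    n = p * q                 # public modulus: 3233
    phi = (p - 1) * (q - 1)   # 3120, computable only if you know p and q

    e = 17                    # public exponent
    d = pow(e, -1, phi)       # private exponent: 2753

    message = 65
    ciphertext = pow(message, e, n)    # m**e mod n -> 2790
    recovered = pow(ciphertext, d, n)  # c**d mod n -> 65

    # An attacker who can factor n = 3233 back into 61 * 53 can rebuild d.
    # For a 2048-bit n, that factoring takes classical machines astronomical
    # time; Shor's algorithm on a large quantum computer would not.
    print(ciphertext, recovered)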
This is why many people are working on post-quantum encryption methods. In 2016, the U.S. National Institute of Standards and Technology (NIST) launched a competition to develop them, and after several years of evaluating submissions, it published its first finalized standards in 2024.
NIST has focused on creating and standardizing a new generation of post-quantum cryptography (PQC): mathematical encryption algorithms that run on today's classical computers but are designed to remain secure even against future quantum attacks. Its main choice for secure communication is ML-KEM (derived from CRYSTALS-Kyber), a strong and practical encryption method. For proving identity and signing documents, NIST approved three options: ML-DSA, based on Dilithium (the primary choice); FALCON (a smaller, lighter alternative); and SLH-DSA, based on SPHINCS+ (a conservative backup). It later selected HQC as an extra backup for encryption. While quantum cryptography itself remains a niche technology, NIST's official guidance emphasizes adopting these post-quantum algorithms now, to secure communications today and protect them from the quantum computers of tomorrow.
Quantum Key Distribution (QKD) is a different kind of quantum security: it uses individual photons to transmit encryption keys. It does not require a full quantum computer, only photon sources, detectors, and optical links, and it could run alongside our current internet infrastructure. The interesting part is that if a hacker tries to intercept the photons, measuring them disturbs their state, instantly revealing the breach. In principle, this makes eavesdropping detectable by physics itself, not just by mathematics.
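The logic of QKD can be sketched without any optics. The toy simulation below follows the classic BB84 protocol (with made-up parameters): Alice encodes random bits in random bases, Bob measures in random bases, and they keep only positions where the bases matched. An eavesdropper guessing bases wrong disturbs the photons, showing up as a roughly 25% error rate in the shared key.

    import random

    N = 1000
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.choice("+x") for _ in range(N)]
    bob_bases   = [random.choice("+x") for _ in range(N)]
    eavesdrop = True   # toggle to compare

    bob_bits = []
    for bit, basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            eve_basis = random.choice("+x")
            if eve_basis != basis:
                bit = random.randint(0, 1)  # wrong basis randomizes the bit
            basis = eve_basis               # photon is re-sent in Eve's basis
        # Bob's result is faithful if his basis matches, random otherwise.
        bob_bits.append(bit if b_basis == basis else random.randint(0, 1))

    # Sift: keep positions where Alice's and Bob's bases agreed.
    kept = [(a, b) for a, b, ab, bb in
            zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in kept) / len(kept)
    print(f"error rate in sifted key: {errors:.1%}")  # ~25% with Eve, ~0% without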
While the encryption problem should be solved once post-quantum cryptography is fully implemented, the in-between period carries real risks. Adoption has been extremely slow: most companies have not yet made the switch, and many show little urgency about it. Often this is because organizations see quantum computing as a distant problem they can deal with later. Unfortunately, this attitude is putting our personal information at risk.
Hackers are already using a tactic called “Harvest Now, Decrypt Later” (HNDL): they steal RSA-encrypted data and store it long term, planning to decrypt it once quantum technology becomes available to them. Many organizations may have been breached this way without knowing it, because these attackers make no ransom demands and never announce themselves; they quietly stockpile the encrypted data for later. This is why it's crucial that institutions adopt post-quantum cryptography immediately, rather than putting it off.
Overall, quantum computing is reshaping the way encryption and data protection work, and only time will tell how protected our information really is.
Climate & energy
One of the biggest barriers in green energy is the inconsistency of power availability. Renewable energy sources such as wind, solar, and water power are heavily dependent on unpredictable natural conditions. If a storm blocks sunlight or wind slows down, the grid can become unstable and energy supply becomes inconsistent.
Quantum computing could significantly improve how we forecast weather and energy usage. Classical computers can produce predictive models, but they struggle when too many variables and real-world uncertainties are involved, and this variability is increasing with climate change, complicating everything further. Quantum computers could, in principle, model many interacting environmental variables at once, such as cloud cover, wind shifts, ocean currents, and temperature fluctuations, producing significantly more accurate weather models.
With better forecasting, energy companies could optimize how and where power is stored and distributed. This could lead to smart energy grids, which are systems that are automatically aware of upcoming weather patterns and prepare for energy loss or surges ahead of time. This would minimize waste, reduce outages, and make renewable energy a truly reliable and sustainable source of energy.
Additionally, a technique known as quantum annealing could help optimize grid management by reducing energy loss in transmission. Better energy efficiency and prediction capabilities will help accelerate the transition away from fossil fuels and create more sustainable energy infrastructure across the planet.
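To give a feel for what “annealing” means here, below is a classical stand-in. Quantum annealers are fed problems as QUBO/Ising models: find the assignment of binary variables that minimizes a quadratic cost. This sketch runs classical simulated annealing on a made-up example (splitting generator outputs into two balanced groups); a machine like D-Wave's would attack the same cost function with quantum hardware instead.

    import math, random

    outputs = [8, 7, 6, 5, 4]   # hypothetical generator outputs (MW)

    def cost(spins):
        # Ising-style cost: a perfectly balanced split scores 0.
        return sum(s * w for s, w in zip(spins, outputs)) ** 2

    spins = [random.choice([-1, 1]) for _ in outputs]
    temperature = 10.0
    for _ in range(5000):
        i = random.randrange(len(spins))
        candidate = spins[:]
        candidate[i] *= -1                    # propose flipping one spin
        delta = cost(candidate) - cost(spins)
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            spins = candidate                 # accept improvements, plus
        temperature *= 0.999                  # occasional uphill moves

    print(spins, cost(spins))  # e.g. {8, 7} vs {6, 5, 4}: cost 0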
Accurate weather prediction is not only important for renewable energy; it can also save countless lives. With more advanced prediction models, quantum-assisted forecasts could detect natural disasters significantly earlier, giving affected areas more time to protect their populations: faster evacuation and shelter orders, earlier staging of resources, and less emergency aid needed later thanks to better preparation.
Another sector that could be dramatically transformed by quantum computing is material design. Today, developing new materials, whether for batteries, clothing, airplanes, or medical devices, involves a lot of guesswork, trial-and-error, and extremely energy-intensive simulations. Classical computers struggle to accurately model the behavior of molecules, so companies often rely on physical prototyping, which is slow, expensive, and produces a significant amount of waste and emissions.
Quantum computing could change this, because it can simulate molecules and chemical reactions at the atomic level with far greater accuracy. Instead of approximations and best guesses, quantum algorithms could help scientists “see” exactly how atoms behave, bond, and interact. This opens the door to designing entirely new materials with properties we've never been able to engineer before, such as stronger alloys for construction, higher-capacity batteries, biodegradable plastics, and fabrics that self-repair. These advances extend to the manufacturing process too: with better simulations, companies could optimize how materials are produced, reducing energy usage, cutting waste, and lowering carbon footprints across supply chains.
Quantum computing also plays a major role in accelerating research in climate-related technologies. By modeling complicated chemical processes, quantum tools could unlock more efficient methods for carbon capture, storage, and conversion. In short, quantum computing could help reinvent the materials we use, the way we manufacture them, and the technologies we rely on to repair the planet, all of which pushes us toward a cleaner, more sustainable future at a pace classical computing alone can’t achieve.
Computational chemistry
Chemistry is fundamentally a quantum process: atoms behave according to quantum physics. Classical computers struggle to simulate molecules with high precision because the amount of information needed to describe a molecule's quantum state grows exponentially as the molecule gets larger. That's why developing new medicines, batteries, and materials is still incredibly slow and expensive.
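The exponential wall is easy to demonstrate. The sketch below builds a toy Hamiltonian (a standard transverse-field Ising model, standing in for a real molecule) out of Kronecker products and prints how quickly the matrix outgrows memory; by around 50 particles, no computer on Earth could store it.

    import numpy as np

    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]])    # Pauli-X
    Z = np.array([[1, 0], [0, -1]])   # Pauli-Z

    def op_on(k, op, n):
        # Embed a one-qubit operator at position k in an n-qubit system.
        out = np.array([[1.0]])
        for i in range(n):
            out = np.kron(out, op if i == k else I)
        return out

    for n in range(2, 11):
        # H = -sum_i Z_i Z_{i+1} - sum_i X_i
        ham = sum(-op_on(i, Z, n) @ op_on(i + 1, Z, n) for i in range(n - 1))
        ham = ham + sum(-op_on(i, X, n) for i in range(n))
        print(f"{n:2d} particles -> {ham.shape[0]}x{ham.shape[0]} matrix, "
              f"{ham.nbytes / 1e6:.1f} MB")
    # Each added particle doubles the dimension: 2**n amplitudes.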
Quantum computers can simulate the behavior of atoms and molecules directly, letting us see how particles interact at the quantum level. This could lead to many breakthroughs: first, new manufacturing materials that are stronger, lighter, and more sustainable, as mentioned in the material-design section; second, new batteries that store more energy, have less environmental impact, and enable longer-lasting EVs; and finally, cleaner chemical processes in manufacturing, with reduced waste and energy consumption.
For example, battery chemistries are currently trial-and-error based, with long testing cycles. Quantum simulations could rapidly predict which combinations will store the most energy safely, accelerating innovation in renewable tech and electronics.
In addition, modeling molecular reactions efficiently could also help us understand illness development and create antidotes or treatments faster than classical computation allows. Computational chemistry is considered one of the earliest industrial fields that quantum computing may fully transform.
Healthcare
Today, classical computers can analyze genes and detect which ones may contribute to disease. However, they struggle to determine why those genetic issues occur, as they can only identify patterns in data, not the quantum-level molecular interactions that cause the issues in the first place.
Quantum computers could simulate biological systems down to the electrons, helping us uncover the true causes of disease. This could enable faster drug discovery, by predicting which molecules will effectively target an illness, and more personalized medicine, with treatments tailored to a patient's genetic makeup. Patients would face less trial and error with drugs and could reach effective treatment faster.
Advanced simulations could also improve immunotherapies and cancer treatment, as well as enable faster diagnosis and prevention. With a better understanding of a disease itself, we can catch it, and act on it, at much earlier stages.
A disease like Alzheimer's is a perfect example. Scientists have identified mutations linked to it, but not the molecular mechanics of how the disease actually originates. Quantum simulations could finally reveal the missing steps, unlocking more effective treatments.
Ultimately, healthcare becomes proactive rather than reactive, improving quality of life and reducing pharmaceutical development timelines from decades to possibly years.
Finance
Modern finance is built on two pillars: encryption that keeps data secure, and prediction that helps institutions understand markets and risk. Quantum computing will reshape both. As mentioned earlier, quantum is already forcing a rethink of cybersecurity and encryption. On the prediction side, quantum processors could evaluate an enormous number of variables simultaneously, allowing far more powerful risk modeling, fraud detection, and anomaly spotting than current systems allow. Banks could run simulations that capture entire market ecosystems in real time, improving everything from credit scoring to stress testing. Quantum algorithms are also well suited to optimization problems, meaning investment portfolios, trading strategies, and financial logistics could be fine-tuned beyond what even the largest classical supercomputers manage today. This points toward smarter financial automation and more adaptive economic systems.
The other side of quantum's impact is the security threat. As mentioned earlier, quantum poses a major threat to all of our encrypted data, and while solutions are being developed for traditional systems, cryptocurrency faces an even bigger one.
In 2020, one analysis estimated that approximately 25% of all Bitcoin (worth around $40B at the time) sat in wallets directly vulnerable to quantum attack because their public keys were exposed. Similarly, about 65% of Ether was estimated vulnerable in 2021, a share that has grown as protocol changes reveal more public-key material. The deeper issue with cryptocurrency is its decentralized nature: protection is not only up to an institution. For Bitcoin, one of the main recommendations is to move coins to a new P2PKH address, an action entirely up to each individual owner, whose inaction poses a risk for all Bitcoin holders. If many owners don't make the change, a mass theft becomes possible; trust in the currency would fall and its value would drop dramatically, causing losses for everyone. Because cryptocurrencies rely entirely on public-key cryptography for ownership and transactions, the industry must migrate to post-quantum cryptography to remain secure.
Across climate modeling, medicine, chemistry, finance, and cybersecurity, quantum computing promises capabilities that are effectively out of reach for classical systems: simulations of complex molecules, more accurate weather and climate prediction, faster drug discovery, and new materials with unprecedented properties. These technologies are still in their early stages, but once quantum computers become stable, error-corrected, and widely accessible, they will reshape the world in profound ways.
Simulation & future
Quantum computing's promise doesn't stop at chemistry, climate modeling, or drug discovery; it also intersects with the concept of simulation theory. Simulation theory proposes that our universe might itself be a simulation, and that many parallel universes could exist, each running with slightly different conditions. The intuition behind it is simple: humans love simulations. If we ever develop the technology to simulate an entire universe or conscious beings, it's unlikely we'd stop at just one. Throughout history, whenever humans have gained a new tool, we've scaled it and replicated it. A future civilization with advanced quantum computing might create not one but millions of simulated worlds, each exploring different versions of history, physics, or human behavior.
Quantum computing is the exact kind of technology that could make this possible. Classical computers can’t model reality at the level needed for a full-scale universe simulation, but quantum systems naturally operate using the same principles that govern the physical world. They’re fundamentally better suited to recreating the rules of the universe because they operate by similar rules. With a large enough, error-corrected quantum computer, we could theoretically simulate entire ecosystems, societies, or even consciousness with high accuracy.
Even if you don’t believe we’re currently living in a simulation, it’s realistic to assume that we will eventually create simulated worlds of our own. Once quantum computers become stable, scalable, and powerful enough, humanity will likely use them to build the simulated environments we’ve only imagined in philosophy, science fiction, and theoretical physics. Whether for scientific research, entertainment, exploration, or curiosity, quantum-enabled simulations could allow us to create and explore worlds that don’t exist physically, but feel just as real.