## The Key to Unlocking the Future

The largest supercomputers that solve the world’s most difficult problems are power hungry. And that appetite is growing at an alarming rate.

In June 2005, the number 1 TOP500 computer (BlueGene/L) had a power rating of just **716 kW**.

In November 2023, the number 1 TOP500 computer (Frontier) had a power rating of **22,703 kW**.

**That is 22.7 megawatts.**

And with the current trend in the growth of AI, this is only going to increase.

If the field of supercomputing is to continue to grow in terms of capability, surely something must be done to stem the huge demand for power.

Is there another form of computing on the horizon that could be as powerful as today’s supercomputers, but not as power-hungry?

**What about Quantum?**

Before delving into the subject of quantum computing, a brief overview of some concepts from quantum mechanics is necessary, as without it, quantum computing could not exist.

Quantum mechanics deals with physics at the sub-atomic level. It is the study of very small objects and how they exhibit the characteristics of both particles and waves simultaneously (where a wave is a variation or disturbance that aids energy transfer). This behaviour is known as “wave-particle duality”.

Since the 1980s scientists have discussed the potential application of quantum mechanics in the field of computation.

The exchange of ideas and ongoing discussions between the two scientific disciplines led to the field of quantum computing as we know it today.

In some ways, quantum computing could be loosely compared with the advent of electric cars: the theory is well understood and several models are available, but on the majority of models the battery range is still not comparable to that of their fossil-fuel counterparts.

So much so that some manufacturers are now moving away from pure electric models, back to hybrid ones. Add to this the scarcity of charging stations, which has given rise to the phrase “range anxiety” among electric car drivers.

The same could be said for quantum computing. Like battery technology, quantum computers are not yet advanced enough to be commonplace in the majority of businesses and datacentres, even though they have been talked about for many years.

However, in the last two or three years, momentum seems to be building, with more and more suppliers, and more and more variants of quantum devices, appearing on the horizon. Where will this period of acceleration lead?

Quantum computing utilises features of quantum mechanics that are essential for a quantum computer to function. These are:

**Superposition**: in quantum mechanics, particles can exist in multiple states *simultaneously*. Applied to quantum computing, this is represented with qubits (quantum bits), which can be 0, 1, or both 0 and 1 at the same time.

This differs from conventional computing bits, which are either 0 or 1. It is for this reason that quantum computers can process huge amounts of data in parallel, which exponentially increases their computational power.
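The 0, 1, or “0 and 1 at the same time” idea can be made concrete with a tiny classical simulation. This is a minimal sketch in plain Python, purely illustrative (the names `ket0`, `plus` and `measure_probabilities` are mine, not standard API): a qubit is modelled as two complex amplitudes, and measurement probabilities come from the squared magnitudes.

```python
import math

# A single qubit as two complex amplitudes (alpha, beta):
# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
# This is a classical simulation for illustration, not real quantum hardware.

ket0 = (1 + 0j, 0 + 0j)      # behaves like a classical bit 0
ket1 = (0 + 0j, 1 + 0j)      # behaves like a classical bit 1

# An equal superposition: the qubit is "0 and 1 at the same time".
s = 1 / math.sqrt(2)
plus = (s + 0j, s + 0j)

def measure_probabilities(state):
    """Probability of reading 0 or 1 when the qubit is measured."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

print(measure_probabilities(ket0))   # (1.0, 0.0)
print(measure_probabilities(plus))   # roughly (0.5, 0.5)
```

The superposition only “pays off” until measurement: reading the qubit collapses it to a definite 0 or 1 with the probabilities shown.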

**Entanglement**: two or more particles are related in terms of state, meaning the state of one particle becomes dependent on the state of another.

This is true even when the particles are separated by large distances. This feature means that parallel processing lies at the heart of quantum computing.
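The correlation at the heart of entanglement can also be illustrated classically. The sketch below (plain Python; the names `bell` and `outcome_probabilities` are mine) models a two-qubit register in the maximally entangled Bell state: only the correlated outcomes 00 and 11 are ever observed, so measuring one qubit fixes the other.

```python
import math

# A two-qubit register as four amplitudes, one per basis state.
# The Bell state (|00> + |11>)/sqrt(2) is maximally entangled.
s = 1 / math.sqrt(2)
bell = {"00": s, "01": 0.0, "10": 0.0, "11": s}

def outcome_probabilities(state):
    """Probability of each two-bit measurement outcome."""
    return {bits: abs(amp) ** 2 for bits, amp in state.items()}

probs = outcome_probabilities(bell)
# Only 00 and 11 ever appear; 01 and 10 have zero probability,
# which is the correlation entanglement provides.
print(probs)
```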

**Wave-particle duality**: objects exhibit both wave-like and particle-like behaviours.

This theory plays an important role in the creation of quantum algorithms that utilise the wave-like behaviour of qubits to perform complex calculations.

It is the creation of these algorithms that led to quantum logic gates and the subsequent design of a physical quantum computer; all the research carried out previously was purely theoretical.

### How does Quantum compare to conventional computing?

Having touched on the fundamentals of a quantum computer, how does that compare against what we already know about traditional computers?

We know that a traditional computer operates in binary, using binary arithmetic on a series of binary digits, or “bits”. These “bits”, as the term “binary” suggests, can have only two states. Compared to a quantum computer, this now sounds rather restrictive. However, the power of a traditional computer is based on *quantity*.

When the first IBM personal computer was released in 1981 it used a 16-bit microprocessor. This meant that, without any fancy memory-mapping techniques, it could only address 64 kilobytes (where a byte = 8 bits) of RAM, which limited the power and usefulness of the machine.

Today’s desktop computers use 64 bits. Given that each additional bit doubles the possible address space, a 64-bit computer can address a whopping **18.4 exabytes** of memory (18,400,000,000 gigabytes) – far more than would be practically possible to fit in a desktop or deskside platform. Today’s desktop systems are typically fitted with 16 – 64 GB of RAM, plenty for personal use.
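The doubling-per-bit arithmetic is easy to check. A few lines of Python (the helper name `addressable_bytes` is mine, purely illustrative) reproduce both figures:

```python
# Address space doubles with every extra address bit:
# an n-bit address can reach 2**n bytes (assuming byte-addressable memory).
def addressable_bytes(bits):
    return 2 ** bits

kb = 1024
print(addressable_bytes(16) // kb)   # 64 (kilobytes) - the 1981 IBM PC figure
print(addressable_bytes(64))         # 18446744073709551616 bytes
print(addressable_bytes(64) / 10**18)  # ~18.4 exabytes (decimal)
```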

Modern processors are also equipped with multiple compute cores sharing common memory, so even home computers can carry out more than one task at any point in time.

This building block of modern computing is also utilised in some of the world’s largest supercomputers, but multiplied many times to produce a multi-node parallel machine capable of solving some of the world’s most difficult problems.

**Frontier**, currently the world’s largest supercomputer, uses the AMD EPYC processor, which is based on the same 64-bit architecture you would find in a modern desktop PC.

This is where the **quantity** statement comes in: Frontier uses 9,472 of these processors, one in each computer system, or node.

All of these compute nodes are connected together with a high-performance network to make up a powerful computer capable of running parallelised workflows.

This huge supercomputer comes at a cost in terms of power: it requires **22 megawatts** to function, equivalent to roughly **seven thousand 3 kW heaters**.

If we reconsider the qubit, the quantum version of the conventional bit: it can exist in more states than a conventional bit, and with entanglement and wave-particle duality as well, computers built from multiple qubits could form the basis of a powerful quantum supercomputer.

Although building a quantum computer that could rival a supercomputer like **Frontier** has not been attempted yet (that we know of), you could suggest that the world has been waiting for a computer with the power of **Frontier** that does not require the same level of electrical power and infrastructure to operate.

## Why Quantum?

Quantum has the potential to revolutionise computing due to the fundamental difference in construction. A qubit has a far greater scope and power than the traditional “bit” at the core of binary computers.

We have already discussed that today’s supercomputers, which perform the largest and most complex tasks, require vast numbers of compute cores, GPUs, memory and storage, along with all the infrastructure of a large datacentre, consuming large amounts of power not just to run the compute hardware but also to keep it at the correct temperature and humidity.

The most complex tasks, such as mathematical modelling and measuring the effects of climate change, now require almost as much power as a small town – and the irony of the latter example is that it contributes to the very problem it is modelling.

Whilst it is unclear how much power a quantum computer of comparable capability will require, the core of the machine, the qubit, requires very little power to operate.

Early forms of quantum hardware required strict laboratory conditions to be able to produce any meaningful and consistent results.

As experience has been gained working with this new technology, improvements are being made to relax these environmental requirements, opening up the possibility of operating in standard datacentre conditions.

The most common types of qubits still require an ultra-cold environment to enable predictable behaviour.

As quantum computing becomes commercialised, suppliers will engineer their products to be built and operated in an economically viable manner.

Qubits are now produced at scale on chips known as “QPUs” (quantum processing units), enabling compact packaging designed for the cooling environment they require.

The superconducting qubit variant still requires a cold environment, approaching absolute zero. The temperature must be held constant; any slight change will affect the qubit state and produce errors.

## Who is using quantum computers?

At the end of 2023 the U.S. Department of Defense released a statement to the effect that, while it is aware of developments in the field of quantum computing, after review it is taking a watching stance rather than making any moves towards procurement.

However, DARPA, part of the DoD, is partnering with Rigetti to advance quantum algorithms for solving complex optimisation problems, so U.S. agencies differ in their appetite for quantum.

Shanghai University is experimenting with quantum approaches to computational fluid dynamics, solving a finite element analysis (FEA) problem, where researchers observed greater accuracy with quantum methods than with conventional computers.

ExxonMobil has partnered with IBM’s quantum division to look at not only chemical and scientific problems but also logistical ones, such as the best delivery strategy for its LPG product.

A number of financial institutions are trialling quantum machines including JP Morgan, HSBC, BNP Paribas and Credit Agricole. Common uses across most of them include risk analytics, optimisation and cryptography.

## What problems can a quantum computer solve?

The important thing to consider is that right now, you cannot just replace your conventional computer with a quantum computer. It would be like replacing a Formula 1 car with a jet plane. They both travel fast, but in a different way.

Quantum computers are expected to perform well at specific tasks, the most general being computations involving very large numbers that would be impractical or impossible on a conventional computer.

At the very beginning of the quantum journey, the idea of a quantum computer arose because a conventional computer could not efficiently simulate the physics of quantum mechanics.

Monte Carlo simulation software is commonly used for complex systems with high uncertainty and a large number of variables.

This includes finance, engineering risk analysis and, of course, F1 lap simulation. These workloads would be well suited to a quantum machine.
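To make the Monte Carlo pattern concrete, here is a minimal classical sketch (the function name `estimate_pi` is mine, purely illustrative): estimate π by sampling random points in the unit square and counting how many fall inside the quarter circle. Real workloads in finance or risk analysis follow the same shape, sampling uncertain inputs many times and aggregating the results, which is why they parallelise so well.

```python
import random

# Monte Carlo in miniature: repeated random sampling, then aggregation.
def estimate_pi(samples, seed=0):
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # Quarter-circle area / square area = pi/4.
    return 4 * inside / samples

print(estimate_pi(100_000))   # close to 3.14159
```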

Logistics problems, such as the optimisation of delivery routes and the well-known “travelling salesman problem”, would also be suitable workloads for a quantum machine.

Weather prediction relies on the computation of thousands of meteorological data points from all over the world in order to produce an accurate forecast. Quantum would be ideally suited to a task such as this.

## Could quantum unlock all the world’s cryptographic keys?

In an earlier section, I mentioned the term algorithm. When referenced in the context of quantum computing, an algorithm is a set of instructions that when executed on a quantum computer can solve a particular problem.

One of the most widely known quantum algorithms is Shor’s algorithm, developed in 1994 by Peter Shor for finding the prime factors of an integer.

It is one of the few algorithms expected to perform far better on a quantum computer than on a classical one.

It is this algorithm that has the potential to break, quite easily, some of the security keys used for the secure transmission of data across the internet.
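The quantum speed-up in Shor’s algorithm lies in one step, period finding; the classical reduction wrapped around it fits in a few lines. The sketch below is illustrative only (function names are mine), with brute-force period finding standing in for the part a quantum computer would accelerate: given the period r of a^x mod N, factors of N fall out of gcd(a^(r/2) ± 1, N).

```python
from math import gcd

def find_period(a, n):
    """Smallest r with a**r == 1 (mod n). Brute force here;
    this is the step a quantum computer does exponentially faster."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_sketch(n, a):
    """The classical reduction from factoring n to period finding."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess already shares a factor
    r = find_period(a, n)
    if r % 2:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial root: retry with another a
    p = gcd(y - 1, n)
    return p, n // p

print(shor_classical_sketch(15, 7))   # (3, 5)
```

Everything above runs happily on a classical machine for tiny n; it is only `find_period` that becomes intractable for the thousand-bit integers used in real keys, which is exactly the gap Shor’s quantum period finding closes.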

For a long time this threat has been theoretical, as working quantum computers did not exist to a degree where they could be used by hostile actors for the purposes of breaking the keys to steal information.

Over the next few years that threat will increase as quantum computers become more prevalent. Steps are already being taken to mitigate this threat and the age of “Quantum-Safe Cryptography” has already begun.

This has taken the form of a series of initiatives to ensure that secure data is as safe as it can be once quantum becomes the norm.

Public key cryptography will be most at risk, so the advice from organisations such as the NCSC is to replace these keys wherever possible with symmetric cryptography such as AES, using the largest key sizes practical.

This is another example where quantum is becoming more of a reality. If you are concerned about the threat to cryptography, take a look at the NCSC website where you will find more information. Look for “Preparing for Quantum-Safe Cryptography”.

## What does “Quantum Supremacy” mean?

Google have long been experimenting with and developing quantum computers. In 2019 they announced they had achieved “Quantum Supremacy”.

What this actually meant was they ran a problem on their 53-qubit Sycamore quantum computer which took 200 seconds to solve.

They state that on a conventional computer, the same problem would take approximately 10,000 years to complete.

This was the first time a quantum computer had outperformed a conventional computer on a clearly-defined task.

One must assume that Google estimated the time for the conventional computer to complete the task, unless they have already invented time travel!

## Quantum and classic computing duality

“Quantum advantage” and “quantum supremacy” are phrases now commonplace in online publications.

They describe a problem or workload that has been proven to run better, faster, or at all (where conventional computers cannot) in a quantum environment rather than a conventional computing one. This is where quantum computing becomes a reality, rather than just a theory.

The most likely way quantum will find its way into mainstream commercial computing will be via the accelerator or co-processor route, similar to GPU and FPGA products.

There are some vendors already providing such products, for example, quantumbrilliance.com.

At the moment, the most common quantum machines require special environments to make the qubit work properly and reliably.

Quantum computing is much younger on the evolutionary scale than the conventional, or “classic”, computer, so it is hoped that, once more powerful and reliable hardware is developed, quantum can replace the larger conventional supercomputers in terms of power and speed.

Once you look, it is surprising how many organisations are already using quantum computers to solve real problems.

HPC used to be (and still is) a niche technology within the IT field.

Quantum is still a step behind in terms of bringing the latest developments in quantum to a general audience.

The Quantum Insider (thequantuminsider.com) has published a list of vendors supplying quantum equipment, with over 70 entries.

The online HPC news sites are publishing more and more articles on quantum, where a while ago there were hardly any. This is an indication of how real quantum is starting to become.

Fifty years after producing the world’s first desktop PC, could IBM produce the world’s first quantum desktop computer, or at least a conventional computer with a quantum accelerator? They still have time…

*Red Oak Consulting have been monitoring the development of quantum for some time. Should you wish to discuss any aspect of the developing landscape of quantum computing, don’t hesitate to get in touch.*


**Paul Ingram**

Senior Principal Consultant, **Red Oak Consulting**