Quantum advantage: near-term breakthrough or long-term challenge?
Quantum computing providers are engaged in a race to be the first to demonstrate quantum advantage (also referred to as quantum supremacy). Many claims of quantum advantage have been made in recent years, most of them disputed. This article looks at the landscape in 2025, why this milestone matters, and what the future might hold.
What is quantum advantage and why is it important?
Quantum advantage can be defined in a number of different ways. Initially, it was defined as the use of a quantum computer to solve a problem that cannot be solved by a classical computer (at least not in a reasonable amount of time). This milestone was claimed by Google in 2019, though that claim has since been disputed. However, because artificial problems can be designed on which current noisy intermediate-scale quantum (NISQ) computers can demonstrate quantum advantage, much of the focus in this sector has shifted towards quantum advantage on what are deemed to be useful, real-world problems. As such, achieving quantum advantage can be seen as a sign that quantum computers are approaching an era in which they have utility in solving real-world problems, providing commercial value.
D-Wave Systems, which develops quantum computers that implement quantum annealing, claimed in 2025 to have achieved quantum advantage on such a real-world problem. The problem involves simulating the quantum dynamics of magnetic materials in accordance with the Schrödinger equation, and D-Wave report that their Advantage2 quantum computer performed this simulation in minutes, where a classical computer would take nearly one million years. While some in the field are sceptical of D-Wave’s claim, this is nonetheless an exciting development for the industry as a whole.
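To give a sense of why such simulations defeat classical machines, consider that an exact classical simulation of n interacting quantum spins must track 2^n complex amplitudes, so memory and run-time grow exponentially with system size. The short Python sketch below is our own illustration using numpy and scipy, with a small transverse-field Ising chain standing in for a magnetic material; it does not reproduce D-Wave’s actual problem or method, but it makes the scaling concrete.

```python
# A minimal sketch of why exactly simulating quantum magnetic materials is
# classically hard: a state-vector simulation of an n-spin transverse-field
# Ising model needs 2**n complex amplitudes, so cost grows exponentially.
# (Illustrative only; not D-Wave's problem or method.)
import numpy as np
from scipy.linalg import expm

def ising_hamiltonian(n, j=1.0, h=0.5):
    """Dense Hamiltonian H = -J * sum Z_i Z_{i+1} - h * sum X_i for an n-spin chain."""
    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    def op_at(site_op, site):
        # Tensor product placing site_op at `site` and identities elsewhere.
        out = np.array([[1.0 + 0j]])
        for k in range(n):
            out = np.kron(out, site_op if k == site else I)
        return out

    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H -= j * op_at(Z, i) @ op_at(Z, i + 1)
    for i in range(n):
        H -= h * op_at(X, i)
    return H

n = 8                              # 2**8 = 256 amplitudes; ~50 spins is already infeasible
H = ising_hamiltonian(n)
psi = np.zeros(2**n, dtype=complex)
psi[0] = 1.0                       # all spins up
psi_t = expm(-1j * H * 0.1) @ psi  # evolve under the Schrödinger equation for t = 0.1
print(f"state vector holds {psi_t.size} amplitudes for just {n} spins")
```

Doubling the number of spins squares the size of the state vector, which is precisely why classical supercomputers fall behind on such simulations at useful system sizes.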
Why is quantum advantage so challenging?
Quantum computers are, despite the vast amount of research and development in this area, in their relative infancy. Various architectures exist, such as superconducting qubits, trapped-ion qubits, neutral-atom qubits, and photonic qubits. Each of these technologies has its own challenges in achieving quantum advantage, and there is currently no clear indication of which of them, if any, will emerge as superior.
A primary problem for many quantum architectures (though notably not for photonic qubits) is qubit stability: keeping the physical qubits of the system in the correct quantum state for long enough to perform a calculation. Qubits are inherently unstable and prone to errors such as bit-flips and phase-flips, where their state drifts from the ideal. Such errors can be reduced through techniques such as improved cooling, ultra-high vacuum, and electromagnetic shielding, amongst others, but these measures are not perfect solutions.
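These two basic errors have a simple mathematical form: a bit-flip corresponds to the Pauli X operator and a phase-flip to the Pauli Z operator acting on the qubit’s state vector. The following minimal Python sketch (our own illustration) shows each error acting on a simple state.

```python
# A minimal sketch of the two basic qubit errors: a bit-flip is the Pauli X
# operator and a phase-flip is the Pauli Z operator acting on the state vector.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip

zero = np.array([1, 0], dtype=complex)               # |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+> = (|0> + |1>)/sqrt(2)

print(X @ zero)  # bit-flip turns |0> into |1>
print(Z @ plus)  # phase-flip turns |+> into |-> = (|0> - |1>)/sqrt(2)
```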
As the number of qubits in a quantum computer increases, reducing errors becomes increasingly challenging, so scalability is also a challenge for current quantum computers. Many real-world problems require quantum computers with far more qubits than is currently possible. Moreover, due to the error-prone nature of quantum computers, most systems utilise a many-to-one approach in which multiple physical qubits correspond to a single “logical” qubit, which is used for the calculation being performed, as illustrated in the sketch below. However, this only further increases the number of physical qubits required, exacerbating the stability challenges.
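The sketch below illustrates the many-to-one principle with the simplest possible example, a three-bit repetition code, using classical bits as a stand-in for qubits. (Real quantum error-correcting codes, such as the surface code, must also handle phase errors and can require hundreds or thousands of physical qubits per logical qubit.)

```python
# A minimal sketch of the many-to-one idea: one logical bit is stored
# redundantly in three physical bits, so a single bit-flip can be outvoted.
# (Classical bits stand in for qubits here, purely for illustration.)
from collections import Counter

def encode(logical_bit):
    """One logical bit -> three physical bits."""
    return [logical_bit] * 3

def decode(physical_bits):
    """Majority vote recovers the logical bit despite one flipped physical bit."""
    return Counter(physical_bits).most_common(1)[0][0]

codeword = encode(1)     # [1, 1, 1]
codeword[0] ^= 1         # a bit-flip error on the first physical bit
print(decode(codeword))  # still decodes to 1
```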
In addition to qubit stability, qubit control presents a further technical challenge. Initialising qubits, performing gate operations, and measuring qubits in a synchronised manner all become more difficult as the number of qubits grows. Moreover, the application of quantum gates to qubits is an imperfect process, so increasing gate fidelity (the correspondence between the “ideal” gate and the gate actually implemented by the control hardware of the quantum computer) is a critical task. A significant milestone was reached in 2024 when Quantinuum announced a two-qubit gate fidelity of greater than 99.9%. However, many in the industry believe that yet greater gate fidelity is a necessity for large-scale quantum computers.
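Gate fidelity can be quantified by comparing the matrix of the ideal gate with that of the gate actually applied. The Python sketch below is our own illustration using the standard average-gate-fidelity formula for a unitary error; the over-rotation angle is an arbitrary assumption for demonstration, not measured data from any real device.

```python
# A minimal sketch of gate fidelity: comparing an "ideal" two-qubit gate with
# an imperfect implementation, using the standard average-gate-fidelity
# formula for a unitary error,
#   F = (|Tr(U_ideal^dag @ U_actual)|^2 / d + 1) / (d + 1),  d = 4 for two qubits.
# The over-rotation angle below is an illustrative assumption, not real data.
import numpy as np

def average_gate_fidelity(u_ideal, u_actual):
    d = u_ideal.shape[0]
    overlap = np.abs(np.trace(u_ideal.conj().T @ u_actual)) ** 2
    return (overlap / d + 1) / (d + 1)

cz_ideal = np.diag([1, 1, 1, -1]).astype(complex)  # ideal CZ gate
theta = np.pi + 0.05                               # slight over-rotation of the phase
cz_actual = np.diag([1, 1, 1, np.exp(1j * theta)])

print(f"fidelity: {average_gate_fidelity(cz_ideal, cz_actual):.5f}")
```

Even this deliberately mis-calibrated gate scores around 99.96%, which illustrates just how demanding a target of “greater than 99.9%”, let alone the higher fidelities sought for large-scale machines, really is.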
It is evident that the problem of errors in quantum computers will persist for many years to come. As such, for quantum computers to provide any meaningful results, the identification and correction of errors is of paramount importance. The large number of qubits required to solve real-world problems, and the speed at which these machines must operate, mean that quantum computers will output vast volumes of data which will need to be rapidly analysed for errors. Companies such as Riverlane therefore focus on developing specialised systems to address this critical challenge; Riverlane’s DeltaFlow 2 system allows for error correction on up to 250 physical qubits. While this is an exciting development, as quantum computers continue to improve, the demands on such error correction systems will only increase.
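Some rough arithmetic illustrates the data-rate problem. Each error-correction cycle produces a measurement bit from a subset of the physical qubits, and on some hardware cycles can run around once per microsecond. The figures in the sketch below are our own illustrative assumptions, not Riverlane specifications.

```python
# Illustrative back-of-the-envelope arithmetic (all figures are assumptions,
# not Riverlane specifications): each error-correction cycle produces one
# syndrome bit per measured qubit, and cycles can run at roughly MHz rates.
physical_qubits = 250          # order of the DeltaFlow 2 figure quoted above
measured_fraction = 0.5        # assume roughly half the qubits are measured each cycle
cycle_rate_hz = 1e6            # ~1 microsecond per error-correction cycle (assumed)

bits_per_second = physical_qubits * measured_fraction * cycle_rate_hz
print(f"~{bits_per_second / 1e6:.0f} Mbit/s of syndrome data to decode in real time")
```

Even at this modest scale the decoder must keep pace with over a hundred megabits of syndrome data per second, and the figure grows with both qubit count and cycle rate.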
Patentability in quantum computing
To its benefit, the quantum computing sector is extremely collaborative, with companies operating in different layers of the so-called “quantum stack” sharing their expertise on a regular basis. However, such sharing should only take place once a patent application has been filed for the invention, to ensure that companies maintain ownership of their intellectual property. The same rules of patentability apply to patent applications for quantum computing inventions as to those in all other fields. In particular, the invention must be new and inventive over the state of the art. As such, companies seeking to patent their inventions in this space must be careful not to disclose or implement their inventions outside of confidential environments. Any collaboration should be conducted under strict confidentiality, with non-disclosure agreements in place for any non-public information, particularly if a patent application for the invention has not yet been filed.
Software for quantum computing-related inventions is also subject to country-specific patentability requirements for computer programs. In Europe, this means that any software invention must provide a technical effect, for example an improvement in a technical field. If the software results in an improved quantum computer, for instance, then a technical effect is achieved and patent protection can be obtained. Given the relative infancy of the quantum computing sector, a large number of software inventions fall under this category and are therefore potentially protectable by patents. Conversely, if the computer program is simply an improved quantum computing algorithm serving a non-technical purpose, then patent protection is, in general, not obtainable.
Conclusion
Despite D-Wave’s claim to have achieved one instance of quantum advantage, widespread and large-scale quantum advantage remains an ongoing aspiration. Improvements in qubit stability, scalability, qubit control, and error detection and correction are each crucial to achieving quantum advantage and will likely remain so for at least the next decade. Collaboration is likely to reduce this time, with the possibility of patent protection facilitating such collaboration; however, companies must still be mindful of the requirements for obtaining patent protection when collaborating. In any case, with investment in quantum computing increasing year-on-year and the technology developing at lightning pace, it remains an exciting time for the sector as a whole.
