A Quantum of Progress

Apologies to the geeks, nerds, physicists and mathematicians: some concepts in this article are massively simplified for a non-technical readership.

 

What’s a digital computer processor? Crudely, it’s an electronic adding machine which works in binary (base 2). Each binary digit (bit) has one of two values (on or off, one or zero). There is no magic: everything in a digital computer processor, such as the one on your desk or in your smartphone, is ultimately achieved by using binary arithmetic to calculate values. The largest single value that can be manipulated in an 8-bit (one byte) processor, such as that used by the Apple II released in 1977, is 11111111, which is equivalent to 255 in denary (base 10). If you want to process a bigger value you chop the problem into pieces, or factor it, and work through it iteratively, using memory to hold your partial products. Modern desktop computers typically use 64-bit processors, so the largest value they can process in one go is 1111111111111111111111111111111111111111111111111111111111111111, which is 18,446,744,073,709,551,615 in denary (quite big); hence most modern programmers rarely need to use fancy techniques to chop up big numbers.
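
For readers who want to check the arithmetic, here is a minimal Python sketch (added for illustration, not part of the original article) that computes the largest value an n-bit register can hold:

```python
# The largest unsigned value an n-bit register can hold is 2**n - 1.
for bits in (8, 64):
    print(f"{bits}-bit maximum: {2 ** bits - 1}")

# 8-bit maximum: 255
# 64-bit maximum: 18446744073709551615
```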


Because digital computers work in binary, arithmetic is very simple. One plus one equals zero, carry one to the next place: in binary, 1 + 1 = 10.
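
As a quick illustration (a Python snippet added here, not from the original article), the 0b prefix and the bin() function let you check this directly:

```python
# 1 + 1 in binary carries into the next place, printing as 0b10.
print(bin(0b1 + 0b1))     # 0b10
print(bin(0b11 + 0b101))  # 0b1000  (three plus five is eight)
```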

 

What this boils down to is that if you want to multiply three (binary 11) by five (binary 101), the processor starts with zero and has to add three to it five times (3 + 3 + 3 + 3 + 3). The better programmer will of course optimise this as three additions of five (5 + 5 + 5). The clever programmer will realise that in binary you can double a number simply by shifting it left (adding a zero to the end), and so can combine repeated doubling with a few remaining additions to reduce the number of steps, as the sketch below shows. Modern microprocessors include special logic circuits to perform the steps needed for long multiplication or division, saving the programmer from all this tedium by doing it for them, but it still takes time.
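
Here is a minimal Python sketch of that shift-and-add approach (the function name and details are my own illustration, not taken from the article):

```python
def shift_and_add_multiply(a, b):
    """Multiply two non-negative integers using only shifts and additions."""
    result = 0
    while b:
        if b & 1:        # if the lowest bit of b is set...
            result += a  # ...add the current (shifted) copy of a
        a <<= 1          # shift left: doubles a by appending a binary zero
        b >>= 1          # move on to the next bit of b
    return result

print(shift_and_add_multiply(3, 5))  # 15
```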

 

The time needed to process many iterations of binary numbers is why modern encryption cannot realistically be cracked by digital computers. Cracking encryption means testing many unknown possibilities; breaking a 256-bit encryption key with modern digital computers could take longer than the universe has existed, because so many possible combinations must be evaluated (we call these “intractable problems” - they are too big for human technologies and methods to solve). It’s also why computers are practically too slow to solve many other complex multi-variable mathematical problems, such as the what-if calculations used in physics, genetics, large network optimisations (e.g. bus or rail transport scheduling), and pattern recognition such as image (photo) analysis: the biggest of these problems could only be solved on a digital computer by running huge numbers of iterations (to get a handle on “huge”, think of the number of atoms in the universe). In this context even the smaller multi-variable problems may require a few hundred billion steps.
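
To give a sense of the scale involved, a small Python check (added here for illustration) of how many keys a 256-bit key space contains:

```python
# A 256-bit key can take 2**256 different values -- a 78-digit number,
# roughly 1.2 x 10**77, not far off common estimates of the number of
# atoms in the observable universe. Trying them one by one is hopeless.
keyspace = 2 ** 256
print(keyspace)
print(len(str(keyspace)), "digits")
```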

 

One theoretical solution to the problem of needing so many iterations to perform a complex calculation or match a complex pattern is a “Quantum Computer”: a computer which can handle multiple values per “quantum bit” at the same time. The term quantum bit, or qubit, refers to the theory of Quantum Mechanics, which explains the multi-state behaviours of sub-atomic particles and waves; a qubit can be both one and zero (and potentially other intermediate values) simultaneously. Such a computer could speed up complex multi-variable computations enormously by effectively testing all possible combinations in one operation instead of iteratively. Physicists and computer scientists first theorised this concept at the beginning of the 1980s as a possible tool to make the next stride forward in understanding sub-atomic physics, but making such a processor is “difficult”.

 

Difficult maybe, but it seems it’s been done. A Canadian computer company, D-Wave, has been selling a “quantum computer” for a few years now. Not cheap, but both Google and the US National Security Agency are known to have bought one. The assessment of the leading academics in quantum computing and physics is that the D-Wave machine is not a “real” quantum computer, in that it is not a “universal quantum computer”: it cannot process all types of quantum computing algorithms, and does not operate according to the classical theory of quantum mechanics as applied to computing. Nevertheless, Google has reported that for some computations with up to 1,000 binary variables the 500-qubit D-Wave computer it has purchased is proving to be over one hundred million times faster than a modern digital microprocessor (only for some types of multi-variable problems; for simpler calculations it is no faster than a “normal” computer). So: progress on the quantum computing front, albeit not based on the classical theory of quantum computing, suitable only for a subset of problems, and, at around US$15,000,000, probably a bit expensive for most businesses.

 

In parallel with the emergence of this alternative form of quantum computing pioneered by D-Wave, academics have been making progress on how to build “real” universal quantum computer processors which do obey the rules of the classical theory and can be used to solve a wider range of problems. Several universities have produced experimental small-scale processors of up to twelve qubits based upon classical universal quantum computing theory, and these small processors are also blazingly fast at solving multi-variable problems - but small processors are only useful for smaller problems.

 

Making large quantum processors based on the classical theory has been problematic, but last month a UK researcher, Professor Winfried Hensinger at the University of Sussex, revealed that his team has developed a method of stitching small quantum processors together effectively. This is crucial because the qubits in a quantum computer must all work in conjunction; without an effective way of joining them, having two quantum processors is nothing like as powerful as having one twice the size. This opens up the possibility of large-scale (incredibly powerful) and affordable “real” quantum computers in the near future. IBM has a stable five-qubit universal quantum computer design which it has connected to the Internet to help programmers understand how to develop software for quantum computing, and last week announced that it will offer a fifty-qubit universal quantum computer as a product for sale in the next few years (expected to be at a similar price to the D-Wave machine).

 

So what? Personally I suspect the advent of real, large-scale quantum computing will prove to be the most important human advance in my lifetime. In terms of significance to human progress so far, it is definitely up there with atomic physics, digital computing, cracking the human genome and the like. Google’s findings, which show a one-hundred-million-fold increase in performance with the limited D-Wave machine, merely give us a foretaste of what a “real” quantum computer can achieve.

 

Quantum computers will be able to quickly provide solutions to intractable mathematical and pattern-matching problems which to date, even with the most powerful digital supercomputers, have been far beyond the ability of mankind to solve. Physics, chemistry, genetics, medicine, cryptography, artificial intelligence (and actuarial modelling) - pretty much any discipline which must evaluate many input variables in parallel to obtain a solution - are, within the next decade, set to take a huge leap forward. Quantum computing is possibly the biggest step towards understanding the universe and everything within it, including our own biology, that mankind has achieved in its entire history. Professor Winfried Hensinger’s development of a method to join up real quantum computer processors into something big enough to be useful, and IBM’s announcement that it has a timeline for bringing a commercial offering to market, will change our lives as dramatically as the development of the microprocessor (which dates back to 1971), and probably more quickly.

 

Don’t expect to see one on your desk for a while though. At the moment we can only make quantum computing work at very, very low temperatures. Much of the huge cost of a quantum computer is the large super-cooling mechanism needed to reduce the processor’s temperature to close to absolute zero (-273.15 degrees Celsius, the coldest temperature physically possible).
