Quantum Computing, Part II
Posted on Monday, April 18, 2016
In my previous article, I wrote about the exciting world of quantum computing and outlined some of the many benefits that it has the potential to bring to society. However, as I alluded to previously, there are still a number of technical challenges to overcome. Today I will give a brief overview of two of the most difficult ones.
The first difficulty is known by many names and covers many related technical challenges, but is usually called decoherence. A quantum computer operates by storing data in a quantum mechanical wavefunction, which is then manipulated to process the data. When the algorithm completes, measuring the resulting wavefunction collapses it to a final answer, which is given as the output of the calculation. The problem is that this collapse must not happen during the calculation, and yet almost anything that is done to the wavefunction will cause such a collapse. A stray air molecule can ruin the calculation, as can a photon of light leaking into the processor or a gamma ray from space colliding with it. Even two bits of data in the same computer can interact with each other and destroy the quantum effects. Quantum states are very delicate!
The only two ways of getting around this problem are either to isolate your quantum computer from the world around it, so that nothing from the environment can interact with it, or to simultaneously process so many copies of the data that even the loss of billions or trillions of copies to decoherence will not be a problem. At present both methods are being developed, and both have had some success.
The second obstacle to overcome is the probabilistic nature of the output. A classical computer is given two numbers and generates a third number; there is no uncertainty or dispute about the result. A quantum computer might be given a mixture of thousands of numbers and output another mixture of numbers. Measuring the output gives you just one of these numbers, selected at random from the possible outputs, and sometimes it is the wrong one.
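As a rough classical sketch of this randomness (the distribution and the outcome names here are invented purely for illustration, not taken from any real machine), you can model a single measurement as one weighted random draw from the computer's output distribution:

```python
import random

# Hypothetical output distribution of a single quantum-computer run:
# two valid answers share most of the probability, plus a small
# chance of measuring a wrong answer entirely.
OUTPUT_DISTRIBUTION = {
    "answer_a": 0.55,
    "answer_b": 0.42,
    "wrong": 0.03,
}

def measure_once(rng=random.random):
    """Collapse the output to a single value, chosen at random
    according to the distribution above."""
    roll, cumulative = rng(), 0.0
    for outcome, prob in OUTPUT_DISTRIBUTION.items():
        cumulative += prob
        if roll < cumulative:
            return outcome
    return outcome  # guard against floating-point round-off

if __name__ == "__main__":
    random.seed(2016)
    print([measure_once() for _ in range(5)])
```

Each call returns exactly one outcome, and roughly 3% of the time, under these made-up numbers, that outcome is simply wrong. That is the core of the problem: one run, one number, no certainty.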
The ideal solution would be to make numerous copies of the output and then measure each one separately to determine the answer. All of the right answers would eventually be produced, and the right answers would occur far more often than the wrong ones. It would seem to be the perfect solution, and it is also impossible. One of the fundamental properties of quantum mechanics is the no-cloning theorem, which for quantum computers means that you receive a single copy of the output and there is no possible way to make more copies. (You can run the data through multiple quantum computers to get multiple outputs, but there is no guarantee that these will be exact copies.)
And so once more there are two possible solutions to this problem. One is to find an algorithm that amplifies the right answers and suppresses the wrong answers, so that the measured output is much more likely to be correct. However, this still does not resolve the issue of needing to find all of the right answers.
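The best-known algorithm of this kind is amplitude amplification, the heart of Grover's search. The toy classical simulation below (the state size, marked index, and iteration count are illustrative choices, not anything from a real device) tracks a small state vector through repeated Grover steps and shows the correct answer's measurement probability climbing toward 1:

```python
import math

def grover_probabilities(n_states=8, marked=3, iterations=2):
    """Toy amplitude amplification on a classical state-vector simulation.

    Starts in a uniform superposition, then repeats the Grover step:
    flip the sign of the marked amplitude, then reflect every amplitude
    about the mean.  The marked state's measurement probability grows.
    """
    amps = [1.0 / math.sqrt(n_states)] * n_states
    history = [amps[marked] ** 2]
    for _ in range(iterations):
        amps[marked] = -amps[marked]           # oracle: mark the right answer
        mean = sum(amps) / n_states
        amps = [2.0 * mean - a for a in amps]  # reflection about the mean
        history.append(amps[marked] ** 2)
    return history

if __name__ == "__main__":
    for step, p in enumerate(grover_probabilities()):
        print(f"after {step} iteration(s): P(correct) = {p:.4f}")
```

With 8 states, the chance of measuring the marked answer starts at 1/8 and exceeds 94% after just two iterations. Note this only amplifies the one answer the oracle marks, which is exactly why it does not solve the many-right-answers problem on its own.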
The second, and possibly more feasible, option is to design a quantum computer that can simultaneously process many copies of the data. For example, some researchers have tried placing a pure liquid in a strong magnetic field, so that each molecule in the liquid acts as an independent quantum computer. By measuring the liquid, you might find several trillion molecules giving one correct answer, several trillion giving another answer, and perhaps a few hundred giving a wrong answer, which can safely be ignored. This method works well in experiments because it measures an enormous number of outputs at once, but the decoherence effects are worse because each molecule can interact with all of its neighbours.
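The bookkeeping for such an ensemble is simple: tally every molecule's answer and keep only the answers that clear a noise floor. A minimal sketch, with a scaled-down ensemble and a 1% noise threshold chosen purely for illustration:

```python
from collections import Counter

def tally_ensemble(readings, noise_floor=0.01):
    """Aggregate the outputs of many independent molecular 'computers',
    keeping every answer whose share of the ensemble clears the noise floor."""
    counts = Counter(readings)
    total = len(readings)
    return {answer: n for answer, n in counts.items()
            if n / total >= noise_floor}

if __name__ == "__main__":
    # Scaled-down ensemble: two genuine answers dominate, while a handful
    # of molecules decohered and reported garbage.
    readings = ["42"] * 4800 + ["17"] * 5100 + ["999"] * 3 + ["-1"] * 2
    print(tally_ensemble(readings))
```

The rare decohered readings fall below the threshold and drop out, leaving only the answers supported by a large fraction of the ensemble.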
So for the near future, quantum computing will be limited to laboratories where scientists work on refining these methods into a usable product. However, don't be surprised if in another decade or two you have a quantum computer on your desktop!