Business Today

Quantum Quandary

An exponential leap in processing power is around the corner, but will we use or abuse it?
Team BT   New Delhi     Print Edition: March 25, 2018

Traditional computers understand the instructions given to them in bits - binary digits, each either zero or one - and can perform remarkably complex tasks with them. Magical and transformative as this has been for many years, the next leap in computing is around the corner. Quantum computers work with qubits, which can represent zero and one at the same time, a state known as superposition. Qubits can also be linked through a phenomenon called entanglement, so that the machine's power grows dramatically with every qubit added.
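The growth in power described above can be made concrete with a small sketch. This is an illustrative simulation in Python, not anything from the research the article reports: a qubit is modelled as two complex amplitudes, and n qubits together require 2**n amplitudes, which is why each extra qubit multiplies the machine's capacity.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is described by two complex
# amplitudes; measuring it yields 0 or 1 with probabilities given by
# the squared magnitudes of those amplitudes.
plus = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1
probs = np.abs(plus) ** 2
print(probs)  # a 50/50 chance of reading 0 or 1

# n qubits together need 2**n amplitudes, which is why simulating
# (or matching) a quantum machine gets harder with every added qubit.
for n in (1, 2, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

At 50 qubits - the rough threshold the article later mentions for 'quantum supremacy' - the state already needs about 10^15 amplitudes, beyond what a classical machine can comfortably track.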

Qubits are, however, not easy to put together: they are fragile and vulnerable to external disturbances, which can throw a computer's calculations awry. Quantum computers need many qubits to work. If done right, they are powerful enough to perform in minutes massive calculations that would take regular computers months; if not, errors make the whole exercise futile.

To make qubits quickly and reliably, researchers at Delft University of Technology in the Netherlands have turned to silicon in something of a breakthrough, according to a report in the journal Nature. Intel and QuTech have begun designing quantum computers with two spin qubits on silicon. The algorithms demonstrated are still simple, but the possibilities with this type of processor are immense. In fact, there is now a veritable arms race between companies and even countries to build quantum computers, in the belief that they will help solve many of the world's biggest problems and even unlock mysteries of the universe.

Although truly usable quantum computing is still far from achieved, experts say its implications are more worrying than those of Artificial Intelligence. If millions of combinations of numbers can be searched through in minutes, it will become possible to crack open secure databases, forcing the world to rethink its current methods of keeping data private, including encryption. Think of bank accounts, which are already not secure enough. When a processor achieves 'quantum supremacy' (perhaps possible with 50 qubits), calculations in genetic sequencing, drug discovery, space exploration and many other areas are expected to leap forward. But data security as we know it today will also have to see a step change. Work on 'quantum safe' methods is under way, but that too is some distance off for now.

"The fundamental theory behind data encryption is that we can hide data by applying an encryption algorithm and thus transform it into an indecipherable format. Someone can only then decipher the data by using a unique private key in the sole possession of approved personnel," says Bob Reselman in TheServerSide. "Trying to unlock the data without the key can take decades of trial-and-error computation on a classical computer. The inherent deterrent is that nobody can or wants to spend years trying to crack the algorithm.

"But what if the period of trial and error went from decades to minutes? In this case, the allure of cracking the code becomes compelling. Increased levels of malicious intention become inevitable."
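The trial-and-error deterrent Reselman describes comes down to the size of the key space. A toy sketch makes the arithmetic visible - the two-letter "key" and the crack function here are hypothetical stand-ins for illustration, not any real cipher:

```python
import itertools
import string

# Hypothetical two-letter lowercase "key" for the demo. Real keys are
# 128-256 bits long, so the space is around 2**128 combinations - far
# beyond exhaustive search on a classical computer.
SECRET = "qc"

def crack(key_len):
    """Brute-force every lowercase combination until the key is found."""
    tries = 0
    for guess in itertools.product(string.ascii_lowercase, repeat=key_len):
        tries += 1
        if "".join(guess) == SECRET:
            return tries
    return None

print("found after", crack(2), "tries out of", 26 ** 2)

# Each extra character (or bit) multiplies the attacker's work. This is
# the economics the quote describes: the deterrent holds only while the
# search takes decades rather than minutes.
for bits in (40, 128, 256):
    print(bits, "bit key ->", 2 ** bits, "combinations")
```

The toy key falls in a few hundred tries; a 128-bit key presents about 3 x 10^38 combinations, which is why a machine that collapses search time would upend encryption as currently practised.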

 

Earthquake Ahead

Artificial Intelligence has done a good job of out-thinking humans and their traditional methods. Now it is time to see how it fares at one of the world's most challenging problems: telling when an earthquake is coming.

Predicting earthquakes - and actually doing something about them - may get a little easier because of an unfortunate fact: humans are causing some of them. Research shows that seismic activity in the central United States is rising exponentially because of human activity such as wastewater injection into the ground. ConvNetQuake, a system that uses algorithms to detect and locate the waveforms that signal heightened chances of earthquakes, is being tried out there, according to a report in Science Advances. It has detected 17 times more earthquakes than the Oklahoma Geological Survey's catalogue records. The researchers say this convolutional neural network method is readily scalable.
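The core idea behind a convolutional detector like the one described above can be sketched in a few lines. This is a simplified stand-in, not ConvNetQuake itself: where the real system learns its filters from data, this sketch slides a known, hand-made wave packet along a synthetic seismic trace and flags the window with the strongest response.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "seismogram": background noise with a small wave packet
# buried at sample 300, standing in for an earthquake arrival.
trace = rng.normal(0.0, 0.2, 1000)
packet = np.sin(np.linspace(0, 6 * np.pi, 60)) * np.hanning(60)
trace[300:360] += packet

# Convolutional detection in miniature: correlate the filter against
# every window of the trace and find where the response peaks.
# ConvNetQuake stacks many learned filters; here we use one fixed one.
response = np.correlate(trace, packet, mode="valid")
detection = int(np.argmax(np.abs(response)))
print("strongest response near sample", detection)
```

The appeal of the approach, and the reason the researchers call it scalable, is that this sliding-filter operation is cheap and uniform, so it can be run continuously over large volumes of seismic data.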

 
