Quantum Computing


Introduction:-
Quantum computing is a form of computing that takes advantage of quantum phenomena such as superposition and entanglement.
This technology trend is also being explored in efforts to track the spread of the coronavirus and to develop potential vaccines, thanks to its ability to easily query, monitor, analyze, and act on data regardless of the source. Another field where quantum computing is finding applications is banking and finance, where it can help manage credit risk and support high-frequency trading and fraud detection.
Quantum computers can be many times faster than regular computers at certain tasks, and major companies such as Splunk, Honeywell, Microsoft, AWS, and Google are now driving innovation in the field of quantum computing. Revenues for the global quantum computing market are projected to surpass $2.5 billion by 2029. To make a mark in this trending technology, you need experience with quantum mechanics, linear algebra, probability, information theory, and machine learning.
Quantum computing harnesses the phenomena of quantum mechanics to deliver a huge leap forward in computation for certain problems. At its core, quantum computing is the processing of information represented by special quantum states. By tapping into quantum phenomena like “superposition” and “entanglement,” these machines handle information in a fundamentally different way from “classical” computers like smartphones, laptops, or even today’s most powerful supercomputers.
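As a rough illustration (a hand-rolled sketch of the underlying math in Python/NumPy, not code for actual quantum hardware), a single qubit state can be written as a two-component complex vector, and the squared magnitudes of its entries give the probabilities of measuring “0” or “1”:

    import numpy as np

    # A qubit state |psi> = a|0> + b|1> is a length-2 complex vector
    # with |a|^2 + |b|^2 = 1. Here, an equal superposition of 0 and 1:
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

    # Measuring collapses the state; the squared magnitudes of the
    # amplitudes give the outcome probabilities.
    probs = np.abs(psi) ** 2
    print(probs)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1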

Why is Quantum Computing Important:-

  • Researchers have long predicted that quantum computers could tackle certain types of problems — especially those involving a daunting number of variables and potential outcomes, like simulations or optimization questions — much faster than any classical computer.
  • But now we’re starting to see hints of this potential becoming reality.
  • In 2019, Google said that it ran a calculation on a quantum computer in just a few minutes that would take a classical computer 10,000 years to complete.
  • A little over a year later, a team based in China took this a step further, claiming that it had performed a calculation in 200 seconds that would take an ordinary computer 2.5 billion years, making it 100 trillion times faster.

The Rise of Quantum Computing:-
Computing Beyond Moore’s Law:-

  • In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on a microchip had doubled every year since their invention while the costs were cut in half. This observation is known as Moore’s Law.
  • Moore’s Law is significant because it predicts that computers get smaller and faster over time. But now it’s slowing down — some say to a halt.
  • More than 50 years of chip innovation have allowed transistors to get smaller and smaller.
  • Apple’s latest computers, for example, run on chips with 5 nm transistors — about the size of just 16 oxygen molecules lined up side-by-side. But as transistors start to butt against physical limitations, Intel and other chipmakers have signaled that improvements in transistor-based computing might be approaching a wall.
  • Soon, we will have to find a different way of processing information if we want to continue to reap the benefits of rapid growth in computing capabilities.

What is a Qubit?

  • Qubits, the quantum version of bits used in classical computing, are the basic units of information in a quantum computer.
  • They make use of a quantum mechanical phenomenon called “superposition,” where some properties of a particle — such as the angle of polarization of a photon — are not defined for certain until they’re actually measured. In this scenario, each possible way these quantum properties could be observed has an associated probability.
  • This effect is a bit like flipping a coin. A coin is definitely heads or tails when it lands, but while in the air it has a chance of being either.
  • Quantum computers conduct calculations by manipulating qubits in a way that affects these superimposed probabilities before making a measurement to gain a final answer.
  • By avoiding measurements until an answer is needed, qubits can represent both parts of binary information, denoted by “0” and “1,” at the same time during the actual calculation.
  • In the coin flipping analogy, this is like influencing the coin’s downward path while it’s in the air — when it still has a chance of being either heads or tails.
  • A single qubit can’t do much, but quantum mechanics has another trick up its sleeve. Through a delicate process called “entanglement,” it’s possible to set qubits up so that their individual probabilities are affected by the other qubits in the system. A quantum computer with 2 entangled qubits is a bit like tossing 2 coins at the same time: while they’re in the air, every possible combination of heads and tails can be represented at once (see the NumPy sketch after this list, which illustrates both superposition and entanglement).
  • The more qubits that are entangled together, the more combinations of information that can be simultaneously represented. Tossing 2 coins offers 4 different combinations of heads and tails (HH, HT, TH, and TT) but tossing 3 coins allows for 8 distinct combinations (HHH, HHT, HTT, HTH, THT, THH, TTH, and TTT).
  • This is why quantum computers could eventually become much more capable than their classical counterparts — each additional qubit doubles a quantum computer’s power.
  • At least, that’s the theory. In practice, the properties of entangled qubits are so delicate that it’s difficult to keep them around long enough to be put to use. Quantum computer makers also contend with lots of engineering challenges — like correcting for high error rates and keeping computer systems incredibly cold — that can significantly cut into performance.
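To make the coin-toss analogy concrete, here is a short NumPy simulation (again a sketch of the math run on an ordinary computer, not a program for real quantum hardware): a Hadamard gate puts the first qubit into superposition, a CNOT gate entangles it with the second, and sampling measurements from the resulting Bell state shows the perfectly correlated outcomes that entanglement produces.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hadamard gate: puts a single qubit into an equal superposition.
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    # CNOT gate: flips the second qubit whenever the first is 1.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Two qubits start in |00>. An n-qubit state needs 2**n amplitudes,
    # which is the doubling-per-qubit described in the list above.
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0

    # Apply H to the first qubit, then CNOT to entangle the pair,
    # producing the Bell state (|00> + |11>) / sqrt(2).
    state = CNOT @ np.kron(H, np.eye(2)) @ state

    # Sample 1,000 measurements. Entanglement shows up as perfect
    # correlation: only '00' and '11' appear, each about half the time.
    probs = np.abs(state) ** 2
    outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
    print({k: int((outcomes == k).sum()) for k in ("00", "01", "10", "11")})

Real devices add noise and decoherence on top of this ideal picture, which is exactly the engineering challenge described in the last bullet above.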



Mr. Arshad Hussain, Assistant Professor, School of Computer Applications, Career Point University, Kota
