The development of the computer has been one of the most important achievements of the modern world. Computers have undeniably made life better in many different ways.

However, the computer as we know it today is not what it has always been. There have been tremendous developments over the years, chiefly on three fronts: **speed**, **memory** and **size**. Though a computer is, broadly, any system that collects, stores and processes data to produce information, how that is achieved has been reshaped repeatedly over the years.

The first computers were enormous relative to the memory they afforded: they filled large rooms, were limited in storage capacity and could process only so much data at a time. It was the age of the vacuum tube, which then served as the switching and memory unit of the computer, determining both speed and memory. The development of the transistor brought a revolution in computing: transistors were smaller, yet offered more memory and greater speed. Even more transformative was the integrated circuit, which can pack hundreds of millions of transistors onto a single chip. With the integrated circuit came the age of microprocessors, microcomputers and supercomputers. Computing now meets many needs with larger memory, far smaller size and very high processing speed. It is this development that has made possible the many advances in modern technology.

Even with these developments in computing, however, there have also been challenges. They relate to space, speed, energy usage and complex, intractable problems. Consider space first. Transistors are the switching and memory units of the computer, and they determine storage and processing capability. For a computing system to handle more complex problems, it needs more and more transistors, and those transistors must become smaller and smaller to balance space (too many large transistors would defeat the purpose of the microprocessor) against processing capacity (more transistors mean more capacity). The problem is that transistors are now approaching the scale of individual atoms. How much smaller can they get while still balancing space, memory, processing speed and capacity?

Secondly, our computers are organized around the binary system of 0s and 1s. Information storage depends on transistors, which work in an either/or fashion: each is either on (1) or off (0). The more information to be stored, the more transistors, and the more 0s and 1s. Many computations require long chains of sequential steps, moving from binary digits to logic gates, to registers, to electronic circuits and algorithms. Very complex problems can therefore take an enormous amount of time to process.
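To make this concrete, here is a minimal Python sketch (illustrative only; the function names are my own, not any library's) showing that a classical n-bit register holds exactly one of its 2^n possible values at a time, and that logic gates combine those bits one sequential step after another:

```python
# A classical n-bit register can represent 2**n distinct values,
# but it holds exactly ONE of them at any given moment.
n = 3
possible_values = 2 ** n   # 8 states: 000, 001, ..., 111
register = 0b101           # the register stores just one of those states

# Simple logic gates on single bits (0 or 1).
def AND(a, b):
    return a & b

def NOT(a):
    return 1 - a

# A half-adder built from gates: even adding two bits takes sequential steps.
def half_adder(a, b):
    total = a ^ b        # XOR produces the sum bit
    carry = AND(a, b)    # AND produces the carry bit
    return total, carry

print(possible_values)   # 8
print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Larger computations chain thousands of such gate operations in sequence, which is where the time cost of very complex problems comes from.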

The sequential nature of these processes also means that computing systems use a great deal of energy while carrying them out. This is why many computers get hot as they run: every irreversible single-bit operation dissipates at least a tiny minimum amount of energy (a floor known in physics as Landauer's limit), and billions of such operations add up.
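That minimum has a simple formula, kT ln 2 per bit erased, where k is the Boltzmann constant and T the temperature. A quick back-of-the-envelope calculation, assuming room temperature:

```python
import math

k = 1.380649e-23   # Boltzmann constant, joules per kelvin
T = 300            # assumed room temperature, kelvin

# Landauer limit: minimum energy dissipated to erase one bit of information
E_min = k * T * math.log(2)
print(E_min)   # roughly 2.9e-21 joules per bit operation
```

Real chips dissipate many orders of magnitude more than this theoretical floor, which is why heat remains such a practical constraint.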

Yet in spite of all this progress, many complex problems remain unsolvable by modern computers.

**Quantum Computing** is not just another step up, like moving from mainframe computers to supercomputers. It is computing based on a fundamentally different philosophy. The quantum world is the world of atoms (which ever-shrinking transistors are already approaching) and subatomic particles, where the laws of classical physics break down. Unlike the world of modern computers, it is not a world of either/or but a world of both/and. Whereas an ordinary bit stores either a 0 or a 1, a quantum bit can hold 0, 1, or any blend of the two, being in multiple states (storing multiple values) at the same time, much as light can behave as both particle and wave. Quantum bits make use of superposition to represent multiple states, and so multiple numeric values, simultaneously.
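A rough way to picture superposition on an ordinary computer (a classical simulation with illustrative names, not real quantum hardware) is to store a qubit as two amplitudes, one for 0 and one for 1, whose squared magnitudes give the probabilities and must sum to 1:

```python
import math

# A simulated qubit: amplitudes for the |0> and |1> states.
# Probabilities are the squared magnitudes of the amplitudes.
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)   # equal superposition: both outcomes equally likely

p0 = abs(amp0) ** 2   # probability of reading 0
p1 = abs(amp1) ** 2   # probability of reading 1
print(round(p0, 3), round(p1, 3))   # 0.5 0.5 -- the qubit "holds" both at once
```

Until it is measured, such a qubit genuinely carries weight on both values at the same time, which is what a classical bit can never do.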

In the same way, a quantum computer can process these multiple values simultaneously, rather than sequentially as modern computers do. Only when you try to find out what state it is actually in at any given moment (by measuring it, in other words) does it “collapse” into one of its possible states, and that collapse gives you the answer to your problem.
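Measurement can be sketched in the same toy model (again a classical simulation with made-up names, not actual quantum behaviour): we sample one outcome with probability given by the squared amplitudes, after which the state collapses to the value observed:

```python
import math
import random

def measure(amp0, amp1):
    """Collapse a simulated qubit: return an outcome (0 or 1), weighted by
    the squared amplitudes, plus the collapsed state it leaves behind."""
    p0 = abs(amp0) ** 2
    if random.random() < p0:
        return 0, (1.0, 0.0)   # collapsed to the |0> state
    return 1, (0.0, 1.0)       # collapsed to the |1> state

# A qubit certain to be 0 always measures 0.
outcome, collapsed = measure(1.0, 0.0)
print(outcome)   # 0

# An equal superposition gives 0 or 1 at random, roughly 50/50 over many runs.
h = 1 / math.sqrt(2)
ones = sum(measure(h, h)[0] for _ in range(10000))
print(ones)      # close to 5000
```

After the collapse the superposition is gone: repeating the measurement on the collapsed state gives the same answer every time.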

**Quantum Computers** therefore promise to be faster, processing complex problems simultaneously rather than step by step; more energy efficient, since processing time is greatly reduced; and, because quantum effects take over at the scales where transistors can shrink no further, capable of solving problems that were previously unsolvable.

One such example is illustrated by Chris Woodford: “In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the “prime factors” of a large number, which would speed up the problem enormously. Shor’s algorithm really excited interest in **Quantum Computing** because virtually every modern computer (and every secure, online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an “intractable” computer problem). If **Quantum Computers** could indeed factor large numbers quickly, today’s online security could be rendered obsolete at a stroke.”
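The classical difficulty Woodford describes is easy to feel with naive trial division (a toy sketch; real factoring attacks, and Shor's algorithm itself, are far more sophisticated). The loop below works instantly for small numbers, but the work grows so fast with the size of the factors that the hundreds-of-digits numbers used in public-key cryptography are utterly out of reach:

```python
def trial_division(n):
    """Factor n by trying divisors in sequence: fine for small n,
    hopeless for the huge numbers that protect online banking."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is itself prime
    return factors

print(trial_division(15))              # [3, 5]
print(trial_division(3 * 5 * 7 * 11))  # [3, 5, 7, 11]
```

A quantum computer running Shor's algorithm would, in principle, find such factors in a number of steps that grows only polynomially with the number's length, which is precisely what makes it such a threat to this kind of encryption.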

While there are prospects, there are also challenges. They include: the complexity of the designs needed to make **Quantum Computers** a common reality; the difficulty of producing them on an industrial scale for the use and benefit of everyone; the fact that quantum bits need a lot of carefully controlled interaction with one another while being shielded from their surroundings; and the question of whether continued advances in conventional computers will make them unnecessary after all.

As more research is undertaken in this field, and prospects and problems interact, it still remains to be seen whether **Quantum Computing** will be the next huge revolution in the world of computing and technology at large.
