
Quantum Computers


Table of Contents

1. INTRODUCTION
2. HISTORY OF QUANTUM COMPUTERS
3. WORKING OF QUANTUM COMPUTERS
4. POTENTIAL AND POWER OF QUANTUM COMPUTING
5. OBSTACLES AND RESEARCH
6. APPLICATIONS
7. FUTURE OUTLOOK
8. REFERENCES
1. INTRODUCTION:

Behold your computer. Your computer represents the
culmination of years of technological advancement,
beginning with the early ideas of Charles Babbage (1791-
1871) and the eventual creation of the first programmable
computer by German engineer Konrad Zuse in 1941.
Surprisingly, however, the
high speed modern computer sitting in front of you is
fundamentally no different from its gargantuan 30 ton
ancestors, which were equipped with some 18000 vacuum
tubes and 500 miles of wiring! Although computers have
become more compact and considerably faster in performing
their task, the task remains the same: to manipulate and
interpret an encoding of binary bits into a useful
computational result. A bit is a fundamental unit of
information, classically represented as a 0 or 1 in your digital
computer. Each classical bit is physically realized through a
macroscopic physical system, such as the magnetization on a
hard disk or the charge on a capacitor. A document, for
example, composed of n characters and stored on the hard drive
of a typical computer is accordingly described by a string of
8n zeros and ones. Herein lies a key difference between your
classical computer and a quantum computer. Where a
classical computer obeys the well understood laws of
classical physics, a quantum computer is a device that
harnesses physical phenomena unique to quantum
mechanics (especially quantum interference) to realize a
fundamentally new mode of information processing.
A quantum computer is one which exploits quantum-
mechanical interactions in order to function; this behavior,
found in nature, possesses incredible potential to manipulate
data in ways unattainable by machines today. The harnessing
and organization of this power, however, poses no small
difficulty to those who quest after it.
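
As a quick illustration of the 8n figure mentioned above, here is a minimal sketch (assuming, as the text does, one byte of 8 bits per character; the sample word is arbitrary):

    document = "quantum"                                   # n = 7 characters
    bits = "".join(f"{byte:08b}" for byte in document.encode("ascii"))
    print(len(bits))                                       # 8n = 56 zeros and ones
    print(bits[:16])                                       # the first two characters as bits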

Consequently, the concept of quantum computing, conceived in
the early 80's by physicist Richard Feynman, has existed
largely in the realm of theory. Remarkable algorithms, which
could potentially take a billionth of the time required by
classical computers to perform certain mathematical feats but
are implementable only on quantum computers, have not yet
been realized in practice. A two-qubit quantum system,
recently developed by a coalition of researchers, constitutes
the sole concrete manifestation of the idea.

In a quantum computer, the fundamental unit of
information (called a quantum bit, or qubit) is not restricted
to the two values of a classical bit. This qubit property arises
as a direct consequence of its adherence to the laws of
quantum mechanics which differ radically from the laws of
classical physics. A qubit can exist not only in a state
corresponding to the logical state 0 or 1 as in a classical bit,
but also in states corresponding to a blend or superposition of
these classical states. In other words, a qubit can exist as a
zero, a one, or simultaneously as both 0 and 1, with a
numerical coefficient (amplitude) for each state whose squared
magnitude gives the probability of observing that state.
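
A minimal numerical sketch of a single qubit (the amplitudes chosen here are arbitrary; this is an illustration, not any particular hardware's representation):

    import numpy as np

    # A qubit is a pair of complex amplitudes over the basis states |0> and |1>.
    qubit = np.array([0.6, 0.8j])                        # an arbitrary superposition a|0> + b|1>
    assert np.isclose(np.sum(np.abs(qubit) ** 2), 1.0)   # amplitudes must be normalized

    probabilities = np.abs(qubit) ** 2                   # squared magnitudes give measurement odds
    print(probabilities)                                 # [0.36 0.64]: 36% chance of 0, 64% of 1
    print(np.random.choice([0, 1], p=probabilities))     # one simulated measurement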

2. History of Quantum Computation:

Widespread interest in quantum computation is generally
accepted to stem from Feynman's observation that classical
systems cannot effectively model quantum mechanical
systems. He proposed that the only way to effectively model
a quantum mechanical system would be by using another
quantum mechanical system (Feynman, 1982). Feynman's
observation suggests that computers based on the laws of
quantum mechanics instead of classical physics could be
used to model quantum mechanical systems.

Deutsch was the first to explicitly ask whether it is possible
to compute more efficiently on a quantum computer than on
a classical computer. By addressing this question, he
extended the theory of quantum computation further with the
development of the universal quantum computer and
quantum Turing machine (Deutsch, 1985) and with quantum
computational networks (Deutsch, 1989). Deutsch also
devised the first quantum algorithm, Deutsch's two-bit
problem (Deutsch, 1985); this problem can be generalised to
the Deutsch-Jozsa algorithm for determining whether a function
is balanced or constant (Deutsch & Jozsa, 1992).
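
A minimal simulation of the single-qubit version of this problem may help make it concrete (a sketch only: the oracle U_f|x, y> = |x, y XOR f(x)> is written directly as a 4x4 matrix, and the function is identified as balanced or constant from a single query):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I = np.eye(2)

    def oracle(f):
        # U_f |x, y> = |x, y XOR f(x)> as a permutation matrix
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.kron([1, 0], [0, 1])            # start in |0>|1>
        state = np.kron(H, H) @ state              # put both registers in superposition
        state = oracle(f) @ state                  # a single query to f
        state = np.kron(H, I) @ state              # interfere the input register
        p0 = state[0] ** 2 + state[1] ** 2         # probability the input qubit reads 0
        return "constant" if p0 > 0.5 else "balanced"

    print(deutsch(lambda x: 0))    # a constant function -> "constant"
    print(deutsch(lambda x: x))    # a balanced function -> "balanced"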

Until the mid 1990's, quantum computation remained a
curiosity. Although various uses had been suggested for
quantum computers and some theory had been established,
no "killer application" had been proposed. This situation
changed when Shor published his quantum factoring
algorithm for finding the prime factors of large integers
(Shor, 1994). It has since been argued that Deutsch's
algorithm is a "killer application", since it shows a quantum
computer solving a problem in fewer steps than would be
needed by a classical computer. However, there is debate
about whether Deutsch's algorithm really constitutes a "killer
application", whereas Shor's algorithm is generally accepted
as being one.

Shor's algorithm was based on previous work, including the
creation of a quantum algorithm by Simon (Simon, 1994).
Simon's algorithm examines an oracle problem which takes
polynomial time on a quantum computer but exponential
time on a classical computer. Simon's work was based on an
oracle problem examined by Bernstein and Vazirani
(Bernstein & Vazirani, 1993).

The difficulty of finding prime factors is the basis of many
public key encryption systems such as RSA, and consequently
Shor's algorithm caused much interest within many sections of
the scientific community. Other algorithms such as that for
discrete logarithms (Shor, 1994), an alternative factoring
algorithm (Jozsa, 1997) based on Kitaev's work on
construction of quantum algorithms based on group-theoretic
principles (Kitaev, 1995), algorithms for finding the median
and mean (Grover, 1997), Hogg's constraint satisfaction
algorithms (Hogg, 1996) and Grover's algorithm for database
search (Grover, 1997) all contribute to the relatively small
number of known quantum algorithms.

Known quantum algorithms can be partitioned into three
groups depending on the methods they use. The first group
contains algorithms which are based on determining a
common property of all the output values, such as the period
of a function, e.g. Shor's algorithm; the second contains those
which transform the state to increase the likelihood that the
output of interest will be read (amplification), e.g. Grover's
algorithm; and the third contains algorithms which are based
on a combination of methods from the previous two groups,
e.g. the approximate counting algorithm (Brassard, Hoyer
and Tapp, 1998). It is not known whether additional types of
quantum algorithm exist or whether every quantum algorithm
can be classified as a member of one of a finite number of
groups.

At the same time as the number of known quantum
algorithms was expanding, significant progress was being
made in developing the techniques necessary to produce
quantum hardware. Ion trap technology and nuclear magnetic
resonance (NMR) technology are two technologies which
have successfully been used to develop 2 and 3 qubit
systems. These tiny quantum computers have been used to
implement Deutsch's problem (Jones & Mosca, 1998) and
Grover's algorithm (Jones, Mosca & Hansen, 1998) and show
that they can run on quantum hardware. However, both ion
trap and NMR technologies appear to have limitations; it
seems probable that it will only be possible to utilise them to
produce systems of up to approximately 40 qubits. For larger
numbers of qubits an alternative technology will be needed;
at present it seems likely that this technology will be solid
state.
Quantum computation simulation languages and systems
have been developed which attempt to allow simulations of
quantum algorithms; these include QCL (Ömer, 1998), Q-gol
(Baker, 1997), Qubiter (Tucci, 1998) and the simulation
system currently being developed by the OpenQubit group
(OpenQubit, 1998). Q-gol was an attempt to write a high
level programming language to allow researchers to describe
algorithms designed to run on quantum computers. Qubiter
takes as input an arbitrary unitary matrix and returns as
output an equivalent sequence of elementary operations (e.g.
controlled-nots and qubit rotations). Together with
simulations produced within mathematical toolkits e.g.
Mathematica and implementation of algorithms using actual
qubits, this has allowed verification that the known quantum
algorithms work and enabled investigation into how they
function.

As development work has progressed, additional uses have
been proposed for quantum computation, from modelling
quantum mechanical systems, breaking public key
encryption, searching databases, generating true random
numbers to providing secure communication using quantum
key distribution. It has also been suggested that quantum
mechanics may be playing a role in consciousness; if a
quantum mechanical model of mind and consciousness were
developed, this would have a significant impact on
computational and artificial intelligence. If the brain handles
quantum type transformations somewhere in its neural
network this could lead to future quantum computers being
biological/biochemical in nature.

At present, the majority of the research effort in quantum
computation is devoted to the physics-oriented aspects of
quantum computation, in particular the development of
hardware. Within this area researchers are mainly focussing
on NMR technology. Ion trap technology is beginning to
catch up with what has been achieved using NMR but solid
state technology is still very much in its infancy. The
computer science/information science research aspects are
being pursued, but less emphasis is placed on these at
present. Principal centres of quantum computation research
include those at Los Alamos, Stanford, IBM, UCLA, the
Oxford Centre for Quantum Computation, the University of
Montreal, Innsbruck and Caltech, MIT and USC.

3. WORKING:
Quantum computers work, at bottom, by exploiting quantum
phenomena. This may seem counterintuitive because
everyday phenomena are governed by classical physics, not
quantum mechanics -- which takes over at the atomic level.
This rather difficult concept is perhaps best explained
through an experiment. Consider figure a below:

Here a light source emits a photon along a path towards a
half-silvered mirror. This mirror splits the light, reflecting
half vertically toward detector A and transmitting half toward
detector B. A photon, however, is a single quantized packet
of light and cannot be split, so it is detected with equal
probability at either A or B. Intuition would say that the
photon randomly leaves the mirror in either the vertical or
horizontal direction. However, quantum mechanics predicts
that the photon actually travels both paths simultaneously!
This is more clearly demonstrated in figure b.

In an experiment like that in figure a, where a photon is fired
at a half-silvered mirror, it can be shown that the photon does
not actually split by verifying that if one detector registers a
signal, then no other detector does. With this piece of
information, one might think that any given photon travels
either vertically or horizontally, randomly choosing between
the two paths. However, quantum mechanics predicts that
the photon actually travels both paths simultaneously,
collapsing down to one path only upon measurement. This
effect, known as single-particle interference, can be better
illustrated in a slightly more elaborate experiment, outlined
in figure b below:
In this experiment, the photon first encounters a half-silvered
mirror, then a fully silvered mirror, and finally another half-
silvered mirror before reaching a detector, where each half-
silvered mirror introduces the probability of the photon
traveling down one path or the other. Once a photon strikes
the mirror along either of the two paths after the first beam
splitter, the arrangement is identical to that in figure a, and so
one might hypothesize that the photon will reach either
detector A or detector B with equal probability. However,
experiment shows that in reality this arrangement causes
detector A to register a photon 100% of the time, and detector
B to never register one! How can this be?

Figure b depicts an interesting experiment that demonstrates
the phenomenon of single-particle interference. In this case,
experiment shows that the photon always reaches detector A,
never detector B! If a single photon travels vertically and
strikes the mirror, then, by comparison to the experiment in
figure a, there should be an equal probability that the photon
will strike either detector A or detector B. The same goes for
a photon traveling down the horizontal path. However, the
actual result is drastically different. The only conceivable
conclusion is therefore that the photon somehow traveled
both paths simultaneously, creating an interference at the
point of intersection that destroyed the possibility of the
signal reaching B. This is known as quantum interference
and results from the superposition of the possible photon
states, or potential paths. So although only a single photon is
emitted, it appears as though an identical photon exists and
travels the 'path not taken,' only detectable by the
interference it causes with the original photon when their
paths come together again. If, for example, either of the
paths is blocked with an absorbing screen, then detector B
begins registering hits again, just as in the first experiment!
This unique characteristic, among others, makes the current
research in quantum computing not merely a continuation of
today's idea of a computer, but rather an entirely new branch
of thought. And it is these special characteristics, harnessed
by quantum computers, that give them the potential to be
incredibly powerful computational devices.
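
A minimal numerical sketch of the two-mirror experiment in figure b (an idealization: each half-silvered mirror is modelled as a Hadamard-like transformation on the photon's two possible paths, mirror phases are ignored, and which output port maps to detector A or B depends on conventions not modelled here):

    import numpy as np

    splitter = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # idealized 50/50 beam splitter

    photon = np.array([1.0, 0.0])               # photon enters along one path
    after_first = splitter @ photon             # superposition of both paths
    after_second = splitter @ after_first       # paths recombine at the second splitter
    print(np.abs(after_second) ** 2)            # [1. 0.]: all hits at a single detector

    blocked = after_first.copy()
    blocked[1] = 0.0                            # an absorbing screen removes one path
    print(np.abs(splitter @ blocked) ** 2)      # [0.25 0.25]: both detectors fire again
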
4. The Potential and Power of Quantum Computing:
In a traditional computer, information is encoded in a
series of bits, and these bits are manipulated via Boolean
logic gates arranged in succession to produce an end result.
Similarly, a quantum computer manipulates qubits by
executing a series of quantum gates, each a unitary
transformation acting on a single qubit or pair of qubits. In
applying these gates in succession, a quantum computer can
perform a complicated unitary transformation to a set of
qubits in some initial state. The qubits can then be measured,
with this measurement serving as the final computational
result. This similarity in calculation between a classical and
quantum computer means that, in theory, a classical computer
can accurately simulate a quantum computer. In other words,
a classical computer would be able to do anything a quantum
computer can. So why bother with quantum computers?
Although a classical computer can theoretically simulate a
quantum computer, it is incredibly inefficient, so much so
that a classical computer is effectively incapable of
performing many tasks that a quantum computer could
perform with ease. The simulation of a quantum computer
on a classical one is a computationally hard problem because
the correlations among quantum bits are qualitatively
different from correlations among classical bits, as first
explained by John Bell. Take for example a system of only a
few hundred qubits: this exists in a Hilbert space of
dimension ~10^90, so simulating it would require a classical
computer to work with exponentially large matrices (to
perform calculations on each individual state, which is also
represented as a matrix), meaning it would take an
exponentially longer time than even a primitive quantum
computer.
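
A minimal sketch of this gate-by-gate picture for two qubits (the particular sequence -- a Hadamard followed by a controlled-NOT -- is chosen only for illustration):

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                    # controlled-NOT gate

    state = np.zeros(4)
    state[0] = 1.0                                     # two qubits in |00>: 2**n amplitudes

    # Apply the unitary gates in succession, then "measure".
    state = CNOT @ (np.kron(H, np.eye(2)) @ state)
    probs = np.abs(state) ** 2
    print(probs)                                                  # [0.5 0. 0. 0.5]
    print(np.random.choice(["00", "01", "10", "11"], p=probs))   # one measurement outcome

    # The state vector doubles in size with every added qubit, which is why
    # a few hundred qubits (~10^90 amplitudes) cannot be simulated classically.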

Richard Feynman was among the first to recognize the
potential in quantum superposition for solving such problems
much, much faster. For example, a system of 500 qubits,
which is impossible to simulate classically, represents a
quantum superposition of as many as 2^500 states. Each state
would be classically equivalent to a single list of 500 1's and
0's. Any quantum operation on that system -- a particular
pulse of radio waves, for instance, whose action might be to
execute a controlled-NOT operation on the 100th and 101st
qubits -- would simultaneously operate on all 2^500 states.
Hence with one fell swoop, one tick of the computer clock, a
quantum operation could compute not just on one machine
state, as serial computers do, but on 2^500 machine states at
once! Eventually, however, observing the system would
cause it to collapse into a single quantum state corresponding
to a single answer, a single list of 500 1's and 0's, as dictated
by the measurement axiom of quantum mechanics. The
reason this is an exciting result is that this answer,
derived from the massive quantum parallelism achieved
through superposition, is the equivalent of performing the
same operation on a classical supercomputer with ~10^150
separate processors (which is of course impossible)!
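
A quick arithmetic check of those figures, using nothing but the numbers quoted above:

    states = 2 ** 500
    print(states)            # the number of basis states in a 500-qubit register
    print(len(str(states)))  # 151 digits, i.e. on the order of 10^150
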
Early investigators in this field were naturally excited by
the potential of such immense computing power, and soon
after realizing its potential, the hunt was on to find something
interesting for a quantum computer to do. Peter Shor, a
research and computer scientist at AT&T's Bell Laboratories
in New Jersey, provided such an application by devising the
first quantum computer algorithm. Shor's algorithm
harnesses the power of quantum superposition to rapidly
factor very large numbers (on the order of 10^200 digits and
greater) in a matter of seconds. The premier application of a
quantum computer capable of implementing this algorithm
lies in the field of encryption, where one of the most common
encryption schemes, known as RSA, relies heavily on the
difficulty of factoring very large composite numbers into
their primes. A computer which can do this easily is
naturally of great interest to numerous government agencies
that use RSA -- previously considered to be "uncrackable" --
and anyone interested in electronic and financial privacy.
Encryption, however, is only one application of a quantum
computer. In addition, Shor has put together a toolbox of
mathematical operations that can only be performed on a
quantum computer, many of which he used in his
factorization algorithm. Furthermore, Feynman asserted that
a quantum computer could function as a kind of simulator for
quantum physics, potentially opening the doors to many
discoveries in the field. Currently the power and capability
of a quantum computer is primarily theoretical speculation;
the advent of the first fully functional quantum computer will
undoubtedly bring many new and exciting applications.
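
A classical toy sketch of the reduction at the heart of Shor's algorithm: factoring N is reduced to finding the order r of a random a modulo N, and it is only this order-finding step that a quantum computer speeds up. Here the order is found by brute force, which is exponentially slow, and the modulus 15 is an arbitrary illustrative choice:

    from math import gcd
    import random

    def factor_via_order(N):
        # Reduce factoring N to order finding, as Shor's algorithm does.
        while True:
            a = random.randrange(2, N)
            g = gcd(a, N)
            if g > 1:
                return g                     # a lucky guess already shares a factor
            r, x = 1, a % N
            while x != 1:                    # brute-force order finding (the quantum step)
                x = (x * a) % N
                r += 1
            if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                return gcd(pow(a, r // 2, N) - 1, N)

    print(factor_via_order(15))              # prints 3 or 5
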
"The power of computers has increased by six orders of
magnitude in the last 36 years and it will increase by a further
six orders of magnitude in the next 36 years", claimed Nick
Donofrio, IBM's Senior VP of Technology and
Manufacturing, to an audience of IT analysts at IBM
Palisades.

'Six orders of magnitude' is math-speak for "a million-fold",
so Nick was telling us on the one hand what we already
knew, that Moore's Law has been operating since the late
1960s, and on the other hand, professing a belief that it
would continue to operate for the foreseeable future.

He has reasons for his convictions and, in a fascinating
address, he referred to various areas of research that IBM
was involved in which led him to conclude that Moore's Law
will remain on the IT statute books. Here they are:

• Nanotube technology. Nanotubes are microscopic
tubes constructed from carbon rings which can be used
to build logic circuits. Currently this technology is
between 50 and 100 times denser, and therefore faster,
than current silicon. So in its current infant state, it
offers about two orders of magnitude improvement and
is expected to offer more in time.
• Nanodisk. IBM has built nano-machines that can store
data on and erase data from a surface by puncturing a
hole in it (or removing it by heating the surface up),
using an array of minute cantilevered arms. This is
effectively a nanodisk which is 25 to 50 times smaller
than current disks and can probably be made even
smaller.

• The Molecular Cascade. IBM has been building molecules
using a scanning tunneling microscope. One of the things it
has built is a big molecule that can act rather like Babbage's
computer as originally conceived, with balls rolling down
paths and passing through gates, except of course that the
balls in this instance are atoms. It is thus possible to build a
molecular computer, the smallest nano-machine yet.

NEWS ANALYSIS -- IBM scientists recently furthered the
cause of quantum computing, announcing they had solved an
order-finding mathematical problem in a single cycle using
fluorine atoms -- instead of the usual silicon gates -- as the
computing elements. The achievement may be the best
evidence to date that an architecture based on atoms and
nuclei can solve multi-step problems that overwhelm even
the most powerful of traditional computers.

Although the researchers advise that the experiment was
modest, these and other advances suggest to some that
smaller and abler architectures may arise in the future to help
the computer industry push on to new levels of processing
power.

For years, concerns have grown about semiconductor
electronics as the limits of Gordon Moore's Law have been
approached.

Intel co-founder Gordon Moore said that the number of transistors the
industry could place on a silicon chip would double every
year or two. The conventional means of pushing hordes of
electrons around in order to do calculations has worked, of
course, and smaller and smaller chip dice have meant larger
memory and faster processing.

The most immediate obstacle to further miniaturization is
that current chip lithography techniques are nearing their
ultimate resolution. To go smaller, chip makers will have to
employ new x-ray fabrication techniques, which will be quite
expensive to implement.

But even with better chip fab technology, experts see an
eventual breakdown in that trend as the size of silicon logic
gates shrinks down to the size of atoms.

Qubit power:
While pursuing molecular computing research, IBM and
other researchers decided to explore a somewhat non-
Boolean approach that is based on the complex states of
quantum matter.
The team included scientists from IBM's Almaden Research
Center, Stanford University, and the University of Calgary.
The group fashioned fluorine atoms as qubits, computing
units that exploit the quantum physical properties of matter to
represent a spectrum of states, not just Boolean 0's and 1's as
in conventional digital computing.

Isaac "Ike" Chuang, the IBM research staff member who led
the team, said the first applications for quantum computing
will probably be on coprocessors for specific functions such
as database lookup. He also sees the technology addressing
mathematical problems such as the Traveling Salesman
problem, which tries to compute the best route between many
locations. Such problems can overwhelm conventional
computers.

"With quantum computing, you can do the problem in linear


time. That could mean a difference in [computing] time
between millions of years and a few weeks," he said.

The era of quantum computing will begin in about 2020,
when "circuit features are predicted to be the size of atoms
and molecules," Chuang projected.

He noted that accelerating word processing or Web surfing
would not be well suited to a quantum computer's
capabilities.

5. Obstacles and Research:

▪ Obstacles seen
The complex lab experiment, not reproducible in the usual
corporate environment, entailed assembling five fluorine
atoms within a molecule so that the fluorine nuclei's spins
could interact to effect calculations. The atoms were
programmed by radio frequency pulses, and results were
detected by nuclear magnetic resonance instruments, which
according to Chuang, are "similar to those commonly found
in hospitals and chemistry labs."
Chuang said the obstacles to commercialization are "huge."
At the present time, quantum computing requires "a lot of
expensive equipment," he said. "The apparatus fills half a
lab." Moreover, only five qubits were operative in the
experiment. Many more are required to really tackle tough
tasks.

There are important potential benefits to quantum computing,
said Linley Gwennap, a microprocessor industry analyst with
the Linley Group of Mountain View, Calif.

"Silicon-based technology is going to taper off and not be


able to continue increasing in performance," he said.
"Theoretically, these quantum computers should be able to
operate much faster than conventional transistor-based
computers."

Scaling to mass production is likely to be the biggest
hurdle for commercial quantum computing. "Right now they
are literally dragging atoms around to create quantum
structures, then using very expensive quantum microscopes
to observe the behavior and extract the information,"
Gwennap said.

Also, researchers are uncertain about how to independently
address molecules or atoms, as increasing numbers of
molecules become part of the molecular computer.

In other molecular computing news of late, UCLA chemists
reported the first demonstration of a reconfigurable
molecular switch. Molecule-level switches to this point could
switch only once; the UCLA crew said it was able to switch
states hundreds of times. They used synthetic molecules
known as catenanes.

The field of quantum information processing has made
numerous promising advancements since its conception,
including the building of two- and three-qubit quantum
computers capable of some simple arithmetic and data
sorting. However, a few potentially large obstacles still
remain that prevent us from "just building one," or more
precisely, building a quantum computer that can rival today's
modern digital computer. Among these difficulties, error
correction, decoherence, and hardware architecture are
probably the most formidable. Error correction is rather
self-explanatory, but what errors need correction? The answer is
primarily those errors that arise as a direct result of
decoherence, or the tendency of a quantum computer to
decay from a given quantum state into an incoherent state as
it interacts, or entangles, with the state of the environment.
These interactions between the environment and qubits are
unavoidable, and induce the breakdown of information stored
in the quantum computer, and thus errors in computation.
Before any quantum computer will be capable of solving
hard problems, research must devise a way to keep
decoherence and other potential sources of error at an
acceptable level.
quantum error correction, first proposed in 1995 and
continually developed since, small scale quantum computers
have been built and the prospects of large quantum
computers are looking up. Probably the most important idea
in this field is the application of error correction in phase
coherence as a means to extract information and reduce error
in a quantum system without actually measuring that system.
In 1998, researchers at Los Alamos National Laboratory and
MIT led by Raymond Laflamme managed to spread a single
bit of quantum information (qubit) across three nuclear spins
in each molecule of a liquid solution of alanine or
trichloroethylene molecules. They accomplished this using
the techniques of nuclear magnetic resonance (NMR). This
experiment is significant because spreading out the
information actually made it harder to corrupt. Quantum
mechanics tells us that directly measuring the state of a qubit
invariably destroys the superposition of states in which it
exists, forcing it to become either a 0 or 1. The technique of
spreading out the information allows researchers to utilize the
property of entanglement to study the interactions between
states as an indirect method for analyzing the quantum
information. Rather than a direct measurement, the group
compared the spins to see if any new differences arose
between them without learning the information itself. This
technique gave them the ability to detect and fix errors in a
qubit's phase coherence, and thus maintain a higher level of
coherence in the quantum system. This milestone has
provided argument against skeptics, and hope for believers.
Currently, research in quantum error correction continues
with groups at Caltech (Preskill, Kimble), Microsoft, Los
Alamos, and elsewhere.
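
A minimal sketch of the underlying idea -- spreading one logical qubit across three physical carriers and reading out only parities, never the encoded amplitudes. For simplicity this shows the bit-flip version of the three-qubit repetition code rather than the phase-error correction described above, and the amplitudes 0.6 and 0.8 are arbitrary:

    import numpy as np

    zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    I = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])       # bit-flip operator
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])      # used here only for parity checks

    def kron(*ops):
        out = np.array([1.0])
        for op in ops:
            out = np.kron(out, op)
        return out

    # Encode an arbitrary qubit a|0> + b|1> as a|000> + b|111>.
    a, b = 0.6, 0.8
    encoded = a * kron(zero, zero, zero) + b * kron(one, one, one)

    # An unwanted bit flip hits the middle qubit.
    corrupted = kron(I, X, I) @ encoded

    # Parity checks Z1Z2 and Z2Z3 locate the flip without revealing a or b.
    s1 = corrupted @ kron(Z, Z, I) @ corrupted   # -1 means qubits 1 and 2 disagree
    s2 = corrupted @ kron(I, Z, Z) @ corrupted   # -1 means qubits 2 and 3 disagree
    print(s1, s2)                                # -1.0 -1.0 -> the middle qubit flipped

    # Apply the correction and confirm the encoded state is recovered intact.
    recovered = kron(I, X, I) @ corrupted
    print(np.allclose(recovered, encoded))       # True
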
At this point, only a few of the benefits of quantum
computation and quantum computers are readily obvious, but
before more possibilities are uncovered theory must be put to
the test. In order to do this, devices capable of quantum
computation must be constructed. Quantum computing
hardware is, however, still in its infancy. As a result of
several significant experiments, nuclear magnetic resonance
(NMR) has become the most popular component in quantum
hardware architecture. Only within the past year, a group
from Los Alamos National Laboratory and MIT carried out
the first experimental demonstrations of a quantum computer
using nuclear magnetic resonance (NMR) technology.
Currently, research is underway to discover methods for
battling the destructive effects of decoherence, to develop an
optimal hardware architecture for designing and building a
quantum computer, and to further uncover quantum
algorithms to utilize the immense computing power available
in these devices. Naturally this pursuit is intimately related
to quantum error correction codes and quantum algorithms,
so a number of groups are doing simultaneous research in a
number of these fields. To date, designs have involved ion
traps, cavity quantum electrodynamics (QED), and NMR.
Though these devices have had mild success in performing
interesting experiments, the technologies each have serious
limitations. Ion trap computers are limited in speed by the
vibration frequency of the modes in the trap. NMR devices
have an exponential attenuation of signal to noise as the
number of qubits in a system increases. Cavity QED is
slightly more promising; however, it still has only been
demonstrated with a few qubits. Seth Lloyd of MIT is
currently a prominent researcher in quantum hardware. The
future of quantum computer hardware architecture is likely to
be very different from what we know today; however, the
current research has helped to provide insight as to what
obstacles the future will hold for these devices.

The main difficulty that the research-and-development
engineers have encountered is the fact that it is extremely
difficult to get particles to behave in the proper way for a
significant length of time. The slightest disturbance will
cause the machine to cease working in quantum fashion and
revert to "single-thought" mode like a conventional
computer. Stray electromagnetic fields, physical movement,
or a tiny electrical discharge can disrupt the process.

6. Applications with Quantum Computers:


To date an operating system, QOS, has been created. It has
been designed to coordinate the timing and configuration of
the computer as well as to process various output signals.
Using QOS, simple algorithms have been devised, such as
Grover's algorithm, which runs a search over "N" unsorted
items (sketched below).
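
A minimal simulation of Grover's search over N = 8 unsorted items (a sketch only: the oracle is represented directly as a phase flip on a marked index, which is chosen arbitrarily):

    import numpy as np

    n = 3                                      # qubits, so N = 2**n = 8 items
    N = 2 ** n
    marked = 5                                 # the item being searched for

    state = np.full(N, 1 / np.sqrt(N))         # uniform superposition over all items

    oracle = np.eye(N)
    oracle[marked, marked] = -1                # flips the phase of the marked item

    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)       # inversion about the mean

    for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):      # about sqrt(N) iterations
        state = diffusion @ (oracle @ state)

    print(np.argmax(np.abs(state) ** 2))       # -> 5, the marked item
    print(np.abs(state[marked]) ** 2)          # ~0.95 probability after two iterations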

Quantum computers, due to their large-scale processing
abilities, are ideal for such tasks as large-number crunching.
An example of this is factorization. Factorization is a key idea
in encryption technology: the larger and more complex the
number, the harder it is to crack. With quantum technology,
computers can decrypt much faster than before. Along
the same lines as decryption is cracking passwords and the
like. Using its superposition of states, a quantum computer
can run through many different inputs, evaluate the outcome
and move to the next input much, much faster than a regular
computer.

Quantum computers might prove especially useful in the
following applications:

• Breaking ciphers
• Statistical analysis
• Factoring large numbers
• Solving problems in theoretical physics
• Solving optimization problems in many variables

Silicon computer:
A quantum computer - a new kind of computer far more
powerful than any that currently exist - could be made today,
say Thaddeus Ladd of Stanford University, Kohei Itoh of
Keio University in Japan, and their co-workers. They have
sketched a blueprint for a silicon quantum computer that
could be built using current fabrication and measurement
techniques.

The microelectronics industry has decades of experience of
controlling and fine-tuning the structure and properties of
silicon. These skills would give a silicon-based quantum
computer a head start over other schemes for putting one
together.

Quantum and conventional computers encode, store and
manipulate information as sequences of binary digits, or bits,
denoted as 1s and 0s. In a normal computer, each bit is a
switch, which can be either 'on' or 'off'.

In a quantum computer, switches can be on, off or in a
superposition of states - on and off at the same time. These
extra configurations mean that quantum bits, or qubits, can
encode more information than classical switches.

That increase in capacity would, in theory, make quantum
computers faster and more powerful. In practice it is
extremely difficult to maintain a superposition of more than a
few quantum states for any length of time. So far, quantum
computing has been demonstrated with only four qubits,
compared with the billions of bits that conventional silicon
microprocessors handle.

Several quantum-computing demonstrations have used
nuclear magnetic resonance (NMR) to control and detect the
quantum states of atoms floating in solution. But this beaker-
of-liquid approach is unlikely to remain viable beyond ten or
so qubits.

Many researchers suspect that making a quantum computer
with as many qubits as a Pentium chip has transistors will
take the same kind of technology, recording the information
in solid-state devices.

In 1998, Bruce Kane of the University of New South Wales
in Australia showed that solid-state quantum computing was
conceivable, but not practical. He suggested that atoms of
phosphorus in crystalline films of silicon could store qubits
that could be read and manipulated using NMR sensitive
enough to detect single atoms.
The device proposed by Ladd and his colleagues is similar,
but more within the reach of current technical capabilities.
They suggest that qubits could be encoded in an isotope of
silicon called silicon-29, or 29Si.

Itoh's group in Japan believes it has the capability to grow
grid-like arrays of 29Si chains atom by atom on the surface of
the most abundant silicon isotope, 28Si. A tiny magnet and
radio waves would then be used to control the magnetic
quantum states of the 29Si nuclei.

Crucially, each qubit would be stored not just in a single 29Si
atom but in many thousands of copies, one in each 29Si chain.
This would avoid the problem of making measurements on
single atoms. The readout could be performed using
magnetic resonance force microscopy, which detects the
oscillations of a thin bridge in which the rows of silicon
atoms are embedded.

The details are subtle, but the point, the researchers say, is
that the device is feasible without "unrealistic advances in
fabrication, measurement, or control technologies". All they
have to do now is build it.

7. Future Outlook:
At present, quantum computers and quantum information
technology remain in their pioneering stage. At this very
moment obstacles are being surmounted that will provide the
knowledge needed to thrust quantum computers up to their
rightful position as the fastest computational machines in
existence. Error correction has made promising progress to
date, nearing a point now where we may have the tools
required to build a computer robust enough to adequately
withstand the effects of decoherence. Quantum hardware, on
the other hand, remains an emerging field, but the work done
thus far suggests that it will only be a matter of time before we
have devices large enough to test Shor's and other quantum
algorithms. Thereafter, quantum computers will emerge as
superior computational devices at the very least, and perhaps
one day make today's modern computer obsolete. Quantum
computation has its origins in highly specialized fields of
theoretical physics, but its future undoubtedly lies in the
profound effect it will have on the lives of all mankind.

8. References:
1. www.computer.howstuffworks.com
2. www.cas.org
3. www.apt.net.au
4. www.qubit.org
