
Seminar Report


Quantum Computing

Project Report Submitted in Partial Fulfilment of the


Requirements for the Degree of

Bachelor of Engineering
in
Computer Science Engineering
Submitted by
Nitin Saini (Roll No. 19UCSE4010)

Under the Supervision of


Abhisek Gour
(Assistant Professor)

Department of Computer Science and Engineering


MBM University, Jodhpur
April 2022
Department of Computer
Science & Engineering
M.B.M. Engineering College, Jai Narain
Vyas University
Ratanada, Jodhpur, Rajasthan, India –
342011

CERTIFICATE

This is to certify that the work contained in this report entitled “Quantum
Computing”, submitted by Mr. Nitin Saini (Roll No. 19UCSE4010) to the
Department of Computer Science & Engineering, M.B.M. Engineering
College, Jodhpur, is in partial fulfilment of the requirements for the
degree of Bachelor of Engineering in Computer Science Engineering.

He has carried out his work under my supervision. This work has not
been submitted elsewhere for the award of any other degree or diploma.

The project work, in our opinion, has reached the standard required for
the degree of Bachelor of Engineering in Computer Science Engineering
in accordance with the regulations of the Institute.

Abhisek Gour
Assistant Professor
(Supervisor)
Dept. of Computer Science & Engg.
M.B.M. Engineering College, Jodhpur

Dr. Nemi Chand Barwar


(Head)
Dept. of Computer Science & Engg.
M.B.M. Engineering College, Jodhpur
DECLARATION

I, Nitin Saini, hereby declare that this seminar/project titled “Quantum


Computing” is a record of original work done by me under the supervision
and guidance of Abhisek Gour.

I further certify that this work has not formed the basis for the award of any
Degree/Diploma/Associateship/Fellowship or similar recognition to any
candidate of any university, and that no part of this report has been
reproduced verbatim from any other source without appropriate reference
and permission.

(Nitin Saini)
7th Semester, CSE
Enroll. - < Enroll No>
Roll No. - 19UCSE4010
ACKNOWLEDGEMENT

With immense pleasure, I, Nitin Saini, present this seminar
report as part of the curriculum of B.E. in Computer Science.
I am extremely grateful to Dr. N.C. Barwar, Head of the
Department of Computer Science Engineering, for providing
all the required resources for the successful completion of
my seminar report. My heartfelt gratitude goes to my seminar
guide Mr. Abhisek Gour (Assistant Professor, Computer
Science Engineering) for his valuable suggestions and
guidance in the preparation of this report. I also thank my
friends for all the help and coordination they extended in
bringing out this seminar successfully and on time. Finally,
I gratefully acknowledge the authors of the literature
referred to in this seminar.

Thanking You
Nitin Saini
ABSTRACT
Quantum computing is a modern way of computing that is based on the
science of quantum mechanics and its remarkable phenomena. It is a
combination of physics, mathematics, computer science and information
theory. It promises high computational power, lower energy consumption
and exponential speedups over classical computers by controlling the
behaviour of microscopic physical objects such as atoms, electrons and
photons. Here we present an introduction to the fundamental concepts and
ideas of quantum computing. This seminar starts with the origin of
traditional computing and discusses the improvements and transformations
its limitations have driven up to the present day. It then moves on to the
basic working of quantum computing and the quantum properties it relies
on: superposition, entanglement and interference. The report covers the
architecture, hardware, software, design, types and algorithms that quantum
computers specifically require, and it explores how quantum computers
could affect our lives in areas such as cyber security, traffic optimization,
medicine and artificial intelligence. Finally, we summarise the importance,
advantages and disadvantages of quantum computers. Small-scale quantum
computers have recently been developed, and the field is heading towards a
promising future thanks to its high potential and ongoing research. Before
focusing on the significance of a general-purpose quantum computer and
exploring the power of this emerging technology, it is useful to review the
origin, potential, and limitations of existing traditional computing. This
background helps us understand the challenges in developing such an
exotic and competitive technology, and gives insight into the ongoing
progress in the field.
Contents

1. Introduction to the Topic

1.1. What is Quantum Computing?

1.2. Why Quantum Computing?

1.2.1. Limitations of classical computers

1.2.2. Moore’s Law

1.2.3. Study of matter at atomic level

1.3. Superposition

1.4. Entanglement

1.5. Bits and Qubits

1.5.1. Bits

1.5.2. Qubits

1.5.3. Bits vs Qubits

2. History, Evolution & Technical Details

2.1. History and Evolution

2.2. Future of Quantum Computing

3. Similar Technologies

3.1. The Memcomputer

4. Applications, Pros & Cons

4.1. Applications of Quantum Computers

4.2. Limitations of Quantum Computing

5. Summary of Study

References
Chapter 1
INTRODUCTION

1.1. What is Quantum Computing?

Quantum computing is a type of computation that harnesses the


collective properties of quantum states, such as superposition,
interference, and entanglement, to perform calculations. The
devices that perform quantum computations are known as
quantum computers.
1.2. Why Quantum Computing?

Before learning about quantum computing, a natural question arises:
“Why do we need quantum computers?” The points below help us
understand the need for quantum computing.
1.2.1. Limitations of Classical Computers
Classical computers, though they have become compact and fast,
cannot efficiently solve problems such as factoring large integers.
Large prime numbers are used to send messages in coded form, and
some NP-hard problems are also beyond the practical reach of
classical computers. Currently, a single transistor on a chip is
switched on or off using around a hundred electrons.
1.2.2. Moore’s Law

Since 1949, the power of digital computers has grown
exponentially. The trajectory of this growth is known as
"Moore's Law".
Intel co-founder Gordon Moore observed in 1965 that the number
of transistors on an integrated circuit seemed to double at
regular intervals.
In 1975, Moore predicted that computational power would
double every two years for the foreseeable future.
Moore's prediction has held largely accurate to the present day,
which is why it is now colloquially called a 'Law'.
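The doubling rule can be illustrated with a short calculation. The baseline figure below (roughly 2,300 transistors on the 1971 Intel 4004) is an illustrative assumption, not a number from this report.

```python
# Illustrative Moore's Law projection: transistor count doubling every
# two years from a chosen baseline (Intel 4004, ~2,300 transistors, 1971).
def projected_transistors(base_count, base_year, year, doubling_years=2):
    """Return the projected transistor count for `year`."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# 30 years = 15 doublings: 2,300 * 2**15
print(f"{projected_transistors(2300, 1971, 2001):,.0f}")  # 75,366,400
```

Real transistor counts only roughly track this curve, but the sketch shows why exponential growth so quickly outruns any fixed manufacturing process.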

1.2.3. Study of matter at atomic level

Within a few short years scientists developed a consistent


theory of the atom that explained its fundamental structure and
its interactions. Crucial to the development of the theory was
new evidence indicating that light and matter have both wave
and particle characteristics at the atomic and subatomic levels.
Theoreticians had objected to the fact that Bohr had used an ad
hoc hybrid of classical Newtonian dynamics for the orbits and
some quantum postulates to arrive at the energy levels of atomic
electrons. The new theory ignored the fact that electrons are
particles and treated them as waves. By 1926 physicists had
developed the laws of quantum mechanics, also called wave
mechanics, to explain atomic and subatomic phenomena.
1.3. Superposition

Quantum computers are based on quantum superposition.


Superposition allows quantum objects to simultaneously exist
in more than one state or location. This means that an object
can be in two states at one time while remaining a single object.
This allows us to explore much richer sets of states.

1.4. Entanglement

Quantum entanglement is a quantum mechanical phenomenon


in which the quantum states of two or more objects have to be
described with reference to each other, even though the
individual objects may be spatially separated. This leads to
correlations between observable physical properties of the
systems.
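These correlations can be illustrated with a small plain-Python sketch (illustrative only; no quantum library is assumed). In the Bell state (∣00⟩ + ∣11⟩)/√2, neither qubit has a definite value on its own, yet the two measurement outcomes always agree.

```python
import math

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the two-qubit
# basis states |00>, |01>, |10>, |11>.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = {format(i, "02b"): abs(a) ** 2 for i, a in enumerate(bell)}
print(probs)  # only "00" and "11" occur, each with probability ~0.5
```

The mixed outcomes "01" and "10" have probability zero: whichever value the first qubit shows, the second is guaranteed to match, which is exactly the correlation described above.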

1.5. Bits and Qubits


1.5.1. Bits

A bit is a binary digit, the smallest increment of data on a


computer. A bit can hold only one of two values: 0 or 1,
corresponding to the electrical values of off or on, respectively.
Because bits are so small, you rarely work with information
one bit at a time.
1.5.2. Qubits

A qubit (or quantum bit) is the quantum mechanical analogue


of a classical bit. In classical computing the information is
encoded in bits, where each bit can have the value zero or one.
In quantum computing the information is encoded in qubits. A
qubit is a two-level quantum system where the two basis qubit
states are usually written as ∣0⟩ and ∣1⟩. A qubit can be in
state ∣0⟩, ∣1⟩ or (unlike a classical bit) in a linear combination
of both states. The name of this phenomenon is superposition.
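As a minimal sketch in plain Python (an illustration standing in for a real quantum library, not something from this report), a qubit state a∣0⟩ + b∣1⟩ can be modelled as a pair of complex amplitudes whose squared magnitudes sum to 1.

```python
import math

# A qubit a|0> + b|1> as two complex amplitudes (a, b).
# Valid states satisfy |a|^2 + |b|^2 = 1; |a|^2 is the probability of
# measuring 0 and |b|^2 the probability of measuring 1.
def is_normalized(a, b, tol=1e-9):
    return abs(abs(a) ** 2 + abs(b) ** 2 - 1.0) < tol

zero = (1, 0)                                 # the basis state |0>
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))   # equal superposition
print(is_normalized(*zero), is_normalized(*plus))  # True True
```

The `plus` state is the superposition mentioned above: measuring it yields 0 or 1 with equal probability 1/2.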

1.5.3. Bits vs Qubits

 When we consider bits in traditional computing technology, they
refer only to the binary values 0 and 1 and cannot take any other
value. A qubit, in contrast, can represent 0, 1, or a superposition of
both. That means it can represent combinations of 0s and 1s, which
matters in quantum computing, where the system must account for
all these values at once.

 While a bit stores a single binary digit, qubits store combinations
of binary digits, which lets quantum computers work dramatically
faster than conventional computers on certain problems. The amount
of information stored and transferred is huge, which helps move
information faster.

 When a problem is solved on a classical computer, bits approach it
as a trial-and-error run: one value is considered at a time, with no
parallel exploration of the solution space. When the same problem is
solved using quantum computing, superposition lets many candidate
values be explored at once, so the problem can be tackled at a faster
pace.
 When more qubits are added to a quantum computer, its processing
power increases at an exponential rate. In contrast, adding bits to a
normal computer increases its capacity only linearly, and operations
are still done one at a time. In quantum computing, this scaling
comes from superposition.
 It is extremely difficult to build quantum computers because they
need extreme isolation and must keep their quantum objects at the
proper (very low) temperature. This is not the case with traditional
computers, which anyone with hardware knowledge can build and
run under ordinary conditions. Hence, quantum computers are still
very few, and their use has only recently begun to grow.
 The storage space required by traditional computers for bits is
large. This can be avoided with qubits, since huge amounts of
information can be stored in a small area. As systems and devices
keep shrinking, qubits could help reimagine the technological world
with really small, portable devices.
 Qubits let us view the scientific world in a different light: they
help model and recalculate physical phenomena, however large, in a
much shorter time than normal computers, making the process easier
for everyone who benefits from it.
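The exponential-scaling point in the list above can be made concrete. Describing n qubits classically requires tracking 2^n complex amplitudes, whereas n classical bits hold only one of 2^n values at a time. The numbers below are illustrative:

```python
# Classical simulation cost of n qubits: 2**n complex amplitudes.
def amplitudes_for(n_qubits):
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n:>2} qubits -> {amplitudes_for(n):,} amplitudes")
# 50 qubits already require ~1.1e15 amplitudes, far beyond the
# memory of an ordinary computer.
```

This is why adding one qubit doubles the state space, while adding one classical bit merely adds one more position to a register.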
Chapter 2
HISTORY, EVOLUTION & TECHNICAL DETAILS

2.1. History and Evolution


Einstein’s quantum theory of light, developed in 1905, laid the
conceptual framework for the field of quantum mechanics.
Subsequent advances by Bohr, Heisenberg, de Broglie, and
Schrodinger further developed quantum theory over the next
several decades. It was only in 1981, however, that quantum
mechanics departed from the realm of the theoretical. Richard
Feynman declared in a 1981 MIT lecture: "Nature isn't
classical, dammit, and if you want to make a simulation of
nature, you'd better make it quantum mechanical, and by golly,
it's a wonderful problem because it doesn't look so easy.” IBM
set to work tackling this “wonderful problem”, releasing the
first quantum cryptography protocol in 1984. BB84 was a
provably secure quantum key distribution scheme--an exciting
theoretical advancement, but not yet possible to implement. The
next year, David Deutsch at Oxford University proposed an idea
for a universal quantum computer, and his Oxford colleague
Artur Ekert developed entanglement-based secure
communication in 1991.

In 1994, Peter Shor at AT&T developed the algorithm that


bears his name, capable of large integer factorization on a
quantum computer. Shor’s algorithm, if implemented, could
crack most of today’s public key encryption schemes; it
inspired global interest in quantum computers. Lov Grover’s
1996 quantum database search algorithm (though less powerful
than Shor’s algorithm) also threatened classical cryptography
by accelerating brute-force computation via a quadratic
speedup. The United States Department of Defense began
investing in quantum research.
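The quadratic speedup can be sketched numerically (the estimates below are standard order-of-magnitude figures, not taken from this report): unstructured search over N items needs about N/2 classical trials on average, versus on the order of √N Grover iterations.

```python
import math

# Rough query counts for unstructured search over N items.
def classical_trials(n):
    """Expected number of brute-force guesses (about half the space)."""
    return n // 2

def grover_iterations(n):
    """Approximate Grover oracle calls: ~ (pi/4) * sqrt(N)."""
    return math.floor(math.pi / 4 * math.sqrt(n))

n = 2 ** 20  # ~1 million candidates, e.g. a 20-bit key
print(classical_trials(n), grover_iterations(n))  # 524288 vs 804
```

The gap widens with N, which is why Grover's algorithm effectively halves the security of symmetric keys against a quantum attacker rather than breaking them outright.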

In 2000, the Los Alamos National Laboratory developed the


first working 7-qubit quantum computer. The next year, IBM
and Stanford University succeeded in executing Shor’s
algorithm to factor the number 15 using 7 qubits and identical
molecules. In 2004, China’s University of Science and
Technology demonstrated five-photon entanglement and
Oxford developed the first working pure state NMR quantum
computer.

In 2011, D-Wave Systems claimed to have developed the first


commercially available quantum computer, but this claim
remains under dispute. Many argue that while the D-Wave One
can perform quantum calculations, these calculations can be
executed on a classical computer at the same speed. Meanwhile,
the established tech companies were making progress.

In 2016, IBM released the Quantum Experience, a public
online interface for quantum simulation, and Google simulated a
hydrogen molecule using 9 qubits. In 2018, Google released a
72-qubit chip called “Bristlecone” and Intel released a 49-qubit
chip called “Tangle Lake”. In 2019, IBM released its first
commercial quantum computer, the Q System One.
2.2. Future of Quantum Computing

The practical uses of quantum computers are still being


researched and tested. In the future, it is possible that quantum
computers will be able to solve problems that have been
impossible to solve before. For example, they have the potential
to be used for modelling molecules or predicting how a
molecule will behave under different conditions.
We should also remember that a quantum computer is not simply a
faster version of a regular computer; it is more powerful in a
different way. Many programs would take just as long to "run" on a
quantum computer as on a regular one, but for certain problems its
extra power delivers far better results.
Quantum computers will allow for the storage and processing of
data in ways that we cannot even comprehend today. They also
offer more complex calculations than traditional computers and
therefore can easily solve problems that would take years to
solve on a traditional computer.
Some experts believe that they could be used to calculate
complex formulas in almost no time, which would make them an
invaluable tool in medical science, AI technologies,
aeronautical engineering and so on. So far, quantum computing
has been used to solve optimization problems, which are too
complex for traditional computer models. It's also been used to
study protein folding and drug interactions within the body.
Quantum computers are powerful computers that work on the
principles of quantum mechanics. They use qubits, not bits to
represent data and they can access potentially more than two
values at the same time. Quantum computers will be able to
break all of the encoding and encryption we have today.
Quantum computing is changing the world of cybersecurity.
Quantum computers are capable of running sophisticated
simulations in parallel, making them much faster than classical
computers. The ability to run simulations in parallel means that
quantum computers can quickly find solutions to difficult
problems. Quantum computers will disrupt many industries like
finance, healthcare, and education.
While it is still unclear how big an impact quantum computing
will have on marketing in the future, there are already some
significant uses happening now. One example is ad targeting,
where companies can analyse customer behaviour with
astounding precision by processing large amounts of data.
Chapter 3

Similar Technologies
3.1. The Memcomputer

For some computer scientists, the solution lies in


building quantum computers—devices which take advantage of
the inexplicable weirdness of atomic-level physics. The only
downside? Quantum computers require cool, carefully tended
environments that are beyond our current technological
capabilities. But Massimiliano Di Ventra, a physicist and
computer scientist at the University of California, San Diego,
has an entirely different solution. He and a team of his
colleagues have just designed and built the first brain-like
computer prototype that bypasses certain structural limits of our
modern electronics. Called the memcomputer, it is the first
computer to store and process information simultaneously.
According to Di Ventra, despite his new technology's futuristic
promise, "memcomputers can be built with standard technology
and operate at room temperature. This puts them on a
completely different level of simplicity and cost in
manufacturing compared to quantum computers."

The fault in our computers


In short, a big problem with modern computers is that they store
data and solve problems with it in two entirely different areas:
the memory, and the central processing unit (CPU). And all that
shuffling back and forth takes its toll, says Di Ventra. "To
make a quick comparison: our own brain expends about 20
watts to perform 10^16 operations per second," he says, while a
supercomputer would require 10 million times more power to
do the same number of operations. "A big chunk of that power
is wasted in the back and forth transfer of information between
the CPU and the memory," says Di Ventra.

Di Ventra's memcomputer sprang out of an easy-to-understand
thought experiment from the 1970s. What if, like our brains, a
computer stored data in the exact same place it crunched the
numbers? And better yet, what if the actual process of
crunching data was used as memory?
This type of memory-crunching computer
(hence: memcomputer) would sidestep the costly data shuffle.
Furthermore, mathematicians have actually proven that this 2-
for-1 process would also allow memcomputers to solve certain
fantastically complex problems in a single step.
To build his memcomputer, Di Ventra and his colleagues had to
physically rebuild and reprogram a computer from its most
basic components. Rather than classical silicon transistors (the
building blocks that combine to build all electronics), at the
core of Di Ventra's machine are what he calls memprocessors.
Di Ventra's simple computer uses 6 of them.

Here's how they work. A classical transistor's job basically boils
down to one thing: either letting energy through or not,
depending on what it's been told to do. A memprocessor does
this exact same job, but it also physically changes some of its
properties ("such as its [electrical] resistance," says Di Ventra)
depending on how much energy is trying to move through. Even
when the memprocessor loses power, it stores that change. In
this way, while functioning fully as a classical, data-crunching
CPU, memprocessors can also be coded to store resistance-
laden information at the same time. No more back and forth.
Chapter 4
APPLICATIONS, PROS & CONS

4.1. Applications of Quantum Computers
* Artificial Intelligence & Machine Learning
Artificial intelligence and machine learning are among the most
prominent areas right now, as these emerging technologies have
penetrated almost every aspect of our lives. Some widespread
applications we see every day are voice, image and handwriting
recognition. However, as the number of applications grows, it
becomes challenging for traditional computers to match the required
accuracy and speed. That is where quantum computing can help: it
can process complex problems in far less time than the thousands of
years traditional computers would need.

* Computational Chemistry
IBM once said that one of the most promising quantum computing
applications will be in the field of computational chemistry. It is
believed that the number of quantum states, even in the tiniest of
molecules, is extremely vast, and therefore difficult for
conventional computer memory to process. The ability of
quantum computers to focus on the existence of both 1 and 0
simultaneously could provide immense power to the machine to
successfully map the molecules which, in turn, potentially
opens opportunities for pharmaceutical research. Some of the
critical problems that could be solved via quantum computing
are — improving the nitrogen-fixation process for creating
ammonia-based fertilizer; creating a room-temperature
superconductor; removing carbon dioxide for a better climate;
and creating solid-state batteries.

* Drug Design & Development


Designing and developing a drug is one of the most challenging
problems that quantum computing could help solve. Usually, drugs
are developed via trial and error, which is not only very expensive
but also a risky and challenging task to complete. Researchers
believe quantum computing can be an effective way of understanding
drugs and their reactions in humans, which, in turn, can save a ton
of money and time for drug companies. These advancements in
computing could enhance efficiency dramatically by allowing
companies to carry out more drug discoveries and uncover new
medical treatments, to the benefit of the pharmaceutical industry.

* Cybersecurity & Cryptography


The online security space is currently quite vulnerable owing to
the increasing number of cyber-attacks occurring across the globe
on a daily basis. Although companies are establishing the necessary
security frameworks in their organisations, the process becomes
daunting and impractical for classical digital computers, and
cybersecurity has therefore continued to be an essential concern
around the world. With our increasing dependency on digitisation,
we are becoming even more vulnerable to these threats. Quantum
computing, with the help of machine learning, can help develop
various techniques to combat these cybersecurity threats.
Additionally, quantum computing can help create new encryption
methods, a field known as quantum cryptography.
* Financial Modelling
For the finance industry, finding the right mix of investments based
on expected returns, associated risk, and other factors is essential
for surviving in the market. To achieve that, ‘Monte Carlo’
simulations are continually run on conventional computers, which
consume an enormous amount of computer time. By applying
quantum technology to perform these massive and complex
calculations, companies can not only improve the quality of the
solutions but also reduce the time needed to develop them. Because
financial leaders are in the business of handling billions of dollars,
even a tiny improvement in the expected return can be worth a lot to
them. Algorithmic trading is another potential application, where a
machine uses complex algorithms to automatically trigger share
dealings by analysing market variables, an advantage especially for
high-volume transactions.

* Logistics Optimisation
Improved data analysis and robust modelling will enable a wide
range of industries to optimise the logistics and scheduling
workflows associated with their supply-chain management. The
operating models need to continuously calculate and recalculate
optimal routes for traffic management, fleet operations, air traffic
control, and freight and distribution, and delays can have a severe
impact on these applications. Conventional computing is usually
used for such tasks, but some of them can become too complex for a
classical computing solution, whereas a quantum approach may be
able to handle them. Two common quantum approaches that can be
used to solve such problems are quantum annealing and universal
quantum computers. Quantum annealing is an advanced optimisation
technique that is expected to surpass traditional computers.
Universal quantum computers, in contrast, are capable of addressing
all types of computational problems but are not yet commercially
available.

* Weather Forecasting
Currently, analysing weather conditions with traditional computers
can sometimes take longer than the weather itself takes to change.
But a quantum computer's ability to crunch vast amounts of data in
a short period could lead to enhanced weather-system modelling,
allowing scientists to predict changing weather patterns quickly
and with excellent accuracy, which can be essential at a time when
the world is undergoing climate change.

Weather forecasting involves several variables, such as air
pressure, temperature and air density, which make accurate
prediction difficult. The application of
quantum machine learning can help in improving pattern
recognition, which, in turn, will make it easier for scientists to
predict extreme weather events and potentially save thousands
of lives a year. With quantum computers, meteorologists will
also be able to generate and analyse more detailed climate
models, which will provide greater insight into climate change
and ways to mitigate it.
4.2. Limitations of Quantum Computing

Hardware limitations
The most frequent challenge that troubles researchers is isolation.
Quantum decoherence can be caused by heat and light; when
subjected to such conditions, qubits can lose quantum properties
like entanglement, which in turn leads to the loss of data stored in
those qubits. Secondly, the rotations in quantum computers' logic
gates, which are crucial for changing the state of a qubit, are
prone to error; any wrong rotation can cause an error in the
output. Computers with greater circuit depth and error correction
(with redundancy for every qubit) are also crucial for the field of
quantum machine learning.

Software limitations
The developer of algorithms for quantum computers has to be
concerned with the underlying physics. While a classical
algorithm can be developed along the lines of the Turing
machine, to develop an algorithm for quantum computers the
developer has to base it on the raw physics, with no simple
formulas linking it to logic.

The critical issue in such a design is always scalability:
designing a program that can operate on larger data with more
processing power. Very little information is available for
developing such algorithms for quantum computing, so most of
the development is intuitive. Most known quantum algorithms
suffer from a proviso of specific assumptions that limit their
practical applicability, and it becomes difficult to develop
models that can have a significant impact on machine learning.
A third limitation of quantum computing is that the number of
qubits one can have on a quantum circuit is limited. Though
these limitations apply to quantum computing in general, the
augmentation of fields such as machine learning can attract
more attention and push the field in the right direction.
Chapter 5
Summary of Study

Quantum computing is a technology that uses the laws of
quantum mechanics to solve problems that are too complex
for classical computers.
In classical computers we use bits to store information;
each bit can take the value 0 or 1. A qubit, in contrast,
can represent 0, 1, or a superposition of both values, so
it can represent combinations of 0s and 1s in quantum
computing.
While a bit stores a single binary digit, qubits store
combinations of binary digits, which helps quantum
computers work far faster than a conventional computer
system on certain problems. The amount of information
stored and transferred is huge, which helps move the
information faster.
When a problem is to be solved on a classical computer,
bits approach the problem as a trial-and-error run: one
value is considered at a time, and there is no parallel
processing while the problem is being solved. When the
same problem is solved using quantum computing, it is
approached with parallelism, with many candidate values
explored at once, and solved at a faster pace.
For example, if we want to make a drug by mixing some
elements using a hit-and-trial method (a brute-force
algorithm), our classical computers will take too much
time to go through each and every case, but it will be
much easier for quantum computers.

Quantum computers can be used in many areas, such as:


 Artificial Intelligence & Machine Learning

 Computational Chemistry

 Logistics Optimisation

 Financial Modelling

 Cybersecurity & Cryptography

 Drug Design & Development

 Weather Forecasting, and many more

Quantum computers have the potential to revolutionize


computation by making certain types of classically
intractable problems solvable. While no quantum
computer is yet sophisticated enough to carry out
calculations that a classical computer can't, great
progress is under way.
Right now we have some limitations of quantum computers,
due to which we are not yet able to build a complete
quantum computer. For example, a quantum computer can be
affected by temperature, so quantum computers have to be
operated at temperatures near 0 K (−273 °C), which is very
hard to attain. Due to these hardware limits and some
software limits, quantum computers are still in the
research phase. But in the future they will bring a
revolution to the world of technology.
References
[1] Vladimir Silva, Practical Quantum Computing for Developers, 2018
Edition.

[2] Learn Quantum Computation using Qiskit,
https://qiskit.org/textbook/preface.html, visited on 15 April 2022.

[3] Surya Teja Marella and Hemanth Sai Kumar Parisa, Introduction to
Quantum Computing, published on 29 October 2020, DOI:
10.5772/intechopen.94103, https://www.intechopen.com/chapters/73811,
visited on 15 April 2022.

[4] Priya Pedamkar, Qubits vs Bits, https://www.educba.com/qubits-vs-bits/,
visited on 15 April 2022.
