
CAMELLIA INSTITUTE OF TECHNOLOGY

DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING

TOPIC OF THE REPORT: STACK


NAME: DEBDEEP GHOSH
UNIVERSITY ROLL NO: 23000322039
REGISTRATION NO: 222300120213 (2022-23)
SEMESTER: 3rd
SUBJECT: DATA STRUCTURE AND ALGORITHM
SUBJECT CODE: PCC-CS301
EXAM: CONTINUOUS ASSESSMENT 2 (CA2)
STACK
Introduction:
A stack is a container of objects that are inserted and removed according to
the last-in, first-out (LIFO) principle. In a pushdown stack only two operations are allowed:
pushing an item onto the stack, and popping an item off the stack. A stack is a limited-access data
structure: elements can be added to and removed from the stack only at the top.
Push adds an item to the top of the stack; pop removes the item from the top. A helpful analogy
is a stack of books: you can remove only the top book, and you can add a new
book only on the top.
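
As a rough illustration of this behaviour, here is a minimal sketch using the C++ standard
library's std::stack; the book names are arbitrary examples, not part of the original text.

#include <iostream>
#include <stack>
#include <string>

int main() {
    std::stack<std::string> books;   // the "stack of books" analogy

    books.push("Book A");            // Book A ends up at the bottom
    books.push("Book B");
    books.push("Book C");            // Book C is now on top

    // Only the top book is accessible; popping removes the books in the
    // reverse order of insertion (last in, first out).
    while (!books.empty()) {
        std::cout << "Removing: " << books.top() << '\n';
        books.pop();
    }
    return 0;
}

Running this prints Book C, Book B and then Book A: the last book placed on the pile is the
first one that can be taken off.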

History:
The importance of the stack in the development of computer science has been explained by
Chomsky and Miller in terms of the push-down storage automaton (PDA). This is an automaton
with one read-only input tape moving in one direction only, and one storage tape movable in
both directions. Symbols are written to and read from the storage tape, which is used as a
push-down store, or stack, with only one symbol visible at a time. It has been said that "…the
theory of push-down automata is, in fact, essentially another version of the theory of context-free
grammar." The PDA was introduced by Newell et al. in 1959.

The languages defined (accepted) by PDAs are exactly the context-free languages, while finite
automata without a stack define the regular languages. If we let a finite automaton have two
stacks, it can simulate the tape moves of a Turing machine and thus has the same computational
power as a Turing machine. The languages accepted by TMs are called recursively enumerable;
if we restrict ourselves to TMs that always halt, the corresponding languages are called recursive,
which corresponds to our notion of a practical algorithm. The trouble here is the ambiguous use of
the word recursive. As we know, stacks are used to implement recursive functions, which we
informally would describe as functions defined in terms of themselves. But recursive is, as
demonstrated, also used as a synonym for decidable: in the context of Gödel's work there was a
need for a term for functions simple enough to be represented by recursive definitions (in the
original sense) that always terminate. For an illuminating discussion see Hopcroft and Ullman. As
a final contribution to the terminological difficulties we have the stack automaton (SA), which is
like a PDA except that it can read its input tape in both directions; furthermore, its push-down
store is no longer a stack but allows reading at any place in the interior of the tape. The SA is
more capable than the PDA, but not up to the level of a TM.

The ability of the PDA to recognize and parse context-free languages naturally made it extremely
useful in the theory and practice of parsing programming languages. Natural language in human
speakers and listeners also involves parsing, e.g. of embedded clauses. Victor H. Yngve, working
in computational linguistics in 1960, proposed that a speaker's short-term memory functions as a
stack with limited depth: successive left-branching parts are stored on a stack in turn, to be
removed only when they are matched as the speaker utters their corresponding later parts toward
the end of the sentence. However, Miller and Chomsky rejected Yngve's hypothesis by discussing
sentences with nested, self-embedded parts, as in "The salmon that the man that the dog chased
smoked tasted bad." This is grammatically correct English, and we may speak or understand such
a sentence with conscious effort, but we cannot do it easily. Our difficulty in parsing such
sentences shows that our personal stack has a very limited depth, much less than the classic 7±2
result for short-term memory, and indicates that the stack is not a suitable model for short-term
memory.

The source for the first occurrence of the computer-science stack in the OED entry is a 1960
paper by E. W. Dijkstra, a pioneer of compiler construction who worked on the first Algol 60
compiler.

The stack was invented independently by F. L. Bauer and K. Samelson in another context.
Working on the fundamentals of mathematical notation, they were interested in the structure of
algebraic expressions. The parenthesis-free Polish notation of Lukasiewicz was introduced in
1929, and interest arose in formulating rules for well-formed expressions. Bauer had a continuing
interest in mathematical logic since the 1930s. He wanted to set up a relay realization of a
propositional formula and designed a mechanism in which intermediate results are stored in the
reverse order of their later use. He called the storage principle "Keller" (cellar). In 1955 he put
the principle to use in the analysis of arithmetic formulas, improving the O(n²) method used by
Rutishauser to O(n). In Bauer's method, "…not only would intermediate results be pushed down,
but the operation symbols which are to be postponed would be pushed down as well." Infix
notation could in this way easily be translated to postfix notation. Bauer and his collaborator K.
Samelson applied for a patent on the procedure in 1957. The patent was granted much later, too
late to be relevant for the advent of handheld calculators allowing infix input. A hardware
implementation of a stack was included in 1955 in the plans for the PERM II computer, which,
however, was never built.

Charles L. Hamblin, an Australian philosopher, logician and computer scientist, developed
Lukasiewicz's notation into what is now called "reverse Polish notation", where the operands are
placed before the operators. It has the advantage that the operators appear in the order in which
they are to be executed, which makes the compiler's translation into machine instructions much
simpler. Hamblin in 1957 independently invented the stack (which he called "a running
accumulator") for this purpose. His ideas were taken up by the British computer manufacturer
English Electric and realized in the hardware stack of the KDF9 computer. A few years later the
Burroughs B5000 computer became admired for its inclusion of hardware stacks to facilitate the
execution of Algol 60 programs. Processors with hardware stacks have been around ever since,
as discussed by Eric LaForest. The VAX computer implemented the CISC instructions push and
pop for the handling of stacks. It seems that Bauer and Samelson's work with algebraic
expressions and a hardware stack to analyze them influenced the design of the programming
language in which they were involved. The cellar principle was available, hence there was no
problem in extending recursive definitions to a language: "Compiler design and programming
language design went hand in hand." Algol 60 became a prime example of this.

The LIFO principle also appears in accounting, in the situation where a company has an
inventory of identical articles which have been bought or manufactured at different costs. When
the value of the inventory is to be declared to the tax authorities, which values should be used?
The values of the items sold are taken in order from those bought last, i.e. the LIFO method is
used. With rising prices or inflation, which has been the most common situation, this means that
the items still left in the inventory have a lower value, which is advantageous from the point of
view of tax reduction. The method was accepted into U.S. tax law in 1939.

Working of Stack:
A stack works on the LIFO pattern. Suppose we have a stack with five memory blocks, so the
size of the stack is 5, and assume that the stack is initially empty. We push elements into it one
by one until the stack becomes full.
Once five elements have been pushed, the stack is full, since its size is 5. The stack is filled from
the bottom to the top: each new element is placed above the previous one, and the most recently
pushed element sits at the top.
When we perform a delete (pop) operation on the stack, there is only one end for entry and exit,
as the other end is closed. The stack follows the LIFO pattern, which means that the value
entered first will be removed last: the element pushed first can be removed only after all the
other elements have been deleted.
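
A small sketch of this behaviour, assuming a plain array with a top index (an illustrative
representation, not a prescribed one) and the example values 10 to 50:

#include <iostream>

int main() {
    const int SIZE = 5;
    int stack[SIZE];
    int top = -1;                        // -1 means the stack is empty

    // Push 10, 20, 30, 40, 50; the stack fills from the bottom to the top.
    for (int value = 10; value <= 50; value += 10) {
        if (top == SIZE - 1) {           // overflow check: stack already full
            std::cout << "Stack overflow\n";
            break;
        }
        stack[++top] = value;
        std::cout << "Pushed " << value << " (top index = " << top << ")\n";
    }

    // Pop everything: the value pushed last comes out first (LIFO).
    while (top != -1) {
        std::cout << "Popped " << stack[top--] << '\n';
    }
    return 0;
}

The last value pushed (50) is the first one popped, while the value pushed first (10) comes out
only after all the others have been removed.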

Standard Operations in Stack:

● push(): When we insert an element into the stack, the operation is known as a push. If
the stack is already full, an overflow condition occurs.

● pop(): When we delete an element from the stack, the operation is known as a pop. If the
stack is empty, i.e. no element exists in the stack, attempting a pop results in an
underflow state.

● isEmpty(): It determines whether the stack is empty or not.

● isFull(): It determines whether the stack is full or not.

● peek(): It returns the topmost element of the stack without removing it.

● count(): It returns the total number of elements available in a stack.

● change(): It changes the element at the given position.

● display(): It prints all the elements available in the stack.
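
The core operations listed above can be sketched as a fixed-size, array-based stack in C++. The
names Stack and MAX are illustrative choices, error handling is kept minimal, and change() is
omitted for brevity.

#include <iostream>

const int MAX = 5;                        // illustrative capacity

struct Stack {
    int items[MAX];
    int top = -1;

    bool isEmpty() const { return top == -1; }
    bool isFull()  const { return top == MAX - 1; }
    int  count()   const { return top + 1; }

    void push(int value) {
        if (isFull()) { std::cout << "Overflow: stack is full\n"; return; }
        items[++top] = value;
    }

    void pop() {
        if (isEmpty()) { std::cout << "Underflow: stack is empty\n"; return; }
        --top;
    }

    int peek() const {                    // topmost element; caller must ensure
        return items[top];                // the stack is not empty
    }

    void display() const {
        for (int i = top; i >= 0; --i) std::cout << items[i] << ' ';
        std::cout << '\n';
    }
};

int main() {
    Stack s;
    s.push(1); s.push(2); s.push(3);
    s.display();                                          // prints: 3 2 1
    std::cout << "Top element: " << s.peek() << '\n';     // 3
    s.pop();
    std::cout << "Elements left: " << s.count() << '\n';  // 2
    return 0;
}

Here the top index both marks the topmost element and gives the element count, so isEmpty(),
isFull() and count() reduce to simple comparisons on it.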


Applications of Stack:

● Balancing of Symbols

● String Reversal

● Undo/Redo

● Recursion

● Depth First Search

● Memory Management etc.
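
As a sketch of the first application, balancing of symbols, the following example uses a stack to
check whether the brackets in an expression are properly matched; the function name isBalanced
and the test expressions are illustrative choices.

#include <iostream>
#include <stack>
#include <string>

// Returns true if every (, [ and { in expr is matched by the right
// closing bracket in the right order.
bool isBalanced(const std::string& expr) {
    std::stack<char> s;
    for (char c : expr) {
        if (c == '(' || c == '[' || c == '{') {
            s.push(c);                       // remember the opening bracket
        } else if (c == ')' || c == ']' || c == '}') {
            if (s.empty()) return false;     // closing bracket with no opener
            char open = s.top();
            s.pop();
            if ((c == ')' && open != '(') ||
                (c == ']' && open != '[') ||
                (c == '}' && open != '{')) {
                return false;                // mismatched bracket types
            }
        }
    }
    return s.empty();                        // leftover openers mean imbalance
}

int main() {
    std::cout << isBalanced("{[(a+b)*c]}") << '\n';   // prints 1 (balanced)
    std::cout << isBalanced("(a+b]") << '\n';         // prints 0 (not balanced)
    return 0;
}

Opening brackets are pushed and each closing bracket must match the bracket on top of the
stack, which is exactly the LIFO discipline described earlier.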


Conclusion:

A stack is a data structure that stores data in a linear fashion. A stack follows the last-in,
first-out (LIFO) principle, i.e. the data inserted most recently into the stack is deleted first
from the stack. Several operations can be performed on a stack, such as push(), pop(), peek(),
isEmpty(), isFull(), and count().

References:
1. https://www.geeksforgeeks.org/stack-data-structure/
2. https://en.wikipedia.org/wiki/Stack_(abstract_data_type)
3. https://www.programiz.com/dsa/stack
