A Presentation On: The Perceptron
By: Edutechlearners
www.edutechlearners.com
The perceptron, first proposed by Rosenblatt (1958), is a simple
neuron that is used to classify its input into one of two categories.
A perceptron is a single processing unit of a neural network. A
perceptron uses a step function that returns +1 if the weighted sum of its
inputs is greater than 0 and -1 otherwise.
[Figure: a perceptron with inputs x1 … xn, weights w1 … wn, bias b, net input v, and output y = f(v)]
While in actual neurons the dendrite receives electrical signals from the
axons of other neurons, in the perceptron these electrical signals are
represented as numerical values. At the synapses between the dendrite
and axons, electrical signals are modulated in various amounts. This is
also modeled in the perceptron by multiplying each input value by a
value called the weight.
An actual neuron fires an output signal only when the total strength of
the input signals exceeds a certain threshold. We model this
phenomenon in a perceptron by calculating the weighted sum of the
inputs to represent the total strength of the input signals, and applying a
step function to the sum to determine its output. As in biological neural
networks, this output is fed to other perceptrons.
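The weighted-sum-plus-step computation described above can be sketched as follows (a minimal illustration; the input values, weights, and bias below are made-up numbers, not values from the slides):

```python
def step(v):
    # Step activation: +1 if the net input is greater than 0, -1 otherwise
    return 1 if v > 0 else -1

def perceptron_output(inputs, weights, bias):
    # Net input v = b + sum of x_i * w_i (weighted sum of the inputs)
    v = bias + sum(x * w for x, w in zip(inputs, weights))
    return step(v)

# Hypothetical example: v = 0.2 + 1.0*0.5 + (-2.0)*0.3 = 0.1 > 0, so output is +1
print(perceptron_output([1.0, -2.0], [0.5, 0.3], 0.2))
```

In a network, this output value would itself become one of the inputs fed to other perceptrons.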
Perceptron can be defined as a single artificial neuron that
computes its weighted input with the help of the threshold activation
function or step function.
It is also called a TLU (Threshold Logic Unit).
[Figure: a TLU with inputs x1 … xn, weights w0 … wn, and net input Σ(i=0..n) wi·xi]

        { +1 if Σ(i=0..n) wi·xi > 0
f(x) =  {
        { -1 otherwise
Supervised learning is used when we have a set of training data. This
training data consists of some input data that is paired with the
correct output values. The output values are often referred to as target
values. This training data is used by learning algorithms like
backpropagation or genetic algorithms.
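Such a training set can be represented as a list of input/target pairs, for example (the data below is a made-up illustration using bipolar targets for the AND function, which appears later in the slides):

```python
# Hypothetical training data: each input vector is paired with a target value
training_data = [
    ([0.0, 0.0], -1),
    ([0.0, 1.0], -1),
    ([1.0, 0.0], -1),
    ([1.0, 1.0], +1),  # only (1, 1) belongs to the positive class
]

for inputs, target in training_data:
    print(inputs, "->", target)
```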
In machine learning, the perceptron is an algorithm
for the supervised classification of an input into one of two possible
outputs.
The Perceptron is used for binary classification.
The Perceptron can only model linearly separable classes.
To train a perceptron for a classification task:
- Find suitable weights such that the training examples are
correctly classified.
- Geometrically, try to find a hyperplane that separates the examples of
the two classes.
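One standard way to find such weights is the classic perceptron learning rule. The slides do not spell out the update procedure, so the rule and learning rate below are a conventional sketch, not the presenter's exact method:

```python
def step(v):
    # Step activation used by the perceptron
    return 1 if v > 0 else -1

def train_perceptron(data, n_inputs, lr=0.1, epochs=100):
    # w[0] is the bias weight, paired with a fixed input x0 = 1
    w = [0.0] * (n_inputs + 1)
    for _ in range(epochs):
        errors = 0
        for x, target in data:
            xs = [1.0] + list(x)                   # prepend the bias input
            y = step(sum(wi * xi for wi, xi in zip(w, xs)))
            if y != target:                        # misclassified: adjust weights
                for i in range(len(w)):
                    w[i] += lr * (target - y) * xs[i]
                errors += 1
        if errors == 0:                            # every example classified correctly
            break
    return w

# Bipolar AND data (linearly separable, so training converges)
data = [([-1, -1], -1), ([-1, 1], -1), ([1, -1], -1), ([1, 1], 1)]
w = train_perceptron(data, 2)
```

Each update nudges the separating hyperplane toward a misclassified example; for linearly separable data such as AND, the loop terminates with all examples on the correct side.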
Linear separability is the concept wherein the separation of the input
space into regions is based on whether the network response is positive
or negative.
When the two classes are not linearly separable, it may be desirable to
obtain a linear separator that minimizes the mean squared error.
Definition: Sets of points in 2-D space are linearly separable if the sets
can be separated by a straight line.
Generalizing, a set of points in n-dimensional space is linearly
separable if there is a hyperplane of (n-1) dimensions that separates the
sets.
Consider a network having a positive response in the first quadrant and
a negative response in all other quadrants (the AND function) with either
binary or bipolar data; the decision line is then drawn separating the
positive response region from the negative response region.
The net input to the output neuron is:
Yin = w0 + Σi xi·wi
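For bipolar AND data, one set of weights that realizes this decision line is w0 = -1, w1 = w2 = 1 (an assumed solution, not given on the slide); the net input Yin = w0 + Σi xi·wi is then positive only for the input (1, 1):

```python
def net_input(x, w0=-1.0, w1=1.0, w2=1.0):
    # Yin = w0 + x1*w1 + x2*w2; the weights are an assumed solution for AND
    return w0 + x[0] * w1 + x[1] * w2

# Only (1, 1), the first-quadrant point, yields a positive net input
for x in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    yin = net_input(x)
    print(x, yin, "+1" if yin > 0 else "-1")
```

With these weights the decision line is x1 + x2 - 1 = 0, which separates (1, 1) from the other three bipolar inputs.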
[Figure: a multilayer network with an input layer x1 … xn, hidden layers, and an output layer]
The input layer:
• Introduces input values into the network.
• No activation function or other processing.