Day 10
Course Requirements
• Laptop with a good internet connection
• Notes & pen
• No prior knowledge is required
Deep Learning Algorithms
01 Artificial Neural Network (ANN)
02 Recurrent Neural Network (RNN)
03 Convolutional Neural Network (CNN)
ANN
• Learns any non-linear function, so it is known as a Universal Function Approximator.
• Activation functions introduce non-linearity into the network, so it can identify complex relationships between the input and the output.
[Diagram: ANN with input, hidden, and output layers]
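As a quick illustration (not from the slides), here is a minimal Keras sketch of such an ANN; the input size and layer widths below are arbitrary choices:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Small fully connected ANN: the non-linear 'relu' activations in the hidden
# layers are what allow the network to approximate non-linear functions.
ann = Sequential([
    Dense(16, activation='relu', input_shape=(4,)),   # hidden layer 1
    Dense(8, activation='relu'),                      # hidden layer 2
    Dense(1, activation='sigmoid'),                   # output layer
])
ann.summary()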
• A CNN doesn't have recurrent connections like an RNN; instead it has convolution-type hidden layers.
• Pooling: picking the maximum value from the selected region is max pooling, and picking the minimum is min pooling.
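A small sketch of 2x2 max pooling (the input values are illustrative, not from the slides): each non-overlapping 2x2 region of the input is replaced by its maximum.

import numpy as np
import tensorflow as tf

# 4x4 single-channel image, reshaped to (batch, height, width, channels)
x = np.array([[1, 3, 2, 4],
              [5, 6, 7, 8],
              [3, 2, 1, 0],
              [1, 2, 3, 4]], dtype=np.float32).reshape(1, 4, 4, 1)

pooled = tf.keras.layers.MaxPooling2D(pool_size=(2, 2))(x)   # 2x2 max pooling
print(pooled.numpy().reshape(2, 2))
# [[6. 8.]
#  [3. 4.]]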
CNN Architecture
[Figure: Input 28x28x1 → Conv_1: Convolution (5x5) + ReLU → 24x24xn1 → Max-pooling (2x2) → 12x12xn1 → Conv_2: Convolution (5x5) + ReLU → 8x8xn2 → Max-pooling (2x2) → 4x4xn2 → Flattened → FC_3: Fully Connected (n3 units) → FC_4: Fully Connected → Output: digits 0-9]
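A Keras sketch of the architecture in the figure; the channel/unit counts n1, n2, n3 are hyperparameters, and the values chosen here are only examples:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

n1, n2, n3 = 32, 64, 128   # example values for the n1/n2 channels and n3 units

cnn = Sequential([
    Conv2D(n1, (5, 5), activation='relu', input_shape=(28, 28, 1)),  # Conv_1 -> 24x24xn1
    MaxPooling2D((2, 2)),                                            # -> 12x12xn1
    Conv2D(n2, (5, 5), activation='relu'),                           # Conv_2 -> 8x8xn2
    MaxPooling2D((2, 2)),                                            # -> 4x4xn2
    Flatten(),                                                       # -> 4*4*n2 values
    Dense(n3, activation='relu'),                                    # FC_3
    Dense(10, activation='softmax'),                                 # FC_4: digits 0-9
])
cnn.summary()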
Simple Softmax Classification
[Figure: each 28x28x1 input image is flattened into 784 pixels and fed to a softmax layer with 10 outputs (digits 0-9); 100 images are processed at a time]
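A minimal Keras sketch of this classifier (the training call is commented out because x_train/y_train are placeholders, not data from the slides):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense

# Flatten each 28x28x1 image to 784 pixels, then score the 10 digits with softmax.
softmax_clf = Sequential([
    Flatten(input_shape=(28, 28, 1)),    # 28*28 = 784 pixel values
    Dense(10, activation='softmax'),     # one probability per digit 0-9
])
softmax_clf.compile(loss='sparse_categorical_crossentropy',
                    optimizer='adam', metrics=['accuracy'])
# softmax_clf.fit(x_train, y_train, batch_size=100, epochs=5)   # 100 images at a time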
Softmax Function in TensorFlow
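For example, TensorFlow's built-in tf.nn.softmax turns arbitrary scores (logits) into probabilities that sum to 1; the numbers below are just sample values:

import tensorflow as tf

logits = tf.constant([2.0, 1.0, 0.1])
probs = tf.nn.softmax(logits)           # exp(logit) / sum of exp(logits)
print(probs.numpy())                    # approximately [0.659, 0.242, 0.099]
print(probs.numpy().sum())              # 1.0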
Back Propagation
• Calculates the error between the predicted output and the target output, then uses gradient descent to update the weights.
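A minimal sketch of one such step, using TensorFlow's GradientTape on a single weight (the numbers are arbitrary):

import tensorflow as tf

w = tf.Variable(0.5)                    # single trainable weight
x, target = 2.0, 3.0
learning_rate = 0.1

with tf.GradientTape() as tape:
    predicted = w * x                   # forward pass
    loss = (predicted - target) ** 2    # squared error between prediction and target
grad = tape.gradient(loss, w)           # backpropagated gradient dloss/dw
w.assign_sub(learning_rate * grad)      # gradient descent update
print(w.numpy())                        # about 1.3; the weight moved toward target/x = 1.5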
Gradient Descent
• An optimization algorithm used throughout machine learning.
• It operates iteratively to find the optimal values of its parameters, starting from a user-defined learning rate and user-defined initial parameter values.
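A plain-Python sketch of the idea on a one-parameter function, f(w) = (w - 3)^2, whose minimum is at w = 3; the learning rate and the starting value are the user-defined choices mentioned above:

learning_rate = 0.1
w = 0.0                                  # user-defined initial parameter value

for step in range(50):
    gradient = 2 * (w - 3)               # derivative of (w - 3)^2
    w = w - learning_rate * gradient     # update: w <- w - learning_rate * gradient
print(w)                                 # approaches the optimum 3.0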
Vanishing & Exploding Gradient
• A very common problem in deep neural networks, associated with backpropagation.
• When the number of hidden layers is high, the gradient vanishes or explodes as it propagates backward. This leads to instability in the network, which becomes unable to learn from training.
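An illustrative sketch (not from the slides) of why gradients vanish: with a sigmoid activation the local derivative is at most 0.25, and the chain rule multiplies one such factor per layer, so the product shrinks rapidly as depth grows.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

np.random.seed(0)
grad = 1.0
for layer in range(30):                           # 30 hidden layers
    x = np.random.randn()                         # pre-activation at this layer
    grad *= sigmoid(x) * (1.0 - sigmoid(x))       # sigmoid derivative, at most 0.25
print(grad)                                       # a tiny number: the gradient has vanished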
Adding Layers
model.add(Dense(8, activation='relu'))   # add a fully connected (Dense) hidden layer with 8 ReLU units
Compile Model
model.compile(loss='binary_crossentropy',
              optimizer='adam', metrics=['accuracy'])   # loss, optimizer, and reported metric
Batch vs Epoch
Training occurs over epochs, and each epoch is split into batches.
• Epoch - one pass through all of the rows in the training dataset.
• Batch - one or more samples considered by the model within an epoch before the weights are updated.
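A small end-to-end sketch tying the model.add / model.compile snippets above to training with epochs and batches; the random data and the 10-feature input size are placeholders, not from the slides:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

X = np.random.rand(200, 10)                  # 200 samples, 10 features (placeholder data)
y = np.random.randint(0, 2, size=(200,))     # binary labels

model = Sequential()
model.add(Dense(8, activation='relu', input_shape=(10,)))   # hidden layer, as in "Adding Layers"
model.add(Dense(1, activation='sigmoid'))                   # output layer for binary classification
model.compile(loss='binary_crossentropy',
              optimizer='adam', metrics=['accuracy'])

# epochs=10: ten full passes over all 200 rows.
# batch_size=20: weights are updated after every 20 samples, i.e. 10 updates per epoch.
model.fit(X, y, epochs=10, batch_size=20, verbose=0)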
NOVITECH COIMBATORE
novitechresearchanddevelopment