
Maths: PCA


PRACTICE PROBLEMS BASED ON PRINCIPAL COMPONENT ANALYSIS-

Problem-01:

Given data = { 2, 3, 4, 5, 6, 7 ; 1, 5, 3, 6, 7, 8 }.

Compute the principal component using PCA Algorithm.

OR

Consider the two dimensional patterns (2, 1), (3, 5), (4, 3), (5, 6), (6, 7), (7, 8).

Compute the principal component using PCA Algorithm.

OR

Compute the principal component of the following data-

CLASS 1
X = 2, 3, 4
Y = 1, 5, 3

CLASS 2
X = 5, 6, 7
Y = 6, 7, 8

Solution-

We use the PCA Algorithm discussed above-

Step-01:

Get data.

The given feature vectors are-

x1 = (2, 1)
x2 = (3, 5)
x3 = (4, 3)
x4 = (5, 6)
x5 = (6, 7)
x6 = (7, 8)

Step-02:

Calculate the mean vector (µ).

Mean vector (µ)
= ((2 + 3 + 4 + 5 + 6 + 7) / 6, (1 + 5 + 3 + 6 + 7 + 8) / 6)
= (4.5, 5)

Thus, Mean vector (µ) = (4.5, 5).
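A minimal sketch for checking Steps 01 and 02 numerically, assuming Python with NumPy (not part of the original notes; the variable names are illustrative only):

```python
import numpy as np

# Step-01: the six two-dimensional feature vectors, one per row.
X = np.array([[2, 1], [3, 5], [4, 3], [5, 6], [6, 7], [7, 8]], dtype=float)

# Step-02: mean vector (µ), taken feature-wise over the six samples.
mu = X.mean(axis=0)
print(mu)  # expected: [4.5 5. ]
```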

Step-03:

Subtract mean vector (µ) from the given feature vectors.

x1 – µ = (2 – 4.5, 1 – 5) = (-2.5, -4)
x2 – µ = (3 – 4.5, 5 – 5) = (-1.5, 0)
x3 – µ = (4 – 4.5, 3 – 5) = (-0.5, -2)
x4 – µ = (5 – 4.5, 6 – 5) = (0.5, 1)
x5 – µ = (6 – 4.5, 7 – 5) = (1.5, 2)
x6 – µ = (7 – 4.5, 8 – 5) = (2.5, 3)

These are the feature vectors (xi) after subtracting the mean vector (µ).

Step-04:

Calculate the covariance matrix.

The covariance matrix is given by-

Covariance matrix = (m1 + m2 + m3 + m4 + m5 + m6) / 6

where each mi = (xi – µ)(xi – µ)ᵀ.

Now, on adding the above matrices and dividing by 6, we get-

Covariance matrix
= | 2.92  3.67 |
  | 3.67  5.67 |
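Continuing the same NumPy sketch (again an assumption, not the notes' own code), Steps 03 and 04 can be checked as follows; note the division by n = 6 rather than n – 1, matching the formula above:

```python
import numpy as np

X = np.array([[2, 1], [3, 5], [4, 3], [5, 6], [6, 7], [7, 8]], dtype=float)
mu = X.mean(axis=0)

# Step-03: subtract the mean vector from every feature vector.
D = X - mu

# Step-04: covariance matrix as the average of the outer products (xi - µ)(xi - µ)^T.
cov = (D.T @ D) / len(X)   # divides by n = 6, as in the worked solution
print(cov.round(2))        # approximately [[2.92 3.67] [3.67 5.67]]
```

The same matrix can also be obtained with np.cov(X.T, bias=True).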
Step-05:

Calculate the eigen values and eigen vectors of the covariance matrix.

λ is an eigen value for a matrix M if it is a solution of the characteristic equation |M – λI| = 0.

So, we have-

|M – λI| = 0

From here,

(2.92 – λ)(5.67 – λ) – (3.67 x 3.67) = 0

16.56 – 2.92λ – 5.67λ + λ² – 13.47 = 0

λ² – 8.59λ + 3.09 = 0

Solving this quadratic equation, we get λ = 8.22, 0.38.

Thus, the two eigen values are λ1 = 8.22 and λ2 = 0.38.

Clearly, the second eigen value is very small compared to the first eigen value. So, the second eigen vector can be left out.

The eigen vector corresponding to the greatest eigen value is the principal component for the given data set. So, we find the eigen vector corresponding to the eigen value λ1.
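These eigen values (and the eigen vector found next) can be cross-checked numerically; a sketch assuming NumPy, using the rounded covariance matrix from Step-04:

```python
import numpy as np

cov = np.array([[2.92, 3.67],
                [3.67, 5.67]])

# Eigen decomposition of the symmetric covariance matrix.
# np.linalg.eigh returns eigen values in ascending order.
eigvals, eigvecs = np.linalg.eigh(cov)
print(eigvals.round(2))   # roughly [0.38 8.21]; the worked solution rounds to 0.38 and 8.22

# Principal component = eigen vector of the largest eigen value (last column).
pc = eigvecs[:, -1]
print(pc)                 # up to sign and scale, proportional to (0.69, 1), i.e. to (2.55, 3.67)
```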
We use the following equation to find the eigen vector-

MX = λX

where-
M = Covariance Matrix
X = Eigen vector
λ = Eigen value

Substituting the values in the above equation, we get-

2.92X1 + 3.67X2 = 8.22X1
3.67X1 + 5.67X2 = 8.22X2

On simplifying these equations, we get-

5.3X1 = 3.67X2 ………(1)
3.67X1 = 2.55X2 ………(2)

From (1) and (2), X1 = 0.69X2.

From (2), taking X2 = 3.67 gives X1 = 2.55, so the eigen vector is-

(X1, X2) = (2.55, 3.67)

Thus, the principal component for the given data set is the eigen vector (2.55, 3.67) corresponding to the eigen value λ1 = 8.22.

Lastly, we project the data points onto the new subspace as-

yi = Transpose of Eigen vector x (xi – µ), for i = 1, 2, ..., 6.
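This projection step can be illustrated with the same NumPy sketch (an assumption, not the notes' own code); the eigen vector is used unnormalised, exactly as in the formula above, though dividing it by its norm is a common alternative:

```python
import numpy as np

X = np.array([[2, 1], [3, 5], [4, 3], [5, 6], [6, 7], [7, 8]], dtype=float)
mu = X.mean(axis=0)

# Principal eigen vector from Step-05 (unnormalised, as in the worked solution).
w = np.array([2.55, 3.67])

# Project every mean-centered feature vector onto the one-dimensional subspace.
projections = (X - mu) @ w
print(projections.round(2))
```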


Problem-02:


Use PCA Algorithm to transform the pattern (2, 1) onto the eigen vector in the previous
question.

Solution-

The given feature vector is (2, 1).

The feature vector gets transformed to

= Transpose of Eigen vector x (Feature Vector – Mean Vector)
= (2.55, 3.67) x (2 – 4.5, 1 – 5)ᵀ
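A quick numerical check of this transform, under the same assumptions as the earlier sketches (NumPy, eigen vector taken as (2.55, 3.67)):

```python
import numpy as np

mu = np.array([4.5, 5.0])    # mean vector from Problem-01
w = np.array([2.55, 3.67])   # eigen vector for the largest eigen value
x = np.array([2.0, 1.0])     # the pattern to be transformed

# Transformed value = (eigen vector)^T (feature vector - mean vector)
y = w @ (x - mu)
print(round(y, 2))           # approximately -21.05
```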