Hello.

In a previous lesson,

we looked at the eigenvalues and eigenvectors of matrices.

Now, as an application of that,

we will examine how a matrix is diagonalized.

A summary of the information we obtain from these eigenvalues is written here in bold letters.

For a general matrix with distinct eigenvalues,

and for symmetric matrices even when the eigenvalues repeat,

we can find the required eigenvectors and use them to bring the matrix into diagonal form.

So that is the summary of this lesson.

Of course, it is not quite that simple.

There are some finer points we need to see.

We can ask the following question: is every matrix diagonalizable?

We have an answer to that.

A symmetric matrix can always be diagonalized.

Matrices that are not symmetric can also be diagonalized,

provided that we can find an eigenvector for every eigenvalue.

The critical point here is finding a sufficient number of eigenvectors.

A Hermitian matrix, like a symmetric matrix, can always be diagonalized.
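As a small numerical check of this claim, here is a sketch using NumPy; the symmetric matrix below is our own choice for illustration, not one from the lecture.

```python
import numpy as np

# A symmetric matrix chosen for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigenvalue routine for symmetric/Hermitian matrices;
# it returns real eigenvalues and an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

# Diagonalization: Q inverse times A times Q gives the diagonal
# matrix of eigenvalues.
D = np.linalg.inv(Q) @ A @ Q
print(np.round(D, 10))           # diagonal matrix with the eigenvalues
print(np.round(eigenvalues, 10)) # here: 1 and 3
```

For symmetric matrices the columns of Q are orthonormal, so Q⁻¹ is just the transpose of Q; the general inverse is used above only to match the formula in the lecture.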

Now let us set the exceptional cases aside

and start working with a general real matrix of size n by n.

For this matrix we have the eigenvalue problems with eigenvalues λ1, λ2, up to λn,

and the corresponding eigenvector problems A e1 = λ1 e1, A e2 = λ2 e2, ..., A en = λn en.

Each of these equations is a statement about columns.

A square matrix times a column matrix gives a column matrix,

and on the right side a number, λ, multiplies a column matrix.

So both sides are column matrices.
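To make one such equation concrete, here is a sketch with a 2-by-2 matrix; the matrix and its eigenpair below are our own illustrative choice, not from the lecture.

```python
import numpy as np

# An illustrative matrix with eigenvalue 5 and eigenvector (1, 1).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0
e = np.array([1.0, 1.0])

# Left side: square matrix times a column, giving a column.
print(A @ e)
# Right side: a number times the same column, giving a column.
print(lam * e)
```

Both sides print the same column vector, which is exactly the statement A e = λ e.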

All of these equations have the same structure.

So we can collect them together.

Writing the eigenvectors e1, e2, ..., en as the columns of a single matrix,

we can view all n equations as one matrix product.

When we multiply A by e1, we find λ1 e1.

When we multiply A by e2, we find λ2 e2, and so on.

As you can see, each column of this matrix equation

is obtained by one matrix-vector multiplication with A.

Each column on the left is A times the corresponding eigenvector.

On the right side, the columns are λ1 e1,

λ2 e2, and so on, one in each column.

We can factor this right-hand matrix,

separating it into a part containing the λ's

and a part containing the e's.

Let us see how we can do that now.

If I write the e's here as columns,

and the λ's on the diagonal there, we really do get this product.

When we multiply the matrices, look at the first column.

Of course, the first column of the diagonal matrix contains only λ1.

λ1 multiplies the elements of e1, while the zeros multiply e2,

e3, and the rest, so those contribute nothing.

So the first column of the product is λ1 e1.

In the same way, the second column of the diagonal matrix

picks out only e2 here, scaled by λ2.

When we do this for every column, we see that this product really equals the left-hand side.

If you take the last column, you will get λn en.
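The column-picking argument above can be sketched numerically: multiplying a matrix on the right by a diagonal matrix scales each column by the corresponding diagonal entry. The numbers below are our own illustrative choices.

```python
import numpy as np

# The columns of Q play the role of e1 and e2.
Q = np.array([[1.0, 3.0],
              [2.0, 4.0]])

# A diagonal matrix with λ1 = 10 and λ2 = 100 (illustrative values).
Lam = np.diag([10.0, 100.0])

# Right-multiplying by the diagonal matrix scales column i by λi:
# the first column becomes 10·e1, the second becomes 100·e2.
print(Q @ Lam)
```

The zeros off the diagonal are what prevent the columns from mixing, exactly as in the lecture's term-by-term argument.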

Row times column: the zeros multiply the other components and vanish,

and only λn survives, multiplying every component of en.

The first component of that vector,

the second component, down to the n-th component,

are all positioned in one column,

and all of these numbers are multiplied by the same λn.

So the vector en multiplied by λn is what appears here,

written in the last column.

Now let us give these objects names.

Call the matrix whose columns are the e's the matrix Q;

then what you saw on the previous page is A times Q on the left.

The right side we have factored

as Q times Λ.

So A times Q equals Q times Λ.

Λ is a diagonal matrix, and on its diagonal

are the eigenvalues of the matrix A that we found.
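The relation A Q = Q Λ can be verified for a concrete case; the 2-by-2 matrix and its eigenpairs below are our own illustrative example, not taken from the lecture.

```python
import numpy as np

# Illustrative matrix with eigenpairs:
#   λ1 = 5 with e1 = (1, 1), and λ2 = 2 with e2 = (1, -2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Q has the eigenvectors as its columns.
Q = np.array([[1.0,  1.0],
              [1.0, -2.0]])

# Λ has the eigenvalues on its diagonal, in the same order.
Lam = np.diag([5.0, 2.0])

# A Q and Q Λ agree column by column.
print(np.allclose(A @ Q, Q @ Lam))   # True
```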

Now, if we multiply from the left by the inverse of Q,

the inverse is guaranteed to exist, because there are n independent eigenvectors.

Since the columns of the matrix are independent, the matrix is invertible.

We are not considering the exceptional cases here;

all the eigenvalues are distinct.

Even when an eigenvalue repeats, independent eigenvectors can sometimes still be found.

We will see this in the concrete examples.
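As a foretaste of those examples, here is a minimal sketch (our own illustration) of a repeated eigenvalue that nevertheless admits independent eigenvectors.

```python
import numpy as np

# The eigenvalue 3 is repeated twice.
A = np.diag([3.0, 3.0])

# Yet two independent eigenvectors exist; here every nonzero vector
# is an eigenvector, so these independent columns both qualify.
Q = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A Q = Q Λ still holds with Λ = diag(3, 3).
print(np.allclose(A @ Q, Q @ np.diag([3.0, 3.0])))   # True
```

The problematic case is a repeated eigenvalue with too few independent eigenvectors; that is the exception set aside above.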

As you can see, when we multiply by Q⁻¹ on the left,

Q⁻¹ A Q appears on the left side,

while on the right side Q⁻¹ Q combines into the identity matrix.

The identity matrix times the diagonal matrix Λ

leaves the diagonal matrix itself.

So, multiplied from the left and from the right in this way,

we obtain a diagonal matrix.

This is called the diagonalization of the matrix, and we can state it as a theorem.

Suppose that an n-by-n real matrix A

has n linearly independent eigenvectors.

Then the product Q⁻¹ A Q

converts A into a diagonal matrix.

Here Q is the matrix whose columns are the eigenvectors of A,

and the diagonal matrix Λ carries the eigenvalues of A on its diagonal.

The order in which we put the eigenvectors into Q matters:

the eigenvector placed in the first column produces the first eigenvalue

in the first diagonal position, the second column produces the second diagonal entry,

and so on, until the diagonal structure is reached.
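The theorem, including the point about ordering, can be sketched with the same kind of illustrative 2-by-2 example used above (our own choice of matrix, not from the lecture).

```python
import numpy as np

# Illustrative matrix with eigenpairs:
#   λ1 = 5 with e1 = (1, 1), and λ2 = 2 with e2 = (1, -2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of Q: e1 first, e2 second.
Q = np.array([[1.0,  1.0],
              [1.0, -2.0]])
print(np.round(np.linalg.inv(Q) @ A @ Q, 10))   # diag(5, 2)

# Swapping the columns of Q swaps the diagonal entries in the same way.
Q_swapped = Q[:, ::-1]
print(np.round(np.linalg.inv(Q_swapped) @ A @ Q_swapped, 10))   # diag(2, 5)
```

The eigenvalues always appear on the diagonal in the order in which their eigenvectors were placed as columns.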

Let us summarize this visually once more,

so we can understand it better.

The matrix A is in the middle.

We put the matrix Q to the right of A.

Its columns are the eigenvectors of A:

e1, e2, up to en, and they are guaranteed to be independent.

Otherwise the inverse Q⁻¹ would not exist.

When we take this product, we obtain a diagonal matrix.

It has the eigenvalues on its diagonal:

in the same order as e1, e2, ..., en,

the values λ1, λ2, ..., λn appear.

We will see in the examples that if we actually carry out this product, we do get this result,

but the theorem guarantees that this diagonal structure will come out

without us even needing to do the multiplication.

Now we pass to the examples, but before the examples I need a break.

That will give you a little more opportunity to think about what this theorem says.