These video lectures of Professor Gilbert Strang teaching 18.06 were recorded in Fall 1999 and do not correspond precisely to the current edition of the textbook. However, this book is still the best reference for more information on the topics covered in each lecture.
Strang, Gilbert. Introduction to Linear Algebra. 5th ed. Wellesley-Cambridge Press, 2016. ISBN: 9780980232776.
Instructor/speaker: Prof. Gilbert Strang
Lecture 21: Eigenvalues and Eigenvectors
OK. So this is the first lecture on eigenvalues and eigenvectors, and that's a big subject that will take up most of the rest of the course.
It's, again, matrices are square and we're looking now for some special numbers, the eigenvalues, and some special vectors, the eigenvectors.
And so this lecture is mostly about what are these numbers, and then the other lectures are about how do we use them, why do we want them.
OK, so what's an eigenvector?
Maybe I'll start with eigenvector.
What's an eigenvector?
So I have a matrix A.
OK. What does a matrix do?
It acts on vectors.
It multiplies vectors x.
So the way that matrix acts is in goes a vector x and out comes a vector Ax.
It's like a function.
With a function in calculus, in goes a number x, out comes f(x). Here in linear algebra we're up in more dimensions.
In goes a vector x, out comes a vector Ax.
And the vectors I'm specially interested in are the ones that come out in the same direction that they went in.
That won't be typical.
Most vectors, Ax is in -- points in some different direction.
But there are certain vectors where Ax comes out parallel to x.
And those are the eigenvectors.
So Ax parallel to x.
Those are the eigenvectors.
And what do I mean by parallel?
Oh, much easier to just state it in an equation.
Ax is some multiple -- and everybody calls that multiple lambda -- of x.
That's our big equation.
We look for special vectors -- and remember most vectors won't be eigenvectors -- for which Ax is in the same direction as x. And by same direction I allow it to be the very opposite direction; I allow lambda to be negative or zero.
Well, I guess we've met the eigenvectors that have eigenvalue zero.
Those are in the same direction, but they're -- in a kind of very special way.
So this -- the eigenvector x.
Lambda, whatever this multiplying factor is, whether it's six or minus six or zero or even some imaginary number, that's the eigenvalue.
So there's the eigenvalue, there's the eigenvector.
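As a quick aside, here is a minimal numpy sketch of that defining equation, not part of the lecture -- the matrix is an arbitrary made-up example:

    import numpy as np

    # An arbitrary symmetric 2x2 matrix, just to illustrate Ax = lambda x.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # np.linalg.eig returns the eigenvalues and a matrix whose
    # columns are the corresponding eigenvectors.
    lambdas, X = np.linalg.eig(A)

    for lam, x in zip(lambdas, X.T):
        # The defining equation: Ax comes out parallel to x.
        assert np.allclose(A @ x, lam * x)
        print(lam, x)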
Let's just take a second on eigenvalue zero.
From the point of view of eigenvalues, that's no special deal. We still have an eigenvector.
If the eigenvalue happened to be zero, that would mean that Ax was zero x, in other words zero.
So where would we look for the x-s? What are the eigenvectors with eigenvalue zero?
They're the guys in the null space, Ax equals zero.
So if our matrix is singular, let me write this down.
If, if A is singular, then that -- what does singular mean?
It means that it takes some vector x into zero.
Some non-zero vector -- that will be the eigenvector -- goes into zero.
Then lambda equals zero is an eigenvalue.
But we're interested in all eigenvalues now, lambda equals zero is not, like, so special anymore.
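A quick numerical check of that fact, not from the lecture, with a made-up singular matrix:

    import numpy as np

    # Singular by construction: the second row is twice the first.
    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])

    print(np.linalg.eigvals(A))   # one eigenvalue is 0 (the other is 5)
    # The eigenvector for lambda = 0 is just a null space vector: Ax = 0.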
OK. So the question is, how do we find these x-s and lambdas?
And notice -- we don't have an equation Ax equals b anymore.
I can't use elimination.
I've got two unknowns, and in fact they're multiplied together.
Lambda and x are both unknowns here.
So, we need to, we need a good idea of how to find them.
But before I do that -- and that's where determinants will come in -- can I just give you some matrices?
Like here you go.
Take the matrix, a projection matrix.
OK.
So suppose we have a plane and our matrix P -- what I've called A, now I'm going to call it P for the moment -- is a projection matrix.
What are the eigenvalues of a projection matrix?
So that's my question.
What are the x-s, the eigenvectors, and the lambdas, the eigenvalues, for a projection matrix?
My, my point is that we -- before we get into determinants and, and formulas and all that stuff, let's take some matrices where we know what they do.
We know that if we take a vector b, what this matrix does is it projects it down to Pb.
So is b an eigenvector in, in that picture?
Is that vector b an eigenvector?
No.
No, b is not an eigenvector, because Pb, its projection, is in a different direction.
So now tell me what vectors are eigenvectors of P? What vectors do get projected in the same direction that they start? So, answer, tell me some x-s.
Do you see what I'm asking -- Ax equals lambda x? In this picture, where could I start with a vector b or x, do its projection, and end up in the same direction?
Well, that would happen if the vector was right in that plane already.
If the vector x was in the plane already. So any vector, any x in the plane, will be an eigenvector.
And what will happen when I multiply by P, when I project a vector x -- I called it b here, because this is our familiar picture, but now I'm going to say that b was no good for our purposes.
I'm interested in a vector x that's actually in the plane, and I project it, and what do I get back?
x, of course.
Doesn't move.
So any x in the plane is unchanged by P, and what's that telling me?
That's telling me that x is an eigenvector, and it's also telling me what's the eigenvalue, which is -- just compare it with that.
The eigenvalue, the multiplier, is just one.
Good.
So we have actually a whole plane of eigenvectors.
Now I ask, are there any other eigenvectors?
And I expect the answer to be yes, because I would like to get three, if I'm in three dimensions, I would like to hope for three independent eigenvectors, two of them in the plane and one not in the plane.
OK. So this guy b that I drew there was not any good.
What's the right eigenvector that's not in the plane?
The good one is the one that's perpendicular to the plane. There's another good x, because what's the projection?
So these are eigenvectors.
Another guy here would be another eigenvector.
But now here is another one. Any x that's perpendicular to the plane, what's Px for that vector?
What's the projection of this guy perpendicular to the plane?
It is zero, of course.
So -- there's the null space.
Px for those guys is zero, or zero x if we like, and the eigenvalue is zero.
So my answer to the question, what are the eigenvalues of a projection matrix?
There they are.
One and zero.
OK. We know projection matrices.
We can write them down as that A (A transpose A) inverse A transpose thing, but without doing that, from the picture we could see what the eigenvectors are.
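Here is a small sketch of that, not from the lecture, building P from a made-up matrix A whose two columns span a plane in R^3, using the formula just mentioned:

    import numpy as np

    # Two independent columns spanning a plane in R^3 (an arbitrary choice).
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])

    # Projection onto the column space: P = A (A^T A)^{-1} A^T.
    P = A @ np.linalg.inv(A.T @ A) @ A.T

    # Up to roundoff: [0., 1., 1.] -- ones for the plane, zero for the normal.
    print(np.sort(np.linalg.eigvals(P).real))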
OK.
Are there other matrices?
Let me take a second example.
How about a permutation matrix?
What about the matrix, I'll call it A now.
A equals zero one, one zero.
Can you tell me a vector x -- see, we'll have a system soon enough, so I would like to just do these couple of examples, just to see the picture before we let it all go into a system where the matrix isn't anything special.
Because it is special.
And what, so what vector could I multiply by and end up in the same direction?
Can you spot an eigenvector for this guy?
That's a matrix that permutes x1 and x2, right?
It switches the two components of x.
How could the vector with its components permuted, x2 x1, turn out to be a multiple of x1 x2, the vector we start with?
Can you tell me an eigenvector here for this guy?
x equals what? Actually, can you tell me one vector that has eigenvalue one?
What vector would have eigenvalue one, so that if I permute it, it doesn't change?
There, that could be one one, thanks.
One one.
OK, take that vector one one.
That will be an eigenvector, because if I do Ax I get one one.
So the eigenvalue is one.
Great. That's one eigenvalue.
But I have here a two by two matrix, and I figure there's going to be a second eigenvalue.
And eigenvector.
Now, what about that?
What's a vector, OK, maybe we can just, like, guess it.
Actually, the one that I'm thinking of is going to be a vector that has eigenvalue minus one.
That's going to be my other eigenvalue for this matrix.
Notice it's a nice positive -- or at least not negative -- matrix, but an eigenvalue is going to come out negative.
And can you guess, spot the x that will work for that?
So I want a, a vector.
When I multiply by A, which reverses the two components, I want the thing to come out minus the original.
So what shall I send in in that case?
If I send in negative one one.
Then when I apply A, I do that multiplication, and I get one negative one, so it reversed sign.
So Ax is -x.
Lambda is minus one.
Ax -- so Ax was x there and Ax is minus x here.
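A quick numerical confirmation of this example, not part of the lecture (numpy normalizes eigenvectors to unit length, so (1, 1) shows up as (0.707, 0.707)):

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])       # the permutation matrix

    lambdas, X = np.linalg.eig(A)
    print(lambdas)    # 1 and -1
    print(X)          # columns are multiples of (1, 1) and (-1, 1)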
Can I just mention, like, jump ahead and point out a special little fact about eigenvalues?
n by n matrices will have n eigenvalues.
And it's not like -- suppose n is three or four or more.
It's not so easy to find them.
We'd have a third degree or a fourth degree or an n-th degree equation.
But here's one nice fact.
There's one pleasant fact.
That the sum of the eigenvalues equals the sum down the diagonal.
That's called the trace, and I put that in the lecture content specifically.
So this is a neat fact, the fact that the sum of the lambdas, add up the lambdas, equals the sum -- what would you like me to, shall I write that down?
What I want to say in words is the sum down the diagonal of A.
Shall I write a11 + a22 + ... + ann?
That's add up the diagonal entries.
In this example, it's zero.
In other words, once I found this eigenvalue of one, I knew the other one had to be minus one in this two by two case, because in the two by two case, which is a good one to play with, the trace tells you right away what the other eigenvalue is.
So if I tell you one eigenvalue, you can tell me the other one. We'll see that again.
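A quick check of the trace fact, not from the lecture, on an arbitrary made-up matrix:

    import numpy as np

    A = np.array([[2.0, 7.0],
                  [1.0, 8.0]])       # an arbitrary example

    print(np.trace(A))                    # a11 + a22 = 10
    print(np.linalg.eigvals(A).sum())     # the lambdas (9 and 1) also add to 10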
OK. Now I could give more examples, but maybe it's time to face the equation, Ax equals lambda x, and figure out how we're going to find x and lambda.
OK.
So this, so the question now is how to find eigenvalues and eigenvectors.
How to solve Ax equals lambda x, when we've got two unknowns both in the equation.
OK.
Here's the trick.
Simple idea.
Bring this onto the same side.
Rewrite. Bring this over as A minus lambda times the identity, times x, equals zero.
Right? I have Ax minus lambda x, so I brought that over and I've got zero left on the right-hand side.
OK. I don't know lambda and I don't know x, but I do know something here.
What I know is, if I'm going to be able to solve this thing, it has to be for some x that's not the zero vector -- that's a useless eigenvector, it doesn't count.
What I know now is that this matrix must be what?
If I'm going to be -- if there is an x -- I don't -- right now I don't know what it is.
I'm going to find lambda first, actually.
And -- but if there is an x, it tells me that this matrix, this special combination, which is like the matrix A with lambda -- shifted by lambda, shifted by lambda I, that it has to be singular.
This matrix must be singular, otherwise the only x would be the zero x. OK. So this is singular.
And what do I now know about singular matrices?
Their determinant is zero.
So I've -- so from the fact that that has to be singular, I know that the determinant of A minus lambda I has to be zero.
And that, now I've got x out of it.
I've got an equation for lambda -- the key equation. It's called the characteristic equation or the eigenvalue equation.
And that -- in other words, I'm now in a position to find lambda first.
So -- the idea will be to find lambda first.
And actually, I won't find one lambda, I'll find n different lambdas.
Well, n lambdas, maybe not n different ones.
A lambda could be repeated.
A repeated lambda is the source of all trouble in 18.06. So, let's hope for the moment that they're not repeated.
There they were different, right? One and minus one for that permutation.
OK. So after I've found this lambda, can I just look ahead? How am I going to find x?
After I have found this lambda -- the lambda being one of the numbers that makes this matrix singular -- then of course finding x is just by elimination.
Right? It's just -- now I've got a singular matrix, I'm looking for the null space.
We're experts at finding the null space.
You know, you do elimination, you identify the pivot columns and so on, and give values to the free variables.
Probably there'll only be one free variable.
We'll give it the value one, like there.
And we find the other variable.
OK.
So finding the x second will be a doable job.
That's my big equation for x.
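Before the worked example, here is a rough numpy sketch of that whole two-step recipe, not from the lecture -- np.poly and np.roots give the characteristic polynomial and its roots, and an SVD stands in for elimination when pulling a vector out of the null space. Fine for tiny examples, though real eigenvalue software avoids the characteristic polynomial:

    import numpy as np

    def eig_two_step(A):
        # Step 1: the lambdas are the roots of det(A - lambda I) = 0.
        lambdas = np.roots(np.poly(A))
        # Step 2: each A - lambda I is singular; grab an x from its
        # null space (SVD here, playing the role of elimination).
        xs = []
        for lam in lambdas:
            _, _, Vh = np.linalg.svd(A - lam * np.eye(len(A)))
            xs.append(Vh[-1].conj())   # direction for the ~zero singular value
        return lambdas, xs

    # On the permutation matrix above, this recovers 1, -1 and
    # multiples of (1, 1), (-1, 1).
    print(eig_two_step(np.array([[0.0, 1.0], [1.0, 0.0]])))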
Let's go, let's look at the first job of finding lambda.
Can I take another example?
OK.
And let's, let's work that one out.
OK.
So let me take the example, say -- let me make it easy.
Three three, one and one.
So I've made it easy.
I've made it two by two.
I've made it symmetric.
And I even made it constant down the diagonal.
So that -- so the more, like, special properties I stick into the matrix, the more special outcome I get for the eigenvalues.
For example, this symmetric matrix, I know that it'll come out with real eigenvalues.
The eigenvalues will turn out to be nice real numbers.
Actually, while we're at it, our previous example was a symmetric matrix too.
Its eigenvalues were nice real numbers, one and minus one.
And do you notice anything about its eigenvectors?
And what do you notice?
Anything particular about those two vectors, one one and minus one one?
They just happen to be -- no, I can't say they just happen to be, because that's the whole point, is that they had to be -- what?
What are they?
They're perpendicular.
The vector -- if I see a vector one one and a minus one one, my mind immediately takes that dot product. It's zero.
Those vectors are perpendicular.
That'll happen here too.
Well, let's find the eigenvalues.
Actually, oh, my example's too easy.
My example is too easy.
Let me tell you in advance what's going to happen.
May I?
Or shall I do the determinant of A minus lambda I, and then point it out at the end?
Will you remind me, after I've found the eigenvalues, to say why they were easy from the example we did?
OK, let's do the job here.
Let's compute determinant of A minus lambda I.
So that's a determinant.
And what's, what is this thing?
It's the matrix A with lambda removed from the diagonal.
So the diagonal is shifted, and then I'm taking the determinant.
OK.
So I multiply this out.
So what is that determinant?
Do you notice, I didn't take lambda away from all the entries.
It's lambda I, so it's lambda along the diagonal.
So I get three minus lambda squared and then minus one, right?
And I want that to be zero.
Well, I'm going to simplify it.
And what will I get?
So if I multiply this out, I get lambda squared minus six lambda plus what? Plus eight.
And that I'm going to set to zero. And I'm going to solve it.
It's a quadratic equation. I can use factorization, I can use the quadratic formula. I'll get two lambdas.
Before I do it, tell me what's that number six that's showing up in this equation?
It's the trace.
That number six is three plus three.
And while we're at it, what's the number eight that's showing up in this equation?
It's the determinant.
Our matrix has determinant eight.
So in a two by two case, it's really nice.
It's lambda squared minus the trace times lambda -- the trace is the linear coefficient -- and plus the determinant, the constant term.
OK. So let's -- can, can we find the roots?
I guess the easy way is to factor that as something times something.
If we couldn't factor it, then we'd have to use the old b^2-4ac formula, but I, I think we can factor that into lambda minus what times lambda minus what?
Can you do that factorization?
Four and two?
Lambda minus four times lambda minus two.
So the eigenvalues are four and two.
One eigenvalue, lambda one, let's say, is four. Lambda two, the other eigenvalue, is two.
And then I can go for the eigenvectors.
You see I got the eigenvalues first. Four and two.
Now for the eigenvectors.
So what are the eigenvectors?
They're these guys in the null space when I take away, when I make the matrix singular by taking four I or two I away.
So we're -- we got to do those separately.
I'll -- let me find the eigenvector for four first.
So I'll subtract four, so A minus four I is -- so taking four away will put minus ones there.
And what's the point about that matrix?
If four is an eigenvalue, then A minus four I had better be a what kind of matrix?
Singular.
If that matrix isn't singular, the four wasn't correct.
But we're OK, that matrix is singular.
And what's the x now?
The x is in the null space.
So what's the x1 that goes with lambda one?
Now I'm doing A x1 equals lambda one x1. So I took A minus lambda one I, that's this matrix, and now I'm looking for the x1 in its null space, and who is he?
What's the vector x in the null space?
Of course it's one one.
So that's the eigenvector that goes with that eigenvalue.
So now, how about the eigenvector that goes with the other eigenvalue?
Can I do that with, with erasing?
I take A minus two I.
So now I take two away from the diagonal, and that leaves me with a one and a one.
So A minus two I has again produced a singular matrix, as it had to.
I'm looking for the null space of that guy.
What vector is in its null space?
Well, of course, a whole line of vectors.
So when I say the eigenvector, I'm not speaking correctly.
There's a whole line of eigenvectors, and you just -- I just want a basis.
And for a line I just want one vector.
There's some freedom in choosing that one, but choose a reasonable one.
What's a vector in the null space of that?
Well, the natural vector to pick as the eigenvector with, with lambda two is minus one one.
If I did elimination on that matrix and set the free variable to be one, I would get minus one, and get that eigenvector.
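As a quick numerical check of everything in this example, done outside the lecture:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 3.0]])
    I = np.eye(2)

    # lambda = 4: A - 4I is singular and kills the eigenvector (1, 1).
    print(np.linalg.det(A - 4 * I))              # 0
    print((A - 4 * I) @ np.array([1.0, 1.0]))    # [0., 0.]

    # lambda = 2: A - 2I is singular and kills the eigenvector (-1, 1).
    print(np.linalg.det(A - 2 * I))              # 0
    print((A - 2 * I) @ np.array([-1.0, 1.0]))   # [0., 0.]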
So you see then that I've got the eigenvectors.
How are those two matrices related?
Well, one is just three I more than the other one, right? I took this matrix and I added three I.
So my question is, what happened to the eigenvalues and what happened to the eigenvectors?
That's the, that's like the question we keep asking now in this chapter.
If I do something to the matrix -- or I know something about the matrix -- what's the conclusion for its eigenvectors and eigenvalues?
Because -- those eigenvalues and eigenvectors are going to tell us important information about the matrix.
And here what are we seeing?
What's happening to these eigenvalues, one and minus one, when I add three I?
It just added three to the eigenvalues.
I got four and two, three more than one and minus one. What happened to the eigenvectors?
Nothing at all.
One one and minus one one are still the eigenvectors.
In other words, simple but useful observation.
If I add three I to a matrix, its eigenvectors don't change and its eigenvalues are three bigger.
Let's, let's just see why.
Let me keep all this on the same board, just so you see why: if Ax equals lambda x, then A plus three I times x equals lambda x plus three x, which is lambda plus three times x. So this has eigenvalue lambda plus three.
And x, the eigenvector, is the same x for both matrices.
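Numerically, the same observation, not from the lecture -- the permutation matrix plus three I is exactly the matrix we just worked out:

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
    B = A + 3 * np.eye(2)             # this is the [[3, 1], [1, 3]] example

    print(np.linalg.eigvals(A))       # 1 and -1
    print(np.linalg.eigvals(B))       # 4 and 2: each moved up by exactly 3

    x = np.array([1.0, 1.0])          # an eigenvector of A...
    print(A @ x, B @ x)               # ...and of B: 1*x and 4*x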
OK.
So that's great.
Of course, it's special.
We got the new matrix by adding three I.
Suppose I had added another matrix.
Suppose I know the eigenvalues and eigenvectors of A.
So this little board here is going to be not so great.
Suppose I have a matrix A and it has an eigenvector x with an eigenvalue lambda.
And now I add on some other matrix.
So, so what I'm asking you is, if you know the eigenvalues of A and you know the eigenvalues of B, let me say suppose B -- so this is if -- let me put an if here.
If Ax equals lambda x, fine, and B has, eigenvalues, has eigenvalues -- what shall we call them?
Alpha, alpha one and alpha -- let's say -- I'll use alpha for the eigenvalues of B for no good reason.
You see what I'm going to ask is, how about A plus B?
Let me give you what you might think first.
OK.
If Ax equals lambda x and if B has an eigenvalue alpha, then am I allowed to say -- what's the matter with this argument?
It's wrong.
What I'm going to write up is wrong.
I'm going to say Bx is alpha x.
Add those up, and you get A plus B times x equals lambda plus alpha times x.
So you would think that if you know the eigenvalues of A and you knew the eigenvalues of B, then if you added you would know the eigenvalues of A plus B.
But that's false.
A plus B -- well, when B was three I, that worked great.
But this is not so great.
And what's the matter with that argument there?
We have no reason to believe that x is also an eigenvector of B. B has some eigenvalues, but it's got some different eigenvectors, normally.
It's a different matrix.
I don't know anything special.
If I don't know anything special, then as far as I know, it's got some different eigenvector y, and when I add I get just rubbish.
I mean, I get -- I can add, but I don't learn anything.
So not so great is A plus B.
Or A times B.
Normally the eigenvalues of A plus B or A times B are not eigenvalues of A plus eigenvalues of B.
Eigenvalues are not, like, linear. And they don't multiply.
Because the eigenvectors are usually different, and there's just no way to find out what A plus B does from the separate pieces.
OK. So that's, like, a caution.
Don't -- if B is a multiple of the identity, great, but if B is some general matrix, then for A plus B you've got to solve the eigenvalue problem.
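A tiny made-up counterexample, in case the caution needs convincing:

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [0.0, 2.0]])        # eigenvalues 1 and 2
    B = np.array([[0.0, 1.0],
                  [1.0, 0.0]])        # eigenvalues 1 and -1

    print(np.sort(np.linalg.eigvals(A + B)))
    # Roughly [0.382, 2.618], i.e. (3 -/+ sqrt(5)) / 2 -- not any
    # sum of an eigenvalue of A with an eigenvalue of B.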
Now I want to do another example that brings out another point about eigenvalues -- the possibility of complex numbers.
Let me make this example a rotation matrix.
OK.
So here's another example.
So a rotate -- oh, I'd better call it Q.
I often use Q for rotations, because those are, like, the very important examples of orthogonal matrices.
Let me make it a ninety degree rotation.
So my matrix is going to be the one that rotates every vector by ninety degrees.
So do you remember that matrix?
It's the cosine of ninety degrees, which is zero, the sine of ninety degrees, which is one, minus the sine of ninety, the cosine of ninety.
So that matrix deserves the letter Q.
It's an orthogonal matrix, very, very orthogonal matrix. Now I'm interested in its eigenvalues and eigenvectors.
Two by two, it can't be that tough.
We know that the eigenvalues add to zero.
Actually, we know something already here.
What's the sum of the two eigenvalues?
Just tell me what I just said.
Zero, right.
From that trace business.
The sum of the eigenvalues is, is going to come out zero.
And the product of the eigenvalues, did I tell you about the determinant being the product of the eigenvalues?
No. But that's a good thing to know.
We pointed out how that eight appeared in the quadratic equation.
So let me just say this.
The trace is zero plus zero, obviously -- and that's the sum, lambda one plus lambda two.
And the determinant is one -- and that is lambda one times lambda two.
OK.
What I'm leading up to with this example is that something's going to go wrong.
Something goes wrong for rotation because what vector can come out parallel to itself after a rotation?
If this matrix rotates every vector by ninety degrees, what could be an eigenvector?
Do you see we're going to have trouble? Our picture of eigenvectors coming out in the same direction that they went in -- there won't be any.
And with eigenvalues we're going to have trouble, from these equations.
Let's see.
Why am I expecting trouble?
The, the first equation says that the eigenvalues add to zero.
So there's a plus and a minus -- if I take the eigenvalue lambda, the other one is minus lambda.
But then the second equation says that the product is plus one, and minus lambda times lambda can't be plus one for a real lambda.
We're in trouble.
But there's a way out.
So how -- let's do the usual stuff.
Look at determinant of Q minus lambda I.
So I'll just follow the rules, take the determinant, subtract lambda from the diagonal, where I had zeros, the rest is the same.
Rest of Q is just copied.
Compute that determinant.
OK, so what does that determinant equal?
Lambda squared minus minus one -- lambda squared plus one.
There's my equation.
My equation for the eigenvalues is lambda squared plus one equals zero.
What are the eigenvalues lambda one and lambda two?
They're i, whatever that is, and minus it, right.
Those are the right numbers, but they fail to be real numbers even though the matrix was perfectly real.
So this can happen.
Complex numbers are going to -- have to enter eighteen oh six at this moment.
Boo, right.
All right.
If I just choose good matrices that have real eigenvalues, we can postpone that evil day.
We do know a little information about the, the two complex numbers.
They're complex conjugates of each other.
If lambda is an eigenvalue, then so is its conjugate -- you remember what complex conjugates are?
You switch the sign of the imaginary part.
Well, this was only imaginary, had no real part, so we just switched its sign.
So the eigenvalues come in pairs like that, but they're complex.
A complex conjugate pair.
And that can happen with a perfectly real matrix.
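Numerically, not from the lecture, numpy happily returns the complex pair for the ninety-degree rotation:

    import numpy as np

    Q = np.array([[0.0, -1.0],
                  [1.0,  0.0]])       # rotation by ninety degrees

    print(np.linalg.eigvals(Q))       # [0.+1.j, 0.-1.j]: the pair i and -i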
And as a matter of fact -- so that was my, my point earlier, that if a matrix was symmetric, it wouldn't happen.
So if we stick to matrices that are symmetric or, like, close to symmetric, then the eigenvalues will stay real.
But if we move far away from symmetric -- and that's as far as you can move, because that matrix is -- how is Q transpose related to Q for that matrix?
That matrix is anti-symmetric. Q transpose is minus Q.
That's the very opposite of symmetry.
When I flip across the diagonal I get -- I reverse all the signs.
Those are the guys that have pure imaginary eigenvalues.
So they're the extreme case.
And in between are matrices that are not symmetric or anti-symmetric, but have partly a symmetric part and partly an anti-symmetric part. OK.
So I'm doing a bunch of examples here to show the possibilities.
The good possibilities being perpendicular eigenvectors, real eigenvalues.
The bad possibilities being complex eigenvalues.
We could say that's bad.
There's another even worse.
I'm getting through the bad things here today.
Then the next lecture can be like pure happiness.
OK. Here's one more bad thing that could happen.
So I, again, I'll do it with an example.
Suppose my matrix is -- suppose I take this three three one matrix and I change that guy below the diagonal to zero.
What are the eigenvalues of that matrix?
What are the eigenvectors?
This is always our question.
Of course, in the next section we're going to show why we care.
But for the moment, this lecture is introducing them. And let's just find them.
OK. What are the eigenvalues of that matrix?
Let me tell you -- at a glance we could answer that question.
Because the matrix is triangular.
It's really useful to know that if you've got a triangular matrix, you can read the eigenvalues off.
They're right on the diagonal.
So the eigenvalue is three and also three.
Three is a repeated eigenvalue.
But let's see that happen.
Let me do it right.
The determinant of A minus lambda I, what I always have to do is this determinant.
I take away lambda from the diagonal.
I leave the rest.
I compute the determinant, so I get a three minus lambda times a three minus lambda.
And nothing.
So that's where the triangular part came in.
Triangular part, the one thing we know about triangular matrices is the determinant is just the product down the diagonal.
And in this case, it's the same factor repeated -- so lambda one is three and lambda two is three.
That was easy.
I mean, no -- why should I be pessimistic about a matrix whose eigenvalues can be read off right away?
The problem with this matrix is in the eigenvectors.
So let's go to the eigenvectors.
So how do I find the eigenvectors?
I'm looking for a couple of eigenvectors.
So I take A minus three I -- that takes the threes off the diagonal and leaves the one.
Singular, right? It's supposed to be singular -- which it is.
So it's got some vector x in the null space.
And give me a basis for the null space for that guy.
Tell me, what's a vector x in the null space, so that'll be the, the eigenvector that goes with lambda one equals three.
The eigenvector is -- so what's in the null space?
One zero, is it?
Great.
Now, what's the other eigenvector?
What's, what's the eigenvector that goes with lambda two?
Well, lambda two is three again.
So I get the same thing again.
Give me another vector -- I want it to be independent.
If I'm going to write down an x2, I'm never going to let it be dependent on x1. I'm looking for independent eigenvectors, and what's the conclusion?
There isn't one.
This is a degenerate matrix.
It's only got one line of eigenvectors instead of two.
This possibility of a repeated eigenvalue opens the further possibility of a shortage of eigenvectors.
And so there's no second independent eigenvector x2. So it's a two by two matrix, but with only one independent eigenvector.
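Numerically, not from the lecture, you can watch the shortage happen -- numpy dutifully returns two eigenvector columns, but they point the same way:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [0.0, 3.0]])        # triangular, repeated eigenvalue 3

    lambdas, X = np.linalg.eig(A)
    print(lambdas)                    # [3., 3.]
    print(X)                          # both columns are (1, 0), up to sign/roundoff
    print(np.linalg.matrix_rank(X))   # numerically rank 1: one independent eigenvector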
So those will be the matrices where the eigenvectors don't give the complete story.
OK.
My lecture on Monday will give the complete story for all the other matrices.
Thanks.
Have a good weekend.
A real New England weekend.