Calculation of Eigenvalues and Eigenvectors:
Motivating Example:
Let A be the given matrix.
Find the eigenvalues of A and their associated eigenvectors.
[solution:]
Let x ≠ 0 be an eigenvector associated with the eigenvalue λ. Then,
Ax = λx, i.e., (λI − A)x = 0.
Thus,
x is a nonzero (nontrivial) solution of the homogeneous linear system (λI − A)x = 0, so λI − A is singular.
Therefore,
det(λI − A) = 0.
- For λ = λ_1, solving (λ_1 I − A)x = 0 gives the eigenvectors associated with λ_1.
- For λ = λ_2, solving (λ_2 I − A)x = 0 gives the eigenvectors associated with λ_2.
Note:
In the above example, the eigenvalues of A satisfy the following equation:
det(λI − A) = 0.
After finding the eigenvalues, we can further solve the associated homogeneous system to find the eigenvectors.
Definition of the characteristic polynomial:
Let A be an n×n matrix. The determinant
f(λ) = det(λI − A)
is called the characteristic polynomial of A. The equation
f(λ) = det(λI − A) = 0
is called the characteristic equation of A.
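As a sketch of this definition, NumPy's `np.poly` returns the coefficients of det(λI − A) for a square matrix, and `np.roots` finds the roots of the resulting characteristic equation. The matrix below is a made-up illustration:

```python
import numpy as np

# A hypothetical 2x2 matrix for illustration.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest degree first: here lambda^2 - 7*lambda + 10.
coeffs = np.poly(A)
print(coeffs)

# The roots of the characteristic equation det(lambda*I - A) = 0
# are the eigenvalues of A.
print(sorted(np.roots(coeffs).round(6).tolist()))   # [2.0, 5.0]
```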
Theorem:
A is singular if and only if 0 is an eigenvalue of A.
[proof:]
(⇒) A is singular ⟹ Ax = 0 has a nontrivial solution ⟹ there exists a nonzero vector x such that
Ax = 0 = 0·x.
Thus, x is an eigenvector of A associated with the eigenvalue 0.
(⇐) 0 is an eigenvalue of A ⟹ there exists a nonzero vector x such that
Ax = 0·x = 0.
Thus, the homogeneous system Ax = 0 has a nontrivial (nonzero) solution.
Therefore, A is singular.
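A quick numerical check of this theorem, assuming NumPy and a made-up singular matrix (its second row is twice the first):

```python
import numpy as np

# A hypothetical singular matrix: the second row is a multiple of the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# A is singular: its determinant is 0.
assert np.isclose(np.linalg.det(A), 0.0)

# Hence, by the theorem, 0 must appear among the eigenvalues of A.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                    # contains 0 (the other root is 5)
assert np.any(np.isclose(eigenvalues, 0.0))
```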
Theorem:
The eigenvalues of A are the real roots of the characteristic polynomial of A.
[proof:]
(⇒) Let λ be an eigenvalue of A associated with eigenvector u, and let f(λ) = det(λI − A) be the characteristic polynomial of A. Then, the homogeneous system (λI − A)u = 0 has the nontrivial (nonzero) solution u, so λI − A is singular and
f(λ) = det(λI − A) = 0.
That is, λ is a real root of f(λ) = 0.
(⇐) Let λ be a real root of f(λ) = 0. Then det(λI − A) = 0, so λI − A is a singular matrix, and there exists a nonzero vector (nontrivial solution) v such that (λI − A)v = 0, i.e., Av = λv.
Thus, v is an eigenvector of A associated with the eigenvalue λ.
Procedure of finding the eigenvalues and eigenvectors of A:
1. Solve for the real roots of the characteristic equation f(λ) = det(λI − A) = 0. These real roots λ_1, λ_2, … are the eigenvalues of A.
2. For each eigenvalue λ_i, solve the homogeneous system (λ_i I − A)x = 0 (equivalently, Ax = λ_i x), x ≠ 0. The nontrivial (nonzero) solutions are the eigenvectors associated with the eigenvalue λ_i.
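The two-step procedure can be sketched in NumPy. The matrix below is a made-up stand-in (the notes' own example matrices are not reproduced here); the null space in step 2 is taken from the SVD:

```python
import numpy as np

# A hypothetical 2x2 matrix for illustration.
A = np.array([[5.0, 4.0],
              [1.0, 2.0]])
n = A.shape[0]

# Step 1: the eigenvalues are the roots of the characteristic
# equation det(lambda*I - A) = 0.
eigenvalues = np.roots(np.poly(A))         # here: 6 and 1

# Step 2: for each eigenvalue, a nontrivial solution of (lambda*I - A)x = 0,
# taken as the null-space direction of the singular matrix lambda*I - A.
for lam in eigenvalues:
    M = lam * np.eye(n) - A
    x = np.linalg.svd(M)[2][-1]            # right-singular vector for the
                                           # (numerically) zero singular value
    assert np.allclose(A @ x, lam * x)     # verify A x = lambda x
    print(lam, x)
```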
Example:
Find the eigenvalues and eigenvectors of the given matrix A.
[solution:]
The real roots of the characteristic equation are λ_1 and λ_2 = 10; these are the eigenvalues of A.
- For λ = λ_1, solve (λ_1 I − A)x = 0. The nontrivial (nonzero) solutions are the eigenvectors associated with the eigenvalue λ_1.
- For λ = λ_2 = 10, solve (10I − A)x = 0. The nontrivial (nonzero) solutions are the eigenvectors associated with the eigenvalue 10.
Example:
Let A be the given matrix. Find the eigenvalues and the eigenvectors of A.
[solution:]
The real roots of the characteristic equation are λ_1 and λ_2 = 6; these are the eigenvalues of A.
- For λ = λ_1, solve (λ_1 I − A)x = 0. The nontrivial (nonzero) solutions are the eigenvectors associated with the eigenvalue λ_1.
- For λ = λ_2 = 6, solve (6I − A)x = 0. The nontrivial (nonzero) solutions are the eigenvectors associated with the eigenvalue 6.
Note:
In the above example, there are at most 2 linearly independent eigenvectors for the matrix A.
The following theorem and corollary concern the linear independence of eigenvectors.
Theorem:
Let x_1, x_2, …, x_k be eigenvectors of a matrix A associated with the distinct eigenvalues λ_1, λ_2, …, λ_k, respectively, k ≤ n. Then, x_1, x_2, …, x_k are linearly independent.
[proof:]
Assume x_1, x_2, …, x_k are linearly dependent. Let V be the vector space generated by x_1, x_2, …, x_k, and suppose dim(V) = j < k. Then, there exist j linearly independent vectors among x_1, …, x_k which also generate V. Without loss of generality, let x_1, x_2, …, x_j be the j linearly independent vectors which generate V (i.e., {x_1, …, x_j} is a basis of V). Thus,
x_{j+1} = c_1 x_1 + c_2 x_2 + ⋯ + c_j x_j,
where c_1, c_2, …, c_j are some real numbers. Then,
A x_{j+1} = λ_{j+1} x_{j+1} = λ_{j+1}(c_1 x_1 + c_2 x_2 + ⋯ + c_j x_j) = c_1 λ_{j+1} x_1 + ⋯ + c_j λ_{j+1} x_j.
Also,
A x_{j+1} = A(c_1 x_1 + c_2 x_2 + ⋯ + c_j x_j) = c_1 λ_1 x_1 + c_2 λ_2 x_2 + ⋯ + c_j λ_j x_j.
Thus, subtracting the two expressions,
c_1(λ_{j+1} − λ_1) x_1 + c_2(λ_{j+1} − λ_2) x_2 + ⋯ + c_j(λ_{j+1} − λ_j) x_j = 0.
Since x_1, …, x_j are linearly independent,
c_1(λ_{j+1} − λ_1) = c_2(λ_{j+1} − λ_2) = ⋯ = c_j(λ_{j+1} − λ_j) = 0.
Furthermore, since λ_1, …, λ_{j+1} are distinct, λ_{j+1} − λ_i ≠ 0 for i = 1, …, j, so c_1 = c_2 = ⋯ = c_j = 0. But then x_{j+1} = 0, which contradicts the fact that an eigenvector is nonzero. Therefore, x_1, x_2, …, x_k are linearly independent.
Corollary:
If a matrix A has n distinct eigenvalues, then A has n linearly independent eigenvectors.
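The corollary can be checked numerically: for a matrix with n distinct eigenvalues, the matrix whose columns are the n eigenvectors is nonsingular. A sketch with a made-up upper-triangular matrix (its eigenvalues 1, 2, 3 can be read off the diagonal):

```python
import numpy as np

# A hypothetical 3x3 upper-triangular matrix with three distinct
# eigenvalues 1, 2, 3 on its diagonal.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

eigenvalues, V = np.linalg.eig(A)     # columns of V are eigenvectors
assert len(np.unique(np.round(eigenvalues, 8))) == 3   # distinct eigenvalues

# Independence of the three eigenvectors: the eigenvector matrix V
# is nonsingular, i.e., det(V) != 0.
print(np.linalg.det(V))
assert not np.isclose(np.linalg.det(V), 0.0)
```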