
When we apply a matrix to a vector (i.e. multiply them together), the vector is transformed. An interesting question to ask ourselves is whether there are any particular combinations of such a matrix and vector whose result is a new vector that is proportional to the original vector. In math terminology, this question can be posed as follows: if we have a matrix $A \in \mathbb{R}^{n \times n}$, does there exist a vector $x \in \mathbb{R}^n$ and a scalar $\lambda$ such that $A x = \lambda x$? If so, then the complexity of $A x$ is reduced. It no longer must be thought of as a matrix multiplication; instead, applying $A$ to $x$ has the simple effect of linearly scaling $x$ by the scalar factor $\lambda$.

In this situation, where $A x = \lambda x$, $\lambda$ is known as an eigenvalue and $x$ is its associated eigenvector. For a given matrix, each of its eigenvectors is associated with a particular (though not necessarily unique) eigenvalue. The word "eigen" is German for "own" or "characteristic"; this is appropriate because the vector $x$ after the matrix multiplication is the same as the original vector $x$, except for the scaling factor $\lambda$. The following two examples give actual possible values for the matrices, vectors, and values discussed in general terms above.

$$\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 0 \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$

Here, $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is the eigenvector and $0$ is its associated eigenvalue.

$$\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$

In this second example, $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ is again the eigenvector but the eigenvalue is now $3$.
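As a quick numerical check (an illustration using NumPy, not part of the original text), we can multiply each example matrix against the eigenvector and compare the result to the scaled eigenvector:

```python
import numpy as np

# The two example matrices share the eigenvector (1, 1)
A1 = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
A2 = np.array([[2.0, 1.0],
               [1.0, 2.0]])
x = np.array([1.0, 1.0])

print(np.allclose(A1 @ x, 0 * x))  # True: A1 x = 0 x
print(np.allclose(A2 @ x, 3 * x))  # True: A2 x = 3 x
```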

Now we'd like to develop a method of finding the eigenvalues and eigenvectors of a matrix. We start with what is basically the defining equation behind this whole idea:

$$A x = \lambda x$$

Next, we move the $\lambda x$ term to the left-hand side and factor:

$$(A - \lambda I) x = 0$$

Here's the important rule to remember: there exists $x \neq 0$ satisfying this equation if and only if $\det(A - \lambda I) = 0$, since a nonzero vector can be mapped to zero only by a singular matrix. So, to find the eigenvalues, we need to solve this determinant equation.

Given the matrix $A$, solve for $\lambda$ in $\det(A - \lambda I) = 0$.

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

$$\det(A - \lambda I) = \begin{vmatrix} 2 - \lambda & 1 \\ 1 & 2 - \lambda \end{vmatrix} = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0$$

$$\lambda = 3, 1$$
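As a sketch of how this plays out numerically (illustrative only; NumPy is not part of the original module), we can root the characteristic polynomial $\lambda^2 - 4\lambda + 3$ and compare against a built-in eigenvalue routine:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Roots of the characteristic polynomial lambda^2 - 4 lambda + 3
print(np.roots([1.0, -4.0, 3.0]))  # [3. 1.]

# The same eigenvalues straight from the matrix
print(np.linalg.eigvals(A))        # [3. 1.] (order may vary)
```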

After finding the eigenvalues, we need to find the associated eigenvectors. Looking at the defining equation in its factored form, $(A - \lambda I) x = 0$, we see that the eigenvector $x$ is annihilated by the matrix $A - \lambda I$. So to solve for the eigenvectors, we simply find the kernel (nullspace) of $A - \lambda I$ using the two eigenvalues we just calculated. If we did this for the example above, we'd find that the eigenvector associated with $\lambda = 3$ is $\begin{pmatrix} 1 \\ 1 \end{pmatrix}$ and the eigenvector associated with $\lambda = 1$ is $\begin{pmatrix} 1 \\ -1 \end{pmatrix}$.
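One way to sketch this nullspace computation numerically (an illustration, not the module's own method) is through the singular value decomposition of $A - \lambda I$: the right singular vectors paired with zero singular values span the nullspace.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

for lam in (3.0, 1.0):
    # Right singular vectors whose singular values are (numerically)
    # zero span the nullspace of A - lambda*I.
    _, s, Vt = np.linalg.svd(A - lam * np.eye(2))
    print(lam, Vt[s < 1e-10])

# lam = 3.0 prints a multiple of (1, 1);
# lam = 1.0 prints a multiple of (1, -1).
```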

You may be wondering why eigenvalue decomposition is useful. It seems at first glance that it is only helpful in determining the effect a matrix has on a certain small subset of possible vectors (the eigenvectors). However, the benefits become clear when you think about how many other vectors can be looked at from an eigenvalue perspective by decomposing them into components along the available eigenvectors. For instance, in the above example, let's say we wanted to apply $A$ to the vector $\begin{pmatrix} 2 \\ 0 \end{pmatrix}$. Instead of doing the matrix multiply (admittedly not too difficult in this case), the vector $\begin{pmatrix} 2 \\ 0 \end{pmatrix}$ could be split into components in the direction of the eigenvectors:

$$\begin{pmatrix} 2 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix} + \begin{pmatrix} 1 \\ -1 \end{pmatrix}$$

Now, each of these components can be scaled by its associated eigenvalue and then added back together to form the net result:

$$A \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 3 \begin{pmatrix} 1 \\ 1 \end{pmatrix} + 1 \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 4 \\ 2 \end{pmatrix}$$
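The same trick in code (a sketch; the eigenvector basis is hand-coded from the worked example above):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([2.0, 0.0])

eigvals = np.array([3.0, 1.0])
eigvecs = np.array([[1.0, 1.0],    # eigenvector for lambda = 3
                    [1.0, -1.0]])  # eigenvector for lambda = 1

# Coefficients c with c[0]*(1, 1) + c[1]*(1, -1) = v
c = np.linalg.solve(eigvecs.T, v)

# Scale each component by its eigenvalue, then recombine
result = (c * eigvals) @ eigvecs
print(result)                        # [4. 2.]
print(np.allclose(result, A @ v))    # True
```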

Multiplicity

Once we have determined the eigenvalues of a particular matrix, we can start to discuss them in terms of their multiplicity. There are two types of eigenvalue multiplicity: algebraic multiplicity and geometric multiplicity.

Algebraic Multiplicity
The number of repetitions of a certain eigenvalue. If, for a certain matrix, the eigenvalues are $\lambda = 3, 3, 4$, then the algebraic multiplicity of $3$ would be $2$ (as it appears twice) and the algebraic multiplicity of $4$ would be $1$ (as it appears once). This type of multiplicity is normally represented by the Greek letter $\alpha$, where $\alpha(\lambda_i)$ represents the algebraic multiplicity of $\lambda_i$.
Geometric Multiplicity
A particular eigenvalue's geometric multiplicity is defined as the dimension of the nullspace of $\lambda I - A$. This type of multiplicity is normally represented by the Greek letter $\gamma$, where $\gamma(\lambda_i)$ represents the geometric multiplicity of $\lambda_i$.
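To make the distinction concrete, here is a small sketch (illustrative, not from the original text) using the matrix $\begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}$, where the eigenvalue $3$ is repeated twice ($\alpha = 2$) but $A - 3I$ has only a one-dimensional nullspace ($\gamma = 1$):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])

# Eigenvalue 3 appears twice: algebraic multiplicity alpha = 2
print(np.linalg.eigvals(A))  # [3. 3.]

# Geometric multiplicity gamma = dim null(A - 3I) = n - rank(A - 3I)
n = A.shape[0]
print(n - np.linalg.matrix_rank(A - 3.0 * np.eye(n)))  # 1
```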

Helpful facts

Here are some helpful facts about certain special cases of matrices.

Rank

A matrix $A$ is full rank if $\det(A) \neq 0$. However, if $\lambda = 0$ is an eigenvalue, then $\det(\lambda I - A) = \det(-A) = 0$, which tells us that $\det(A) = 0$. Therefore, if a matrix has at least one eigenvalue equal to $0$, then it cannot have full rank. Specifically, for an $n$-dimensional square matrix:

  • When one eigenvalue equals $0$: $\operatorname{rank}(A) = n - 1$.
  • When multiple eigenvalues equal $0$: $\operatorname{rank}(A) = n - \gamma(0)$, where $\gamma(0)$ is the geometric multiplicity of the eigenvalue $0$. This property holds even if there are other non-zero eigenvalues; see the sketch below.
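A quick numerical illustration (not part of the original text), reusing the first example matrix, whose eigenvalues are $0$ and $2$:

```python
import numpy as np

# First example matrix: one eigenvalue is 0, so the rank is n - 1 = 1
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

print(np.linalg.eigvals(A))      # [2. 0.] (order may vary)
print(np.linalg.matrix_rank(A))  # 1
```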

Symmetric matrices

A symmetric matrix is one whose transpose is equal to itself ($A^T = A$). These matrices (represented by $A$ below) have the following properties, checked numerically in the sketch after the list:

  • Its eigenvalues are real.
  • Its eigenvectors are orthogonal.
  • It is always diagonalizable.
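A minimal check of these properties (an illustration using NumPy's symmetric eigensolver, not part of the original module), using the symmetric matrix from the second example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric matrices: real eigenvalues,
# orthonormal eigenvectors returned as the columns of V
w, V = np.linalg.eigh(A)
print(w)                                      # [1. 3.] -- all real
print(np.allclose(V.T @ V, np.eye(2)))        # True: orthogonal eigenvectors
print(np.allclose(V @ np.diag(w) @ V.T, A))   # True: A is diagonalizable
```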





Source:  OpenStax, State space systems. OpenStax CNX. Jan 22, 2004 Download for free at http://cnx.org/content/col10143/1.3