Linear Transforms and Eigenvectors


I never understood Eigenvectors. They were a piece of mathematical magic during my sophomore year, and they would have remained so if two of my classes this quarter hadn't tested my understanding of them.

Let's start with linear transforms.

Linear Transformations

A linear transform (or linear mapping) takes in a vector and maps it to another vector in a way that preserves vector addition and scalar multiplication. It can also map vectors between spaces of different dimension (R2 -> R3, etc.), but let's keep it simple to start with.
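That preservation property is what "linear" means, and it's easy to check numerically. Here's a quick sketch in Python/NumPy (the vectors and scalar are my own illustrative choices, not from the post):

```python
import numpy as np

# A matrix acts as a linear transform via multiplication.
# Linearity means: A(u + w) = Au + Aw and A(c*u) = c*(Au).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # example transform (a 90-degree CCW rotation)
u = np.array([1.0, 1.0])
w = np.array([2.0, -3.0])
c = 5.0

sum_then_transform = A @ (u + w)
transform_then_sum = A @ u + A @ w
print(np.allclose(sum_then_transform, transform_then_sum))  # True
print(np.allclose(A @ (c * u), c * (A @ u)))                # True
```

Any matrix gives you a map with these two properties, which is why "matrix" and "linear transform" get used almost interchangeably.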

Suppose we have a vector

\vec{x} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}

Plotting it in Matlab gives an arrow from the origin to the point (1, 1).


Basic Transformations

But what if we wanted to rotate that same vector 90° CCW? That's where linear transformations come into play. Suppose (referencing Wikipedia) we knew that multiplying our vector by some matrix A would do precisely that.

A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}


u = [1; 1];
A = [0 -1;
     1  0];
v = A*u
% v = [-1; 1]
plotVector([u v]);


Let's try something a little more interesting – with three random vectors:

u = randi(10, [2,3]);

%Rotate 90 CCW
A = [0 -1;
     1  0];
v = A*u;

Which results in the vectors

u = \begin{bmatrix} 10 & 2 & 3 \\ 9 & 9 & 5 \end{bmatrix}, \quad
v = \begin{bmatrix} -9 & -9 & -5 \\ 10 & 2 & 3 \end{bmatrix}
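The same computation is easy to reproduce outside Matlab; here's the rotation applied to the example vectors above, sketched in Python/NumPy:

```python
import numpy as np

# Columns of u are the three example vectors from above
u = np.array([[10, 2, 3],
              [ 9, 9, 5]])

# Rotate 90 degrees CCW, same matrix as the MATLAB snippet
A = np.array([[0, -1],
              [1,  0]])
v = A @ u

print(v)
# [[-9 -9 -5]
#  [10  2  3]]
```

Note that multiplying once rotates all three column vectors at the same time – each column of v is the rotated version of the corresponding column of u.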

More interesting linear transformations

Some linear transformations aren’t quite as predictable. In fact, if we arbitrarily choose a linear transformation such as A = \begin{bmatrix} -2 & 2 \\ 4 & 1 \end{bmatrix}, the result is different for each input vector! Let's see what this means:

The small vector in the middle rotating CCW is the input vector, u. The larger vector on the outside is the mapped vector v = A*u. As you can see, the effect of the transformation depends on the input vector.

There are four important positions (vectors) in this animation – they occur when the input vector aligns with the output vector (either pointing the same way or in opposite directions).

These special positions (vectors) are called Eigenvectors! Also, notice the proportional difference in size between the input and output vectors; this ratio is the Eigenvalue of each Eigenvector. Watch the animation again and notice where they show up. (Cheat sheet)
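You don't have to wait for the animation to find those special directions; a numerical library will compute them directly. A sketch in Python/NumPy, using the same A as the animation (np.linalg.eig returns the eigenvalues and the unit eigenvectors as columns):

```python
import numpy as np

A = np.array([[-2.0, 2.0],
              [ 4.0, 1.0]])

# vals are the eigenvalues; the columns of vecs are the eigenvectors
vals, vecs = np.linalg.eig(A)

for lam, x in zip(vals, vecs.T):
    # Along these directions A only scales the input: A x = lam x
    print(lam, np.allclose(A @ x, lam * x))
```

For this A the two eigenvalues come out to roughly 2.70 and -3.70, and the four aligned positions in the animation are the two eigenvectors together with their negations.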

Eigenvectors & Eigenvalues

Now that we have an intuitive idea of what Eigenvectors and Eigenvalues are, let's look at the formal definition.

A\vec{x} = \lambda \vec{x}

Whenever a vector's transformed output is a scalar multiple of the input, that vector is an Eigenvector. The scalar multiple is called the Eigenvalue. Simple, really.
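As a quick sanity check of the definition, take a diagonal matrix (my own choice for illustration, not one from the post); there the coordinate axes are the Eigenvectors:

```latex
A = \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}, \qquad
A \begin{bmatrix} 1 \\ 0 \end{bmatrix}
  = \begin{bmatrix} 3 \\ 0 \end{bmatrix}
  = 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix}
```

So \begin{bmatrix} 1 \\ 0 \end{bmatrix} is an Eigenvector with Eigenvalue 3, and likewise \begin{bmatrix} 0 \\ 1 \end{bmatrix} is an Eigenvector with Eigenvalue 2.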

Now there’s one more important thing to notice here – the eigenvalues / eigenvectors aren’t the source of the transformation matrix. Instead, the transformation matrix (or really any matrix – depending on how you look at it) is what determines the eigenvalues and eigenvectors. That is, they are properties of the matrix. Blue and soft could be the properties of a ball, but blue and soft don’t define a ball.

As for the math – I'll leave that up to you. If you have any simple practical applications of eigenvalues / eigenvectors, I'd love to hear about them. Leave them in the comments below!


Matlab functions: LinearTransformation
