Columns describe how a matrix acts on the coordinate axes, whereas eigenvalues describe how it acts along its own preferred directions, the eigenvectors. Each column is the image of a standard basis vector, and each eigenvalue tells us how much the matrix stretches the corresponding eigenvector; only when a standard basis vector happens to be an eigenvector does the associated column reduce to that basis vector scaled by its eigenvalue.
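To see the distinction concretely, here is a minimal sketch using NumPy with a hypothetical shear matrix (chosen purely for illustration; it is not the matrix used in the worked example below). Its columns show where the basis vectors go, yet neither column directly reveals the eigenvalues, because the second basis vector is not an eigenvector:

```python
import numpy as np

# A hypothetical shear matrix, chosen only to illustrate the point above;
# it is not part of the worked example that follows.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Columns are the images of the standard basis vectors e1 and e2.
print(S[:, 0])  # [1. 0.] -> e1 is mapped to itself
print(S[:, 1])  # [1. 1.] -> e2 is not mapped to a multiple of e2

# Yet both eigenvalues are 1: e2 is not an eigenvector, so its column
# says nothing directly about an eigenvalue.
eigenvalues, _ = np.linalg.eig(S)
print(eigenvalues)  # [1. 1.]
```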
Eigenvectors and Eigenvalues: Mathematical Example
Standard Mathematical Example
Given the matrix \( A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \) and the vector \( \mathbf{v} = \begin{pmatrix} 1 \\ 2 \end{pmatrix} \):
When we multiply \( A \) by \( \mathbf{v} \), we get \( A\mathbf{v} = \begin{pmatrix} 2 \cdot 1 \\ 3 \cdot 2 \end{pmatrix} = \begin{pmatrix} 2 \\ 6 \end{pmatrix} \).
This shows how \( A \) scales \( \mathbf{v} \) component by component (a quick numerical check of this product appears below):
- The vector \( \begin{pmatrix} 1 \\ 2 \end{pmatrix} \) was scaled along the x-axis by a factor of 2 and along the y-axis by a factor of 3, giving \( \begin{pmatrix} 2 \\ 6 \end{pmatrix} \).
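As a minimal sketch (assuming NumPy, which is not mentioned in the text), the following verifies the product and also shows that \( \begin{pmatrix} 1 \\ 2 \end{pmatrix} \) is not an eigenvector of \( A \), since the result is not a scalar multiple of the input:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # the matrix A from the example above
v = np.array([1.0, 2.0])     # the vector (1, 2)

Av = A @ v
print(Av)  # [2. 6.]

# (2, 6) is not a scalar multiple of (1, 2), so v is *not* an eigenvector of A.
scale = Av[0] / v[0]
print(np.allclose(Av, scale * v))  # False
```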
Eigenvectors and Eigenvalues
Now, if \( \mathbf{v} \) were an eigenvector of \( A \), multiplying by \( A \) would only scale it by a single constant, its eigenvalue, without rotating it or changing its direction.
For this example:
- \( \mathbf{v} = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \) is an eigenvector with eigenvalue \( \lambda = 2 \),
- \( \mathbf{v} = \begin{pmatrix} 0 \\ 1 \end{pmatrix} \) is an eigenvector with eigenvalue \( \lambda = 3 \).
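As a sanity check (again a sketch assuming NumPy, using the same matrix \( A \)), `np.linalg.eig` recovers exactly these eigenvalue/eigenvector pairs:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [2. 3.]
print(eigenvectors)  # [[1. 0.]
                     #  [0. 1.]]

# Check A v = lambda v for each eigenpair.
for lam, vec in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ vec, lam * vec)
```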
Summary
"An eigenvector is a vector that, when a linear transformation is applied, only changes in magnitude (scaling), not in direction. The eigenvalue is the factor by which the eigenvector is scaled."
In essence, eigenvectors represent the "directions" that remain unchanged by the transformation (except for scaling), and the eigenvalues give us the scaling factor along those directions.