Primary Tool
Math Workspace (CAS)
CAS works well here because you can compute the eigenstructure and immediately test what the matrix does to specific vectors.
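As a sketch of the kind of check a CAS workspace makes easy (assuming SymPy is available; the matrix here is an illustrative choice, not one from the text):

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [1, 2]])  # illustrative symmetric matrix

# eigenvects() returns (eigenvalue, multiplicity, [basis vectors]) triples.
for lam, mult, vecs in A.eigenvects():
    v = vecs[0]
    # Verify the defining property directly: A*v equals lam*v.
    assert sp.simplify(A * v - lam * v) == sp.zeros(2, 1)
    print(lam, v.T)
```

Computing the eigenstructure symbolically and then immediately multiplying the matrix against each eigenvector is exactly the "compute, then test" loop described above.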
The geometric picture
Most vectors change direction under a linear map. An eigenvector is special because the map keeps it on the same line through the origin: Av = λv for some scalar λ.
If |λ| > 1, the vector is stretched. If 0 < |λ| < 1, it is compressed. If λ < 0, the direction is flipped as well as scaled.
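A minimal numeric illustration of the three cases, assuming NumPy; the eigenvalues 3, 0.5, and -1 are chosen purely for demonstration:

```python
import numpy as np

# For a diagonal matrix the standard basis vectors are eigenvectors,
# so each case (stretch, compress, flip) can be read off directly.
A = np.diag([3.0, 0.5, -1.0])
e1, e2, e3 = np.eye(3)

print(A @ e1)  # 3 * e1: stretched
print(A @ e2)  # 0.5 * e2: compressed
print(A @ e3)  # -1 * e3: flipped
```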
Why this matters computationally
Once you know the invariant directions, repeated applications of the matrix become easier to understand. That is why diagonalization turns matrix powers into powers of eigenvalues.
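A hedged sketch of that computational payoff, assuming NumPy and a diagonalizable example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # illustrative diagonalizable matrix

# Columns of P are eigenvectors; w holds the eigenvalues.
w, P = np.linalg.eig(A)

k = 5
# A^k = P diag(w^k) P^{-1}: a matrix power reduces to scalar powers.
Ak = P @ np.diag(w**k) @ np.linalg.inv(P)

assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```

The assertion checks the eigenvalue route against direct repeated multiplication, which is the claim in the paragraph above made concrete.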
The geometry and the computation are the same story told two ways: invariant directions on one side, simpler coordinates on the other.
What the matrix is really doing
For a generic vector, a linear transformation usually mixes coordinates and changes direction. Eigenvectors are the rare directions where that mixing disappears.
That is why eigenvectors matter geometrically: they identify the directions along which the map behaves like simple scaling instead of a complicated combination of scaling and turning.
A practical test: pick a vector v, compute Av, and ask whether Av lies on the same line as v. If it does, then v is an eigenvector and the scale factor λ is the eigenvalue.
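That same-line test can be sketched as code (assuming NumPy; `is_eigenvector` and the sample matrix are hypothetical names for illustration):

```python
import numpy as np

def is_eigenvector(A, v, tol=1e-9):
    """Check whether A v lies on the same line through the origin as v."""
    v = np.asarray(v, dtype=float)
    Av = A @ v
    # v and Av are collinear exactly when stacking them gives rank <= 1.
    return np.linalg.matrix_rank(np.vstack([v, Av]), tol=tol) <= 1

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(is_eigenvector(A, [1, 1]))   # True: A sends (1, 1) to 3*(1, 1)
print(is_eigenvector(A, [1, 0]))   # False: the direction changes
```

Using a rank test rather than dividing components avoids special-casing zeros in v.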
How this connects to diagonalization
If you can collect enough independent eigenvectors, they form a basis adapted to the transformation itself. In that basis the matrix becomes diagonal.
So diagonalization is not a separate topic from eigenvectors. It is the coordinate-system version of the same geometric idea.
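A short sketch of that coordinate change, assuming SymPy; the matrix is an illustrative one with distinct eigenvalues:

```python
import sympy as sp

A = sp.Matrix([[4, 1],
               [2, 3]])  # illustrative matrix, eigenvalues 2 and 5

# diagonalize() returns P (eigenvector columns) and D (diagonal eigenvalues).
P, D = A.diagonalize()
assert P * D * P.inv() == A
print(D)  # the same map, written in its own eigenvector basis
```

The columns of P are exactly the independent invariant directions; D is what the transformation looks like once you adopt them as coordinates.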
Common Pitfall
A vector is an eigenvector only if applying the matrix sends it to a scalar multiple of itself. In plain terms, Av must stay on the same line as v, even if the length changes or the direction flips.
Try a Variation
Pick a matrix with a repeated eigenvalue. Do you still get two independent invariant directions, or does the geometry change?
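One concrete pair to try, sketched with SymPy (both matrices are illustrative choices): 2I repeats the eigenvalue 2 and keeps a whole plane of eigenvectors, while the shear below repeats the same eigenvalue but has only one invariant direction.

```python
import sympy as sp

scaled_identity = sp.Matrix([[2, 0],
                             [0, 2]])
shear = sp.Matrix([[2, 1],
                   [0, 2]])  # same repeated eigenvalue, different geometry

# Same eigenvalue with multiplicity 2 in both cases,
# but the eigenspaces have different dimensions.
for A in (scaled_identity, shear):
    lam, mult, vecs = A.eigenvects()[0]
    print(lam, mult, len(vecs))
```

So yes, the geometry can change: a repeated eigenvalue may or may not come with enough independent eigenvectors to diagonalize.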
Related Pages
Keep moving through the cluster
Diagonalize a matrix and compute its powers
Diagonalization is one of the first places students see why eigenvalues matter computationally. Once A = P D P⁻¹, powers of A become powers of the diagonal matrix D.
Open worked example →
Proof that the kernel of a linear map is a subspace
This is a standard linear algebra proof because it packages the subspace test into one reusable pattern: show zero is inside, then check closure under addition and scalar multiplication.
Open proof →