Primary Tool
Math Workspace (CAS)
CAS works well here because you can compute the eigenstructure and immediately test what the matrix does to specific vectors.
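The same "compute, then test" loop is easy to script outside a CAS as well. A minimal Python sketch, assuming a 2 × 2 real matrix with real eigenvalues (the matrix [[2, 1], [1, 2]] and the function names are illustrative choices, not from any particular tool):

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic polynomial
    x^2 - (trace)x + det = 0 (assumes the discriminant is nonnegative)."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def apply(a, b, c, d, v):
    """Apply the matrix [[a, b], [c, d]] to the vector v = (x, y)."""
    x, y = v
    return (a * x + b * y, c * x + d * y)

# A = [[2, 1], [1, 2]] has eigenvalues 3 and 1.
print(eigenvalues_2x2(2, 1, 1, 2))   # (3.0, 1.0)
print(apply(2, 1, 1, 2, (1, 1)))     # (3, 3): the eigenvector (1, 1) is scaled by 3
```

Testing a specific vector right after computing the eigenstructure is exactly the workflow the workspace encourages.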
The geometric picture
Most vectors change direction under a linear map. An eigenvector is special because the map keeps it on the same line through the origin: Av = λv for some scalar λ.
If λ > 1, the vector is stretched. If 0 < λ < 1, it is compressed. If λ < 0, the direction is flipped as well as scaled.
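The three cases are easiest to see with diagonal matrices, where the coordinate axes are eigenvectors. A small Python sketch (the matrices are illustrative choices):

```python
def apply(M, v):
    """Apply a 2x2 matrix, given as a pair of row tuples, to v = (x, y)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

e1 = (1.0, 0.0)  # an eigenvector of every diagonal matrix
print(apply(((2.0, 0.0), (0.0, 1.0)), e1))   # (2.0, 0.0): lambda = 2 > 1, stretched
print(apply(((0.5, 0.0), (0.0, 1.0)), e1))   # (0.5, 0.0): 0 < lambda < 1, compressed
print(apply(((-1.0, 0.0), (0.0, 1.0)), e1))  # (-1.0, 0.0): lambda < 0, flipped
```

In every case the output stays on the x-axis: the line through the origin is preserved even when length and sign are not.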
Why this matters computationally
Once you know the invariant directions, repeated applications of the matrix become easier to understand. That is why diagonalization turns matrix powers into powers of eigenvalues.
The geometry and the computation are the same story told two ways: invariant directions on one side, simpler coordinates on the other.
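Along an invariant direction the claim is concrete: applying A to an eigenvector k times just multiplies it by λ to the k-th power. A quick Python check (the matrix and eigenvector are illustrative choices):

```python
def apply(M, v):
    """Apply a 2x2 matrix, given as a pair of row tuples, to v = (x, y)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

A = ((2, 1), (1, 2))   # eigenvalue 3 with eigenvector (1, 1)
v = (1, 1)
for _ in range(5):     # apply A five times
    v = apply(A, v)
print(v)               # (243, 243) = 3**5 * (1, 1)
```

No matrix multiplication is really needed along this direction: the fifth power of the matrix acts as the fifth power of the eigenvalue.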
What the matrix is really doing
For a generic vector, a linear transformation usually mixes coordinates and changes direction. Eigenvectors are the rare directions where that mixing disappears.
That is why eigenvectors matter geometrically: they identify the directions along which the map behaves like simple scaling instead of a complicated combination of scaling and turning.
A practical test is: pick a vector v, compute Av, and ask whether Av lies on the same line as v. If it does, then v is an eigenvector, and the scale factor λ in Av = λv is the eigenvalue.
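That test is easy to automate in 2D: Av lies on the line through v exactly when the cross product of v and Av vanishes. A minimal Python sketch (the function names, tolerance, and example matrix are my own choices):

```python
def apply(M, v):
    """Apply a 2x2 matrix, given as a pair of row tuples, to v = (x, y)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

def is_eigenvector(M, v, tol=1e-9):
    """A nonzero v is an eigenvector of M iff M v is collinear with v,
    i.e. the 2D cross product v x (M v) is zero."""
    w = apply(M, v)
    return abs(v[0] * w[1] - v[1] * w[0]) < tol

A = ((2, 1), (1, 2))
print(is_eigenvector(A, (1, 1)))   # True: A sends (1, 1) to (3, 3)
print(is_eigenvector(A, (1, 0)))   # False: A sends (1, 0) to (2, 1), off the line
```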
How this connects to diagonalization
If you can collect enough independent eigenvectors, they form a basis adapted to the transformation itself. In that basis the matrix becomes diagonal.
So diagonalization is not a separate topic from eigenvectors. It is the coordinate-system version of the same geometric idea.
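As a sanity check, putting two independent eigenvectors into the columns of a matrix P should make the conjugated matrix diagonal. A minimal 2 × 2 Python sketch (the specific matrix and helper names are illustrative choices):

```python
def matmul(M, N):
    """Product of two 2x2 matrices given as pairs of row tuples."""
    (a, b), (c, d) = M
    (e, f), (g, h) = N
    return ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

def inv2(M):
    """Inverse of a 2x2 matrix (assumes nonzero determinant)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return ((d / det, -b / det), (-c / det, a / det))

A = ((2, 1), (1, 2))
P = ((1, 1), (1, -1))        # columns are the eigenvectors (1, 1) and (1, -1)
D = matmul(matmul(inv2(P), A), P)
print(D)                     # ((3.0, 0.0), (0.0, 1.0)): the eigenvalues on the diagonal
```

In the eigenvector basis the map is pure scaling, which is exactly what a diagonal matrix expresses.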
Common Pitfall
A vector is an eigenvector only if applying the matrix sends it to a scalar multiple of itself. In plain terms, Av must stay on the same line as v, even if the length changes or the direction flips.
Try a Variation
Pick a 2 × 2 matrix with a repeated eigenvalue. Do you still get two independent invariant directions, or does the geometry change?
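One way to explore this is with a shear-like matrix such as [[2, 1], [0, 2]], which has the repeated eigenvalue 2 but only one independent invariant direction. This example matrix is my own choice, so it is only one possible answer to the variation:

```python
def apply(M, v):
    """Apply a 2x2 matrix, given as a pair of row tuples, to v = (x, y)."""
    (a, b), (c, d) = M
    x, y = v
    return (a * x + b * y, c * x + d * y)

def collinear(v, w, tol=1e-9):
    """True iff w lies on the line through the origin spanned by v."""
    return abs(v[0] * w[1] - v[1] * w[0]) < tol

J = ((2, 1), (0, 2))  # repeated eigenvalue 2, not diagonalizable
print(collinear((1, 0), apply(J, (1, 0))))  # True: the x-axis is still invariant
print(collinear((0, 1), apply(J, (0, 1))))  # False: the second axis is sheared away
```

Contrast this with [[2, 0], [0, 2]], which has the same repeated eigenvalue but leaves every direction invariant: a repeated eigenvalue alone does not decide the geometry.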
Related Pages
Keep moving through the cluster
Diagonalize a matrix and compute its powers
Diagonalization is one of the first places students see why eigenvalues matter computationally. Once A = P D P⁻¹, powers of A become powers of a diagonal matrix.
Open worked example →
Proof that eigenvectors for distinct eigenvalues are linearly independent
Distinct eigenvalues do more than label eigenspaces. They force eigenvectors to be independent, which is exactly why diagonalization becomes possible so often in practice.
Open proof →
Compute eigenvalues and eigenvectors of a 3 × 3 matrix
A first serious 3 by 3 eigenproblem should show new structure without burying the point in arithmetic. This one does that: you see the characteristic polynomial, the eigenspaces, and the diagonalization test in one pass.
Open worked example →
What a basis does in linear algebra
A basis is what turns a vector space into something you can navigate. It reaches every vector, and it does so without redundancy, so coordinates become possible.
Open explanation →