Most (all?) undergraduate courses use determinants to introduce eigenvalues and eigenvectors: the eigenvalues of a matrix $A$ (or of the linear transformation it represents) are the solutions of $\det(A - \lambda I) = 0$. However, Sheldon Axler published a paper in 1994 called Down with Determinants! in which he argues that determinants should not be used so early in linear algebra courses. He gives a very nice proof of the existence of eigenvalues in finite-dimensional vector spaces (over $\mathbb{C}$) which I would like to reproduce here.
Every linear transformation of a non-trivial finite-dimensional complex vector space has an eigenvalue.
Here is his proof:
Let $V$ be a non-trivial finite-dimensional complex vector space and $T : V \to V$ a linear transformation. Let $v$ be a fixed non-zero vector in $V$ and suppose that $\dim V = n$. Then the $n + 1$ vectors $v, Tv, T^2 v, \ldots, T^n v$ are linearly dependent. Hence there exist complex numbers $a_0, a_1, \ldots, a_n$, not all 0, such that

$$a_0 v + a_1 Tv + a_2 T^2 v + \cdots + a_n T^n v = 0.$$
Now, since $\mathbb{C}$ is algebraically closed, the polynomial $a_0 + a_1 z + a_2 z^2 + \cdots + a_n z^n$ factorises into linear factors, so we get

$$a_0 + a_1 z + \cdots + a_n z^n = c(z - r_1)(z - r_2)\cdots(z - r_m)$$
where $c, r_1, \ldots, r_m$ are complex numbers with $c \neq 0$ and $m \geq 1$. It follows that

$$0 = a_0 v + a_1 Tv + \cdots + a_n T^n v = c(T - r_1 I)(T - r_2 I)\cdots(T - r_m I)v,$$
which means that, since this is a composition of linear maps, $c \neq 0$ and $v \neq 0$, at least one of the maps $T - r_j I$ must send a non-zero vector to 0. In other words, there is some $j$ and some non-zero vector $w$ with

$$(T - r_j I)w = 0,$$

so that $Tw = r_j w$. Thus $w$ is an eigenvector of $T$ with eigenvalue $r_j$, and hence $T$ has an eigenvalue.
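Axler's argument can also be traced numerically. The sketch below (my own illustration, not from the paper) uses NumPy: for a random complex matrix $T$ and a non-zero vector $v$, it builds the dependent vectors $v, Tv, \ldots, T^n v$, extracts coefficients $a_0, \ldots, a_n$ from the null space, and checks that the roots $r_j$ of the resulting polynomial include an eigenvalue of $T$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Columns are v, Tv, T^2 v, ..., T^n v: n + 1 vectors in an n-dimensional
# space, so they must be linearly dependent.
cols = [v]
for _ in range(n):
    cols.append(T @ cols[-1])
M = np.column_stack(cols)

# A non-trivial null vector of M gives coefficients a_0, ..., a_n with
# a_0 v + a_1 Tv + ... + a_n T^n v = 0 (last right-singular vector of M).
_, _, Vh = np.linalg.svd(M)
a = Vh[-1].conj()
assert np.allclose(M @ a, 0, atol=1e-8)

# Roots r_1, ..., r_m of a_0 + a_1 z + ... + a_n z^n.
# np.roots expects the highest-degree coefficient first, hence the reversal.
roots = np.roots(a[::-1])

# At least one r_j makes T - r_j I non-injective, i.e. is an eigenvalue of T.
eigs = np.linalg.eigvals(T)
dist = min(abs(r - e) for r in roots for e in eigs)
print(dist)  # a value very close to 0
```

For a generic choice of $v$ the polynomial found this way is (a multiple of) the characteristic polynomial, so in fact every root is an eigenvalue; the proof only needs one of them.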