Eigenvalues without determinants

Friday 18 May 2007 at 3:14 pm | In Articles | 9 Comments

Most (all?) undergraduate courses use determinants to introduce eigenvalues and eigenvectors: the eigenvalues of a matrix A (or linear transformation t) are the solutions of \det(A-\lambda I)=0. However, Sheldon Axler published a paper in 1995 called Down with Determinants! in which he maintains that determinants should not be used so early in linear algebra courses. He gives a very nice proof of the existence of eigenvalues in finite-dimensional vector spaces (over \mathbb{C}) which I would like to reproduce here.
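For a concrete 2 \times 2 illustration (my own example, not one from the paper): if A=\begin{pmatrix}2 & 1\\ 1 & 2\end{pmatrix} then \det(A-\lambda I)=(2-\lambda)^2-1=(\lambda-1)(\lambda-3), so the eigenvalues are 1 and 3.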

Every linear transformation of a non-trivial finite-dimensional complex vector space has an eigenvalue.

Here is his proof:

Let V be a non-trivial finite-dimensional complex vector space and let t\colon V \to V be a linear transformation. Let v be a fixed non-zero vector in V and suppose that \dim V =n. Then the n+1 vectors v,t(v),t^2 (v),\dots,t^n (v) are linearly dependent. Hence there exist complex numbers \alpha_0,\alpha_1,\dots,\alpha_n, not all 0, such that

\alpha_0 v + \alpha_1 t(v) + \dots + \alpha_n t^n (v)=0

and hence

(\alpha_0 1 + \alpha_1 t + \dots + \alpha_n t^n)(v)=0, where 1 denotes the identity map on V.

Note that \alpha_1,\dots,\alpha_n cannot all be 0, for otherwise \alpha_0 v=0 with v \neq 0 would force \alpha_0=0 as well; so the polynomial \alpha_0 + \alpha_1 z+ \dots + \alpha_n z^n has positive degree. Now, since \mathbb{C} is algebraically closed, this polynomial will factorise so we get

\alpha_0 + \alpha_1 z+ \dots + \alpha_n z^n=c(z-\lambda_1)(z-\lambda_2)\dots(z-\lambda_m).

where m \geq 1 and c, \lambda_1,\dots,\lambda_m are complex numbers with c \neq 0. It follows that

c(t-\lambda_1 1)(t-\lambda_2 1)\dots(t-\lambda_m 1)(v)=0

which means that, since the left-hand side is a composition of maps sending the non-zero vector v to 0, at least one of the factors must send a non-zero vector to 0. Working from the right,
either
(t-\lambda_m 1)(v)=0, so t(v)=\lambda_m v and v is an eigenvector (with eigenvalue \lambda_m),
or
(t-\lambda_m 1)(v) \neq 0 and (t-\lambda_{m-1} 1)(t-\lambda_m 1)(v)=0, so (t-\lambda_m 1)(v) is an eigenvector (with eigenvalue \lambda_{m-1}),
or
\dots
or
(t-\lambda_2 1)\dots(t-\lambda_m 1)(v) \neq 0 and (t-\lambda_1 1)(t-\lambda_2 1)\dots(t-\lambda_m 1)(v)=0, so (t-\lambda_2 1)\dots(t-\lambda_m 1)(v) is an eigenvector (with eigenvalue \lambda_1).
In every case t has an eigenvalue.   \blacksquare
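The proof is essentially constructive, so here is a rough NumPy sketch of the same idea (the matrix A, the vector v and the tolerance below are just illustrative choices of mine, not part of Axler's paper):

import numpy as np

# Illustrative choices only: any square matrix and non-zero starting vector will do
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = A.shape[0]
v = np.array([1.0, 0.0])

# The n+1 vectors v, Av, A^2 v, ..., A^n v as columns; they must be linearly dependent
K = np.column_stack([np.linalg.matrix_power(A, k) @ v for k in range(n + 1)])

# A non-trivial dependence alpha_0 v + alpha_1 Av + ... + alpha_n A^n v = 0
# is a null vector of K; the right-singular vector for the smallest singular value gives one
alphas = np.linalg.svd(K)[2][-1]

# Roots lambda_1, ..., lambda_m of alpha_0 + alpha_1 z + ... + alpha_n z^n
# (np.roots expects the highest-degree coefficient first)
roots = np.roots(alphas[::-1])

# Apply the factors (A - lambda I) to v one at a time; since their product sends v to 0,
# some factor must send a still non-zero vector to zero, and that vector is an eigenvector
w = v.astype(complex)
for lam in roots:
    u = (A - lam * np.eye(n)) @ w
    if np.linalg.norm(u) < 1e-9:        # w is (numerically) an eigenvector for lam
        print("eigenvalue", lam, "eigenvector", w)
        break
    w = u

Of course in practice one would simply call np.linalg.eig; the point is only that the dependence-then-factorise argument can be carried out directly, with no determinant in sight.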

Discussion of this determinant-free approach can be found at NeverEndingBooks and The n-Category Café.

9 Comments


  1. My Vector Spaces instructor, when introducing eigenvectors, also gave us this little caveat. Most Linear Algebra classes I’ve seen deal with the reals anyway.

    By the way, on the line starting “where c=0 and …” you have “c, r_1, …, r_m”. Did you mean lambda instead of r?

    Comment by pierre — Friday 18 May 2007 7:38 pm #

  2. Whoops! Thanks, yes it should be lambda. I will change it.

    The Down with Determinants paper only deals with real vector spaces much later on, considering them as embedded in a complex vector space. He then shows that linear transformations of odd-dimensional real vector spaces have a real eigenvalue.

    Comment by steve — Friday 18 May 2007 7:51 pm #

  3. I think it’d be nicer if we didn’t have to use determinants. For a 4×4 matrix the determinant method can get nasty and my work tends to become messy with a higher chance of errors! It can be ‘tedious’ as well, and then to top it off sometimes it doesn’t factorize nicely!

    However, like Pierre said, we’ve not really dealt with complex numbers. Everything normally happens in the reals (or sometimes Z_p). I think we’re going to be doing Complex Analysis next year.

    Comment by beans — Sunday 20 May 2007 4:16 pm #

  4. Surely the proof, as stated, isn’t valid. It would imply that any vector v is an eigenvector. Instead what we know is that either v, or (t-λ_m 1)(v), or (t-λ_{m-1} 1)(t-λ_m 1)(v), etc. is an eigenvector. (Or you could simply point out that you mean “for some v, not equal to the v we chose above” in the last sentence of the proof.)

    Nice comment preview BTW!

    Comment by sigfpe — Thursday 28 June 2007 11:31 pm #

  5. Yes, thanks for that and I will change it. This is the sort of mistake I would make a caustic comment about when a student does it 🙂

    Comment by steve — Friday 29 June 2007 10:09 am #

  6. Axler has also published an entire book on linear algebra from this point of view. It is called ‘Linear Algebra Done Right’. You may like it. The proofs are quite pretty, but I do not like the book for teaching since it makes it difficult for students to build intuition.

    Comment by Scott — Friday 11 January 2008 8:25 pm #

  7. Eigenvalues and determinants reveal quite a bit of information about a matrix. In this lab we will learn how to use MATLAB to compute the eigenvalues, eigenvectors, and the determinant of a matrix. We will also learn about diagonalization and how it can be applied to study certain problems in population dynamics.

    Comment by plastik — Thursday 7 February 2008 9:35 am #

  8. When I first learned about eigenvalues/eigenvectors, and still now when they are mentioned in other classes it is always with determinants. After reading your post I really wish that I had seen this approach since it makes things more obvious (to me at least).

    Comment by Nicholas James — Saturday 22 November 2008 7:03 pm #

  9. But how do you know that the vectors v, t(v), t^2(v), t^3(v), \dots, t^n(v) are all distinct to begin with?

    Comment by Angelos Sphyris — Tuesday 13 March 2012 6:21 pm #

