Algebra golf: Determinants
29 July 2023

- A painless definition of determinants
- Properties for free
- Matrices
- Cramer’s rule
- Finishing the challenge
- Bonus: Eigenvalue theory
- Conclusion
It is time for some recreational linear algebra. Today’s challenge will be to show that $\mathrm{GL}_n$ is an affine group scheme in the most economical way possible (even if it takes away some algebraic insight).
We apply the following completely arbitrary set of rules:
- Using basic results from linear algebra in vector spaces is allowed, but we have no definition of determinants yet.
- Avoid explicit combinatorics whenever possible.
- Assuming knowledge of tensor products and basic category theory is allowed.
If $R$ is a commutative ring, we define $\mathrm{GL}_n(R)$ to be the group of invertible $n \times n$ matrices over $R$. If we can define determinants in some “clean” way and show that
$$\mathrm{GL}_n(R) = \{ A \in R^{n \times n} : \det(A) \in R^\times \},$$
we are essentially done.
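For orientation (a sanity check of mine, not part of the challenge): for $n = 1$ a matrix $(a)$ is invertible precisely when $a \in R^\times$, so any sensible definition should reduce to $\det\big((a)\big) = a$ in that case.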
A painless definition of determinants
Let $R$ be a commutative unital ring. For any $R$-module $M$ we define the $n$-th exterior power of $M$ as the module
$$\Lambda^n M = M^{\otimes n} \big/ \langle m_1 \otimes \cdots \otimes m_n : m_i = m_j \text{ for some } i \neq j \rangle.$$
We write $m_1 \wedge \cdots \wedge m_n$ for the image of $m_1 \otimes \cdots \otimes m_n$ by the quotient map $M^{\otimes n} \to \Lambda^n M$.
If $f \colon M \to N$ is a linear map, then $f^{\otimes n}$ clearly descends to a linear map $\Lambda^n f \colon \Lambda^n M \to \Lambda^n N$. Since the tensor product is functorial, so is this assignment. We denote the functor by $\Lambda^n$.
Now suppose $M$ is free of rank $n$. Then $\Lambda^n M$ is free of rank $1$. To see this, let $e_1, \dots, e_n$ be a basis of $M$. We claim that $e_1 \wedge \cdots \wedge e_n$ generates $\Lambda^n M$. Expanding along the basis, $\Lambda^n M$ is generated by $n$-fold wedges of the $e_i$, and any nonzero such wedge must contain each basis element exactly once (a wedge with a repeated entry vanishes by definition), so we can identify it with a permutation in $S_n$. Observe that
$$0 = \cdots \wedge (x + y) \wedge \cdots \wedge (x + y) \wedge \cdots = \cdots \wedge x \wedge \cdots \wedge y \wedge \cdots + \cdots \wedge y \wedge \cdots \wedge x \wedge \cdots$$
(the terms with a repeated factor vanish), so if $\sigma' = \tau \sigma$ for a transposition $\tau$, then the corresponding wedges satisfy
$$e_{\sigma'(1)} \wedge \cdots \wedge e_{\sigma'(n)} = -\, e_{\sigma(1)} \wedge \cdots \wedge e_{\sigma(n)}.$$
Since the transpositions generate $S_n$, we conclude
$$e_{\sigma(1)} \wedge \cdots \wedge e_{\sigma(n)} = \operatorname{sgn}(\sigma)\, e_1 \wedge \cdots \wedge e_n$$
and $e_1 \wedge \cdots \wedge e_n$ is a free generator of $\Lambda^n M$.
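To make this concrete, here is the case $n = 2$ (an added example): in $\Lambda^2 R^2$ we have
$$e_1 \wedge e_1 = e_2 \wedge e_2 = 0, \qquad e_2 \wedge e_1 = -\, e_1 \wedge e_2,$$
so every wedge $x \wedge y$ is a multiple of $e_1 \wedge e_2$.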
Let $f \colon M \to M$ be a linear map and suppose $M$ is again free of rank $n$. Since $\Lambda^n M$ is free of rank $1$, there exists a unique element $\det(f) \in R$ such that
$$(\Lambda^n f)(m) = \det(f)\, m \quad \text{for all } m \in \Lambda^n M.$$
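To see the definition in action, here is the case $n = 2$ worked out (an added example): if $f(e_1) = a e_1 + c e_2$ and $f(e_2) = b e_1 + d e_2$, then
$$(\Lambda^2 f)(e_1 \wedge e_2) = (a e_1 + c e_2) \wedge (b e_1 + d e_2) = (ad - bc)\, e_1 \wedge e_2,$$
so $\det(f) = ad - bc$, as expected.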
Properties for free
Since $\Lambda^n$ is functorial, we immediately obtain the properties
$$\det(f \circ g) = \det(f)\det(g), \qquad \det(\mathrm{id}) = 1.$$
In particular, if $f$ is invertible, then so is $\det(f)$. We are already halfway there.
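Spelled out on a generator $m$ of $\Lambda^n M$ (an added one-line verification): $\Lambda^n(f \circ g) = \Lambda^n f \circ \Lambda^n g$ gives
$$\det(f \circ g)\, m = (\Lambda^n f)\big(\det(g)\, m\big) = \det(f)\det(g)\, m,$$
and applying this to $g = f^{-1}$ yields $\det(f)\det(f^{-1}) = \det(\mathrm{id}) = 1$.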
Matrices
For the remainder of the discussion we shall work with matrices. Consider the free module $R^n$ with the standard basis $e_1, \dots, e_n$. We identify a matrix $A \in R^{n \times n}$ with the linear map $x \mapsto Ax$ and a linear map $f \colon R^n \to R^n$ with the matrix $A = (a_{ij})$ whose entries are determined by the equations
$$f(e_j) = \sum_{i=1}^n a_{ij}\, e_i.$$
Suppose $A$ has columns $v_1, \dots, v_n$. Note that
$$\det(A)\, e_1 \wedge \cdots \wedge e_n = (\Lambda^n A)(e_1 \wedge \cdots \wedge e_n) = v_1 \wedge \cdots \wedge v_n.$$
This shows that determinants are linear in each column of the matrix $A$. Writing the $v_j$ as linear combinations of the standard basis, we see that $\det(A)$ is given by a polynomial in the entries $a_{ij}$ with integer coefficients (recall that wedges of the standard basis vectors are integer multiples of $e_1 \wedge \cdots \wedge e_n$, as we have seen before).
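Concretely, linearity in the first column reads (an added illustration, with $x, y \in R^n$ and $r \in R$):
$$(x + r y) \wedge v_2 \wedge \cdots \wedge v_n = x \wedge v_2 \wedge \cdots \wedge v_n + r\,(y \wedge v_2 \wedge \cdots \wedge v_n).$$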
Cramer’s rule
Let $A \in R^{n \times n}$ and let $A_i(x)$ be the matrix obtained by replacing column $i$ in $A$ by $x \in R^n$. Since determinants are linear in each column, the map
$$x \mapsto \big(\det(A_1(x)), \dots, \det(A_n(x))\big)$$
is linear. We notice that $\det(A_i(v_j)) = \delta_{ij} \det(A)$ (for $i \neq j$ the matrix $A_i(v_j)$ has a repeated column, so its columns have vanishing wedge), so identification of this map with a matrix $\operatorname{adj}(A)$ leads to
$$\operatorname{adj}(A)\, A = \det(A)\, I.$$
To see that $A \operatorname{adj}(A) = \det(A)\, I$ also holds, we use a neat shortcut. The columns of $\operatorname{adj}(A)$ are given by
$$\operatorname{adj}(A)\, e_j = \big(\det(A_1(e_j)), \dots, \det(A_n(e_j))\big),$$
so if we replace $A$ by an indeterminate matrix $X = (x_{ij})$, the entries of $\operatorname{adj}(X)$ are integer coefficient polynomials in the $x_{ij}$ and thus $X \operatorname{adj}(X) - \det(X)\, I$ consists of polynomials in $\mathbb{Z}[x_{ij}]$. All those polynomials vanish on $\mathrm{GL}_n(\mathbb{C})$ because left and right inverses coincide for complex matrices by the rank-nullity theorem. Hence, all the polynomials are zero: each of them vanishes on the dense subset $\mathrm{GL}_n(\mathbb{C}) \subseteq \mathbb{C}^{n \times n}$, hence everywhere by continuity, and a polynomial vanishing on all of $\mathbb{C}^{n^2}$ is the zero polynomial. Since for any commutative ring $R$ and any $A \in R^{n \times n}$ we obtain $A \operatorname{adj}(A) - \det(A)\, I$ by evaluating those polynomials in the entries of $A$, we have $A \operatorname{adj}(A) = \det(A)\, I$ for all $A$.
We conclude that if $\det(A) \in R^\times$, then $A$ is invertible and thus
$$\mathrm{GL}_n(R) = \{ A \in R^{n \times n} : \det(A) \in R^\times \}.$$
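As a concrete check (an added example for $n = 2$): the construction gives
$$\operatorname{adj}\begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\begin{pmatrix} a & b \\ c & d \end{pmatrix} = (ad - bc) \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},$$
in line with Cramer’s rule.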
Remark: Our proof of Cramer’s rule shows in some opaque way that $R^{n \times n}$ is Dedekind-finite (one-sided inverses are two-sided), a property that we could have derived from Nakayama’s lemma as well.
Finishing the challenge
Fix a commutative ring $k$. Mapping a $k$-algebra morphism $\varphi \colon R \to S$ to
$$\mathrm{GL}_n(\varphi) \colon \mathrm{GL}_n(R) \to \mathrm{GL}_n(S), \qquad (a_{ij}) \mapsto (\varphi(a_{ij}))$$
is functorial because matrix multiplication and determinants are polynomial in the matrix entries. Hence, we have a functor $\mathrm{GL}_n \colon \mathsf{Alg}_k \to \mathsf{Grp}$.
Let $X = (x_{ij})$ denote an indeterminate $n \times n$ matrix and define the algebra
$$A = k[x_{11}, \dots, x_{nn}, t] \,/\, (t \det(X) - 1).$$
It is clear that $\mathrm{Hom}_{\mathsf{Alg}_k}(A, R)$ can be identified with $\mathrm{GL}_n(R)$ by evaluation in $X$, hence the maps
$$\mathrm{Hom}_{\mathsf{Alg}_k}(A, R) \to \mathrm{GL}_n(R), \qquad \varphi \mapsto (\varphi(x_{ij}))$$
define a natural isomorphism and $\mathrm{GL}_n$ is representable, making it an affine group scheme.
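A small example (mine, for illustration): for $n = 1$ the algebra is $A = k[x, t]/(xt - 1) \cong k[x, x^{-1}]$, and a $k$-algebra morphism $k[x, x^{-1}] \to R$ is exactly a choice of unit in $R$, recovering $\mathrm{GL}_1(R) = R^\times$.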
Bonus: Eigenvalue theory
So far we have not derived an explicit formula for the determinant. As it turns out, we can get surprisingly far without one. Let us develop some eigenvalue theory.
In this chapter we take $K$ to be a field and $V$ to be a $K$-vector space of dimension $n$.
Characteristic polynomials
Let $f \colon V \to V$ be a linear map. The characteristic polynomial of $f$ is defined as
$$\chi_f(\lambda) = \det(\lambda\, \mathrm{id} - f).$$
This is well-defined because functoriality of $\Lambda^n$ implies that $\det(\lambda\, \mathrm{id} - f)$ is invariant under change of basis, and for any fixed basis we know that $\det$ is polynomial in the coefficients that determine the linear operator, so $\chi_f$ is indeed a polynomial in $\lambda$.
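For $n = 2$ this reads (an added example, using the $2 \times 2$ determinant computed earlier, with $f = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$):
$$\chi_f(\lambda) = \det\begin{pmatrix} \lambda - a & -b \\ -c & \lambda - d \end{pmatrix} = (\lambda - a)(\lambda - d) - bc = \lambda^2 - (a + d)\lambda + (ad - bc).$$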
Eigenvalues as roots
Let $f \colon V \to V$ be a linear map. A scalar $\lambda \in K$ is an eigenvalue of $f$ if the linear system
$$f(v) = \lambda v$$
has a non-trivial solution $v \neq 0$. In that case, we call $v$ an eigenvector of $f$.

We claim that the eigenvalues of $f$ are precisely the roots of $\chi_f$. Indeed, if $f(v) = \lambda v$ has no solutions $v \neq 0$, then $\lambda\, \mathrm{id} - f$ is injective, hence an isomorphism, and thus $\chi_f(\lambda) \neq 0$. On the other hand, if some $v \neq 0$ satisfies the equation, then we can extend $v$ to a basis $v, v_2, \dots, v_n$ of $V$, and $g = \lambda\, \mathrm{id} - f$ satisfies
$$(\Lambda^n g)(v \wedge v_2 \wedge \cdots \wedge v_n) = g(v) \wedge g(v_2) \wedge \cdots \wedge g(v_n) = 0.$$
We conclude $\chi_f(\lambda) = \det(g) = 0$.
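An added example showing the dependence on the ground field: over $K = \mathbb{R}$ the rotation matrix $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ has $\chi_f(\lambda) = \lambda^2 + 1$, which has no real roots, so there are no real eigenvalues; over $\mathbb{C}$ the eigenvalues are $\pm i$.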
Extracting coefficients
We can now establish that $\det(f)$ is the product of the eigenvalues of $f$ in the algebraic closure of $K$. To see this, we wish to find the degree and leading coefficient of $\chi_f$ without using an explicit formula for the determinant. Consider the case $K = \mathbb{C}$. Then $\det$ is continuous as a polynomial function, and since $\Lambda^n(\lambda g) = \lambda^n \Lambda^n g$ we have
$$\lim_{\lambda \to \infty} \frac{\chi_f(\lambda)}{\lambda^n} = \lim_{\lambda \to \infty} \det\Big(\mathrm{id} - \tfrac{1}{\lambda} f\Big) = \det(\mathrm{id}) = 1,$$
and clearly for higher powers of $\lambda$ this limit becomes zero. Hence $\deg \chi_f = n$ with leading coefficient $1$. Since this holds for all $f$, the equation must also hold for the generic integer polynomial that, evaluated in the coefficients of $f$, determines $\chi_f$. Therefore, this fact about the characteristic polynomial holds over any field.
Let now $K$ be any field. In the algebraic closure of $K$ we can factorize
$$\chi_f(\lambda) = \prod_{i=1}^n (\lambda - \lambda_i)$$
and thus we have
$$\det(f) = (-1)^n \chi_f(0) = \prod_{i=1}^n \lambda_i.$$
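Continuing the rotation example (an added check): the complex eigenvalues $\pm i$ multiply to $i \cdot (-i) = 1$, which is indeed the determinant of the rotation matrix.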
Conclusion
It is surprising how much information can be obtained about determinants without knowing an explicit formula in terms of matrix coefficients. Of course, this is not a good way to go about things, because such proofs leave out most of the algebraic insight. It is still an interesting recreational challenge, and perhaps there is an even more economical way to solve it!