Linear Algebra/Definition of Determinant




Up to date as of January 23, 2010

From Wikibooks, the open-content textbooks collection


In the first chapter of this book we considered linear systems, and we picked out the special case of systems with the same number of equations as unknowns: those of the form  T\vec{x}=\vec{b} where T is a square matrix. We noted a distinction between two classes of matrices T. While such systems may have a unique solution, no solutions, or infinitely many solutions, if a particular T is associated with a unique solution in any one system, such as the homogeneous system  T\vec{x}=\vec{0} , then T is associated with a unique solution for every \vec{b}. We call such a matrix of coefficients "nonsingular". The other kind of T, where every linear system for which it is the matrix of coefficients has either no solution or infinitely many solutions, we call "singular".

Through the second and third chapters the value of this distinction has been a theme. For instance, we now know that nonsingularity of an  n \times n matrix T is equivalent to each of these:

  1. a system  T\vec{x}=\vec{b} has a solution, and that solution is unique;
  2. Gauss-Jordan reduction of T yields an identity matrix;
  3. the rows of T form a linearly independent set;
  4. the columns of T form a basis for  \mathbb{R}^n ;
  5. any map that T represents is an isomorphism;
  6. an inverse matrix  T^{-1} exists.
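The equivalences above can be checked computationally. As a minimal sketch (the matrices here are hypothetical examples, not from the text), the following uses NumPy's rank computation to test item 3, that the rows form a linearly independent set:

```python
import numpy as np

# Hypothetical example matrices, not from the text:
# one nonsingular, one singular.
T_nonsingular = np.array([[2.0, 1.0],
                          [1.0, 3.0]])
T_singular = np.array([[1.0, 2.0],
                       [2.0, 4.0]])   # second row is twice the first

def is_nonsingular(T):
    """A square matrix is nonsingular exactly when its rows are
    linearly independent, i.e. when its rank equals its size (item 3)."""
    n = T.shape[0]
    return np.linalg.matrix_rank(T) == n

print(is_nonsingular(T_nonsingular))  # True
print(is_nonsingular(T_singular))     # False
```

Any of the other listed conditions would serve equally well as the test; for instance, attempting `np.linalg.inv(T_singular)` raises an error, reflecting item 6.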

So when we look at a particular square matrix, the question of whether it is nonsingular is one of the first things that we ask. This chapter develops a formula to determine this. (Since we will restrict the discussion to square matrices, in this chapter we will usually simply say "matrix" in place of "square matrix".)

More precisely, we will develop infinitely many formulas, one for 1 \times 1 matrices, one for 2 \times 2 matrices, etc. Of course, these formulas are related — that is, we will develop a family of formulas, a scheme that describes the formula for each size.
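To preview the family of formulas the chapter develops, here is a sketch of the familiar small cases: the determinant of a 1 \times 1 matrix is its single entry, and the 2 \times 2 formula is ad - bc. The 3 \times 3 case below is written so that it reuses the 2 \times 2 formula, hinting at how the formulas for different sizes are related (the function names are illustrative, not from the text):

```python
def det1(a):
    # 1x1 case: the determinant of [[a]] is just a
    return a

def det2(M):
    # 2x2 case: for [[a, b], [c, d]] the determinant is ad - bc
    (a, b), (c, d) = M
    return a * d - b * c

def det3(M):
    # 3x3 case, expanded along the first row;
    # each term is an entry times a 2x2 determinant
    (a, b, c), (d, e, f), (g, h, i) = M
    return (a * det2([[e, f], [h, i]])
            - b * det2([[d, f], [g, i]])
            + c * det2([[d, e], [g, h]]))

print(det2([[1, 2], [3, 4]]))                   # -2
print(det3([[1, 0, 0], [0, 2, 0], [0, 0, 3]]))  # 6
```

A matrix is nonsingular exactly when its determinant is nonzero, which is what makes this family of formulas answer the question posed above.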

