Geometric Intuition
One good way to measure a linear transformation is by how it changes the area of a given region. You can think of the determinant of a matrix as exactly this measure for the linear transformation the matrix represents.
When a linear transformation is applied to the space, every region in the space changes area by the same factor, called the determinant. For example, if a transformation scales every area by a factor of 3, its determinant is 3.
If a linear transformation sends the area of a region to 0, then the determinant must be 0. This is a special case with important applications, such as when calculating eigenvectors and eigenvalues. We can visualize such a linear transformation as squishing the grid into a lower dimension.
If the orientation of space is inverted, then the determinant is negative. You can use the right-hand/left-hand rule to verify a change in orientation.
The above properties are stated for 2D, but they apply in 3D as well, with volume taking the place of area.
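The area-scaling intuition can be checked numerically. Below is a minimal sketch (the matrix `T` and the helper `det2` are illustrative, not from the notes): the unit square's edge vectors map to the columns of the matrix, so the image parallelogram's area is scaled by the determinant.

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Hypothetical example transformation: stretch the x-axis by 3.
# The unit square (area 1) maps to a 3-by-1 rectangle (area 3),
# so every region's area is scaled by det(T) = 3.
T = [[3, 0],
     [0, 1]]
print(det2(T))  # 3

# A reflection inverts orientation, so its determinant is negative.
reflection = [[0, 1],
              [1, 0]]
print(det2(reflection))  # -1
```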
Calculation
The determinant of a $2 \times 2$ matrix is calculated as follows:

$$\det \begin{bmatrix} a & b \\ c & d \end{bmatrix} = ad - bc$$
Laplace's Expansion
For larger matrices, such as a $3 \times 3$ matrix, we use Laplace's Expansion. Laplace's Expansion defines the determinant of an $n \times n$ matrix $A$ recursively as

$$\det(A) = \sum_{j=1}^{n} (-1)^{1+j} \, a_{1j} \det(M_{1j})$$

where $M_{1j}$ is the $(n-1) \times (n-1)$ minor obtained by deleting row $1$ and column $j$ of $A$.
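Laplace's Expansion translates directly into a short recursive function. This is a sketch (expanding along the first row; the example matrix is made up for illustration):

```python
def det(m):
    """Determinant by Laplace expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor M_1j: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        # Cofactor sign alternates: (-1)^(1+j) with 0-based j is (-1)^j.
        total += (-1) ** j * m[0][j] * det(minor)
    return total

print(det([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 10]]))  # -3
```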
Triangle Method
If we have an upper or lower triangular matrix (upper triangular shown below)

$$U = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a_{22} & a_{23} \\ 0 & 0 & a_{33} \end{bmatrix}$$

then its determinant is the product along the diagonal:

$$\det(U) = a_{11} a_{22} a_{33}$$

For a general matrix, we convert to row echelon (triangular) form, multiply along the diagonal, and account for the determinants of any elementary matrices used in the conversion.
This is just an extension of Laplace's Expansion: expanding repeatedly along the first column, every term below the diagonal is zero.
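Here is a small numerical check of the Triangle Method (the triangular matrix `U` is an assumed example): the diagonal product agrees with a full cofactor expansion, because every term involving a below-diagonal entry vanishes.

```python
# Assumed upper triangular example for illustration.
U = [[2, 1, 3],
     [0, 4, 5],
     [0, 0, 6]]

# Triangle Method: multiply straight down the diagonal.
diag_product = U[0][0] * U[1][1] * U[2][2]

# Cofactor expansion along the first column keeps only the first term,
# since the entries below the diagonal are zero.
full = U[0][0] * (U[1][1] * U[2][2] - U[1][2] * U[2][1])

print(diag_product, full)  # 48 48
```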
Determinant of Matrix Product
The determinant of the product of two matrices is just the product of their determinants:

$$\det(AB) = \det(A)\det(B)$$
Proof
To prove this, we start by proving the above property for elementary matrices.
Row Swapping
If $E$ is a row-swapping elementary matrix and $A$ is any matrix, then

$$\det(EA) = -\det(A) = \det(E)\det(A)$$

since $\det(E) = -1$.
Our lord and savior Chris Ge provides both an informal and formal proof. Since I hate collared shirts, the informal proof is as follows:
- Swapping two adjacent rows -> determinant is multiplied by $-1$ (by Laplace's Expansion)
- Swapping two non-adjacent rows $i$ and $j$ is the same as making $2(j - i) - 1$ adjacent swaps, an odd number -> determinant is again multiplied by $-1$
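A quick numerical sanity check of the sign flip (the matrix is an assumed example):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 2],
     [3, 4]]            # det(A) = 1*4 - 2*3 = -2
swapped = [A[1], A[0]]  # swap the two rows

print(det2(A), det2(swapped))  # -2 2
```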
Row Multiplication
Given an elementary matrix $E$ that multiplies a row of a matrix $A$ by a factor $k$, the determinant of $EA$ is

$$\det(EA) = k\det(A)$$

or equivalently

$$\det(EA) = \det(E)\det(A)$$

We know that the determinant of $E$ is just $k$ by the Triangle Method.
The proof of this formula is pretty intuitive; if you use Laplace's Expansion along the row of $A$ that was scaled by the factor $k$, then every term in the expansion, and hence the determinant, is multiplied by $k$.
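Checking the row-scaling rule numerically (matrix and factor are assumed examples):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 2],
     [3, 4]]                            # det(A) = -2
k = 5
scaled = [[k * x for x in A[0]], A[1]]  # multiply row 1 by k

print(det2(scaled), k * det2(A))  # -10 -10
```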
Row Addition/Subtraction
Given an elementary matrix $E$ that performs row addition within a matrix $A$ (adding a multiple of one row to another), the determinant of $EA$ is just

$$\det(EA) = \det(A)$$

or equivalently

$$\det(EA) = \det(E)\det(A)$$

since $\det(E) = 1$ by the Triangle Method.
This is because the determinant, according to Laplace's Expansion, is a linear function of each row of the matrix. Adding a multiple of one row to another therefore adds a term equal to the determinant of a matrix with two proportional rows, which is $0$, so the extra contributions cancel out.
Here's an example. Let

$$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$

The determinant is

$$\det(A) = ad - bc$$

Now, if we add two of row 1 to row 2,

$$\begin{bmatrix} a & b \\ c + 2a & d + 2b \end{bmatrix}$$

The determinant is

$$a(d + 2b) - b(c + 2a) = ad + 2ab - bc - 2ab = ad - bc = \det(A)$$

Thus ending our proof.
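The worked example above can also be checked with concrete numbers (the matrix values are assumed for illustration):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 2],
     [3, 4]]
# Add 2 * (row 1) to row 2, as in the example: row 2 becomes [5, 8].
added = [A[0], [A[1][j] + 2 * A[0][j] for j in range(2)]]

print(det2(A), det2(added))  # -2 -2
```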
Combining Elementary Matrices
Now that we have our above lemmas, we can just combine them all to say that for any matrices $A$ and $B$, as long as one of them is invertible,

$$\det(AB) = \det(A)\det(B)$$

This is because an invertible matrix can be written as the product of a series of elementary matrices, allowing us to expand the determinant of the product into a product of many determinants of elementary matrices. (If neither $A$ nor $B$ is invertible, then $AB$ is also not invertible and both sides are $0$, so the formula holds for all square matrices.)
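Verifying the product rule on a pair of small matrices (both matrices are assumed examples; `matmul2` is an illustrative helper):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(a, b):
    """Product of two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]  # det(A) = -2
B = [[5, 6], [7, 8]]  # det(B) = -2

print(det2(matmul2(A, B)), det2(A) * det2(B))  # 4 4
```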
Properties
- If $A$ is invertible, then $\det(A^{-1}) = \frac{1}{\det(A)}$
- If one row of $A$ is a multiple of another row, then $\det(A) = 0$
- If $A$ has a row or column of all zeroes, then $\det(A) = 0$
- If $\det(A) = 0$, then $A$ is not invertible
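These properties can all be spot-checked on $2 \times 2$ matrices (the matrices are assumed examples; the explicit $2 \times 2$ inverse formula is used for the first property):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

print(det2([[0, 0], [3, 4]]))  # 0: a row of all zeroes
print(det2([[1, 2], [2, 4]]))  # 0: row 2 is twice row 1

# Invertible case: the 2x2 inverse is (1/det) * [[d, -b], [-c, a]].
A = [[1, 2], [3, 4]]  # det(A) = -2
d = det2(A)
A_inv = [[A[1][1] / d, -A[0][1] / d],
         [-A[1][0] / d, A[0][0] / d]]

print(det2(A_inv), 1 / det2(A))  # -0.5 -0.5
```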