
Linear Algebra: Notes for MATH 341

Section 4.2 Invertible Matrices

In this section, we will only consider square matrices. A matrix \(A \in M_{n \times n}\) is invertible if there exists a matrix \(B\) such that \(AB=Id_n\) and \(BA=Id_n\text{.}\) The inverse matrix of \(A\) is denoted \(A^{-1}\text{.}\) Be careful that you do not use the notation \(A^{-1}\) until you have shown that \(A\) is invertible.

Subsection Elementary Matrices

Recall that an elementary row operation on a matrix is an operation of the form:
  • multiplying a row by a non-zero scalar
  • switching two rows
  • adding a multiple of one row to another row
Elementary matrices are obtained by performing an elementary operation on the identity matrix.
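If you would like to experiment alongside the questions below, here is a small Python sketch (Python is not part of these notes; the helper names `identity` and `matmul` are ours). It builds the elementary matrix for "add 3 times the 2nd row to the 1st row" by performing that operation on \(Id_3\text{,}\) then left-multiplies a general matrix by it:

```python
from fractions import Fraction

def identity(n):
    """n x n identity matrix as nested lists of Fractions."""
    return [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Elementary matrix: perform "add 3 times row 2 to row 1" on Id_3.
# (Rows are 0-indexed in the code, so row 2 of the matrix is index 1.)
E = identity(3)
E[0][1] = Fraction(3)

A = [[Fraction(x) for x in row] for row in [[1, 2, 3], [4, 5, 6], [7, 8, 9]]]

# Left-multiplying by E performs the same row operation on A:
# the first row of EA is (row 1 of A) + 3 * (row 2 of A).
EA = matmul(E, A)
```

Try computing `matmul(A, E)` as well and compare; only one of the two sides reproduces the row operation.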

Question 4.19.

Give the elementary matrix obtained by performing the given operation on \(Id_3\text{.}\) (These are 4 separate questions):
(a)
Scaling the first row by \(\alpha\)
(b)
Switching the second and third rows
(c)
Adding 3 times the 2nd row to the 1st row
(d)
Adding 3 times the 1st row to the 2nd row

Question 4.20.

Check that your answer to the previous question does the desired operation by multiplying each of the four previous elementary matrices by \(\begin{bmatrix} a\amp b\amp c\\d\amp e\amp f\\g\amp h\amp i \end{bmatrix}\text{.}\) Which side do you multiply the elementary matrix on to correspond to row operations?

Question 4.21.

Compute (and verify) the inverse of each of the elementary matrices from the previous problems.
Hint.
Think about how you would go backwards for each of the elementary operations.
Your work on the previous questions should convince you that elementary matrices are invertible and that multiplying by an elementary matrix produces the same result as performing the corresponding elementary row operation. Elementary matrices thus offer a way of keeping track of elementary operations. We will not write out a proof of the following theorem at this time, but we state it for future use:
You should, however, prove the theorems below at this time.

Question 4.22.

Give all values of \(k\) where \(A=\begin{bmatrix} 1\amp 0\amp 2\\-1\amp k\amp 4\\3\amp 5\amp 1 \end{bmatrix}\) will be invertible.

Question 4.23.

Give all values of \(k\) where \(A=\begin{bmatrix} 1\amp 0\amp 2\\-1\amp k\amp 4\\3\amp -1\amp 1 \end{bmatrix}\) will be invertible.

Question 4.24.

How many pivots must a matrix \(A\) have in order to be row reducible to \(Id_n\text{?}\) Justify using previous results.

Question 4.25.

Prove or disprove: If \(A\) and \(B\) are invertible \(n\) by \(n\) matrices, then \(A+B\) is invertible.

Question 4.26.

Prove that if \(A\) is invertible, then \(A^T\) is invertible.

Subsection Computing Inverses

In general, computing the inverse of a matrix takes more time and more operations than solving a system of equations. For this reason, it is usually better to set up and solve a related system of equations than to compute the inverse matrix. We will outline a few ways to find inverse matrices and compute a few small examples.
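To make the point above concrete, here is a Python sketch (the function name `solve` is ours) that solves \(A\vec{x} = \vec{b}\) by row reducing the augmented matrix \([ A \quad | \quad \vec{b}]\) directly, with no inverse ever computed. Exact rational arithmetic via `Fraction` avoids rounding error:

```python
from fractions import Fraction

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination on [A | b].
    Assumes A is square and invertible. Cheaper than first computing
    A^{-1} and then forming the product A^{-1} b."""
    n = len(A)
    # Augmented matrix [A | b] with exact rational entries.
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        # Find a row at or below position `col` with a nonzero pivot entry.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]          # switch two rows
        M[col] = [x / M[col][col] for x in M[col]]   # scale the pivot row
        for r in range(n):                           # clear the rest of the column
            if r != col and M[r][col] != 0:
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    # After reduction M is [Id_n | x]; the last column is the solution.
    return [M[i][n] for i in range(n)]
```

For example, `solve([[2, 1], [1, 3]], [5, 10])` solves the system \(2x+y=5,\ x+3y=10\text{.}\)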

Question 4.27.

If a matrix \(A\) is row reduced to \(Id_n\) by elementary row operations corresponding (in order of use) to elementary matrices \(E_1\text{,}\) \(E_2\text{,}\) ... , \(E_k\text{,}\) give an expression for \(A^{-1}\text{.}\)

Question 4.28.

Use your answer to the previous question to prove the following: Any sequence of elementary row operations that reduces \(A\) to \(Id_n\) also transforms \(Id_n\) into \(A^{-1}\text{.}\)
The previous result shows that computing inverses is equivalent to a row reduction problem. In particular, if \(A\) is invertible, then reducing \([ A \quad | \quad Id_n]\) to reduced row echelon form will produce the matrix \([ Id_n \quad | \quad A^{-1}]\text{.}\)
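The reduction of \([ A \quad | \quad Id_n]\) to \([ Id_n \quad | \quad A^{-1}]\) described above can be carried out mechanically. Here is a Python sketch of that procedure (the function name `inverse` is ours, and this is one straightforward implementation, not the only one):

```python
from fractions import Fraction

def inverse(A):
    """Compute A^{-1} by row reducing the augmented matrix [A | Id_n].
    Raises ValueError if A is not invertible."""
    n = len(A)
    # Build [A | Id_n] with exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        # A column with no usable pivot means A cannot reduce to Id_n.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            raise ValueError("matrix is not invertible")
        M[col], M[pivot] = M[pivot], M[col]          # switch two rows
        M[col] = [x / M[col][col] for x in M[col]]   # scale the pivot row
        for r in range(n):                           # clear the rest of the column
            if r != col and M[r][col] != 0:
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    # M is now [Id_n | A^{-1}]; return the right half.
    return [row[n:] for row in M]
```

You can check any output by verifying that \(A A^{-1} = Id_n\) with a matrix multiplication.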

Question 4.29.

Use the idea above to compute the inverse of \(\begin{bmatrix} a\amp b\\c\amp d \end{bmatrix}\text{.}\) Be sure to note any assumptions you will need to make in order to reduce \([ A \quad | \quad Id_n]\) to \([ Id_n \quad | \quad A^{-1}]\text{.}\)

Exercise 4.15.

If \(A=\begin{bmatrix}1\amp 0\amp 1 \\0\amp 2\amp -1 \\ 0\amp 6\amp -1\end{bmatrix}\text{,}\) find \(A^{-1}\) and check that \(A A^{-1}=Id_3\text{.}\)

Exercise 4.16.

If \(A=\begin{bmatrix} 0\amp -1\\3\amp 4 \end{bmatrix}\text{,}\) find \(A^{-1}\) and use your answer to solve \(A\vec{x} = \vec{b}\) if:
(a)
\(\vec{b} =\colvec{3\\ 1}\)
(b)
\(\vec{b} =\colvec{-1\\ -2}\)
(c)
\(\vec{b} =\colvec{0\\ 5}\)
(d)
\(\vec{b} =\colvec{\alpha\\ \beta}\)

Subsection Invertible Matrix Theorem

Question 4.30.

In many texts there is a long list of equivalent conditions for when a square matrix is invertible. Below is a list of some of these conditions that we have talked about or proven. Go back through your notes and questions and cite where we connected two of the ideas in the list. For instance, parts (a) and (b) are linked by Theorem 4.13.
Before stating this major theorem, we should explain what the phrase “the following are equivalent” (sometimes written “TFAE” in scratchwork or on the board) means. A theorem of this type is essentially a giant if-and-only-if theorem: either every statement in the theorem is true, or every statement is false. It is not possible for some to be true and some to be false. In a theorem with, say, three statements, we often prove that statement 1 implies statement 2, statement 2 implies statement 3, and statement 3 implies statement 1. Then you can start at any statement and reach any other statement, showing that if one is true, all the others must be true. However, with longer lists, we sometimes have to prove things a bit more piecemeal.

Question 4.31.

Two important ideas in this course that have been tied to many different methods or ideas are 1) consistent systems of linear equations and 2) invertible matrices. These two ideas are a bit different though. Give an example of a consistent system of linear equations (in matrix equation form \(A\vec{x} = \vec{b}\)) where the coefficient matrix \(A\) is a non-invertible square matrix.
We close this section with a theorem that should not be surprising based on the work that we have done so far: a square matrix \(A\) is invertible if and only if \(A\) can be written as a product of elementary matrices. A proof is provided for you. The criterion this theorem states can be added to the list of statements in The Invertible Matrix Theorem.

Proof.

\(\Rightarrow\).
Since \(A\) is invertible, there exist elementary matrices that row reduce \(A\) to the identity matrix. That is, we have elementary matrices \(E'_i\) such that
\begin{equation*} E'_1\cdots E'_kA = Id_n\text{.} \end{equation*}
Each elementary matrix is invertible, so we can write
\begin{equation*} A = (E'_k)^{-1}\cdots (E'_2)^{-1}(E'_1)^{-1}\text{.} \end{equation*}
As the inverse of an elementary matrix is an elementary matrix, the right-hand side is a product of elementary matrices as desired.
\(\Leftarrow\).
If we have \(A = E_1E_2\cdots E_k\text{,}\) we can multiply one-by-one on the left by the inverses of the elementary matrices, which are also elementary matrices. Thus, we have
\begin{equation*} (E_k)^{-1}\cdots(E_2)^{-1}(E_1)^{-1}A = Id_n\text{.} \end{equation*}
This shows that there is a way to row reduce \(A\) and obtain the identity matrix, so \(A\) is invertible.
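The proof above is constructive, and we can trace it numerically. The Python sketch below (the helper names are ours) row reduces an invertible matrix to \(Id_n\) while recording an elementary matrix for each step, then checks that their product applied to \(A\) really is the identity:

```python
from fractions import Fraction

def identity(n):
    return [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def elementary_factors(A):
    """Row reduce A to Id_n, recording an elementary matrix per step.
    Returns [E_1, ..., E_k] in order of use, so E_k ... E_1 A = Id_n.
    Assumes A is invertible."""
    n = len(A)
    M = [[Fraction(x) for x in row] for row in A]
    steps = []

    def apply(E):
        nonlocal M
        M = matmul(E, M)   # each step is a left-multiplication
        steps.append(E)

    for col in range(n):
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        if pivot != col:                      # switch two rows
            E = identity(n)
            E[col], E[pivot] = E[pivot], E[col]
            apply(E)
        if M[col][col] != 1:                  # scale a row by a nonzero scalar
            E = identity(n)
            E[col][col] = 1 / M[col][col]
            apply(E)
        for r in range(n):                    # add a multiple of one row to another
            if r != col and M[r][col] != 0:
                E = identity(n)
                E[r][col] = -M[r][col]
                apply(E)
    return steps
```

Inverting each recorded \(E_i\) and multiplying in the reverse order reconstructs \(A\text{,}\) exactly as in the forward direction of the proof.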