SECTION 4.4

  1. Let \(M = \left(\begin{array}{cc}A&B\\O&C\end{array}\right) \in {\mathsf{\small M}}_{n \times n}\), where A is an \(r \times r\) matrix, C is an \(s \times s\) matrix, O is the \(s \times r\) zero matrix, and \(n = r + s\). Suppose first that A is not invertible. Then the rank of A is less than r, and so the columns of A are linearly dependent by Theorem 3.5. In addition, \(\det(A) = 0\) by the corollary to Theorem 4.6. Because the entries of the first r columns of M below row r are all zero, any dependence relation among the columns of A carries over to the first r columns of M. Thus these columns are linearly dependent, the rank of M is less than \(r + s = n\), and so \(\det(M) = 0\). Hence \(\det(M) = 0 = \det(A) \cdot \det(C)\).
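  For a concrete instance of this case (the particular matrices here are chosen only for illustration), take \(r = 2\), \(s = 1\), and \[ M = \left(\begin{array}{ccc}1&2&1\\2&4&1\\0&0&5\end{array}\right), \qquad \mbox{so that} \qquad A = \left(\begin{array}{cc}1&2\\2&4\end{array}\right) \quad \mbox{and} \quad C = (5). \] The dependence \(2(\mbox{column 1}) - (\mbox{column 2}) = 0\) among the columns of A extends to the first two columns of M because their third entries are zero. Consequently the rank of M is at most \(2 < 3\), and indeed \(\det(M) = 0 = \det(A) \cdot \det(C)\).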
  2. Now suppose that A is invertible, and define \[ P = \left(\begin{array}{cc}A&O^{\thinspace\prime}\\O&I_s\end{array}\right) \qquad \mbox{and} \qquad Q = \left(\begin{array}{cc}I_r&A^{-1}B\\O&C\end{array}\right), \] where \(O^{\thinspace\prime}\) is the \(r \times s\) zero matrix. Because the row i, column j entry of PQ is the sum of the products of corresponding entries from row i of P and column j of Q, we have \(PQ = M\). A straightforward induction argument on s using cofactor expansion along the last row of P gives \(\det(P) = \det(A)\). Similarly, an induction argument on r using cofactor expansion along the first column of Q gives \(\det(Q) = \det(C)\). Hence \[ \det(M) = \det(PQ) = \det(P) \cdot \det(Q) = \det(A) \cdot \det(C), \] completing the proof.
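  To see the factorization at work in a small case (again with matrices chosen only for illustration), take \(A = \left(\begin{array}{cc}1&1\\0&2\end{array}\right)\), \(B = \left(\begin{array}{c}1\\1\end{array}\right)\), and \(C = (3)\), so that \(A^{-1}B = \left(\begin{array}{c}1/2\\1/2\end{array}\right)\). Then \[ PQ = \left(\begin{array}{ccc}1&1&0\\0&2&0\\0&0&1\end{array}\right) \left(\begin{array}{ccc}1&0&1/2\\0&1&1/2\\0&0&3\end{array}\right) = \left(\begin{array}{ccc}1&1&1\\0&2&1\\0&0&3\end{array}\right) = M. \] Here P and Q happen to be upper triangular, so each determinant is the product of the diagonal entries: \(\det(P) = 2 = \det(A)\) and \(\det(Q) = 3 = \det(C)\), and indeed \(\det(M) = 6 = \det(A) \cdot \det(C)\).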