Matrices 11: Matrices and simultaneous equations

Overview:
  1. Introduction to matrices
  2. Adding and subtracting matrices
  3. Multiplying matrices
  4. 2 × 2 Matrices and linear transformations
  5. Determinants of 2 × 2 matrices
  6. Inverses of 2 × 2 matrices
  7. Invariant points and lines in 2 dimensions
  8. 3 × 3 Matrices and linear transformations
  9. Determinants of 3 × 3 matrices
  10. Inverses of 3 × 3 matrices
  11. Matrices and simultaneous equations

 Part 11: Matrices and simultaneous equations


Solving simultaneous equations

We can use our knowledge of matrix multiplication and inverse matrices to solve simultaneous equations. For example, consider this pair of simultaneous equations:

\(4x + 5y = 37\\2x + 3y = 19\)

These can be rewritten as the product of a \(\color{red}{\text{coefficient matrix}}\) and a \(\color{green}{\text{column vector}}\) of unknowns, set equal to a column vector of constants: \(\color{red}{ \begin{pmatrix} 4 & 5\\ 2 & 3 \\\end{pmatrix}}\color{green}{\begin{pmatrix} x\\ y \\\end{pmatrix}}=\begin{pmatrix} 37\\ 19 \\\end{pmatrix}\)

Remember, solving the pair of simultaneous equations in the above case means finding the values of \(x\) and \(y\) that satisfy both equations. We can do this by left-multiplying both sides of the matrix equation by the inverse of \(\color{red}{\begin{pmatrix} 4 & 5\\ 2 & 3 \\\end{pmatrix}}\). Since the determinant is \(4 \times 3 - 5 \times 2 = 2\), this inverse is \(\color{blue}{\frac{1}{2}\begin{pmatrix} 3 & -5\\ -2 & 4 \\\end{pmatrix}}\):

\(\color{blue}{\frac{1}{2}\begin{pmatrix} 3 & -5\\ -2 & 4 \\\end{pmatrix}}\color{red}{\begin{pmatrix} 4 & 5\\ 2 & 3 \\\end{pmatrix}}\color{green}{\begin{pmatrix} x\\ y \\\end{pmatrix}}=\color{blue}{ \frac{1}{2}\begin{pmatrix} 3 & -5\\ -2 & 4 \\\end{pmatrix}} \begin{pmatrix} 37\\ 19 \\\end{pmatrix}\)

Since the red and blue matrices are inverses of each other (and therefore multiply to give the identity matrix), this simplifies to:

\(\color{green}{\begin{pmatrix} x\\ y \\\end{pmatrix}}=\color{blue}{ \frac{1}{2}\begin{pmatrix} 3 & -5\\ -2 & 4 \\\end{pmatrix}} \begin{pmatrix} 37\\ 19 \\\end{pmatrix}\)

Multiplying out the right-hand side, we find:

\(\color{green}{\begin{pmatrix} x\\ y \\\end{pmatrix}}=\frac{1}{2}\begin{pmatrix} 3 \times 37 - 5 \times 19\\ -2 \times 37 + 4 \times 19 \\\end{pmatrix}=\begin{pmatrix} 8\\ 1 \\\end{pmatrix}\)

That is, \(x = 8\) and \(y = 1\), which you can check satisfies both of the original equations.
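For readers who want to check this kind of working numerically, here is a minimal sketch in Python using NumPy (the choice of library is an illustration, not part of the course). It builds the coefficient matrix, left-multiplies the column of constants by its inverse, and reproduces the solution above:

```python
import numpy as np

# Coefficient matrix and column of constants from the example above
A = np.array([[4.0, 5.0],
              [2.0, 3.0]])
b = np.array([37.0, 19.0])

# Left-multiply the constants by the inverse of A
A_inv = np.linalg.inv(A)   # (1/2) * [[3, -5], [-2, 4]]
solution = A_inv @ b

print(solution)            # [8. 1.]  ->  x = 8, y = 1
```

In practice `np.linalg.solve(A, b)` is usually preferred to forming the inverse explicitly, but the version above mirrors the working shown in this section.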

 

Graphical visualisations

In part 6, we were introduced to the idea that not all matrices have inverses: for a matrix \(M\) to have an inverse, we need \(|M| \neq 0\), i.e. we need \(M\) to be non-singular. When the coefficient matrix is non-singular, we can find a unique solution to the set of linear simultaneous equations. A non-singular \(2 \times 2\) coefficient matrix corresponds to a pair of equations whose lines intersect at exactly one point, with this point defining the solution:

[Interactive applet: two lines intersecting at a single point]

A non-singular \(3 \times 3\) matrix corresponds to a set of three planes that intersect at exactly one point as shown. It might be easier to appreciate how three planes can intersect at this point by starting with just one plane (the red one), and then introducing the next two planes one at a time:

[Interactive applet: three planes intersecting at a single point, with the planes introduced one at a time]

Note that you can click and drag in the applet to rotate your view.
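The same method extends directly to three equations in three unknowns. The system in the sketch below is a made-up example (it does not come from this course) whose \(3 \times 3\) coefficient matrix is non-singular, so its three planes meet at a single point:

```python
import numpy as np

# x + y + z = 6,   2x - y + z = 3,   x + 2y + 3z = 14
A = np.array([[1.0, 1.0, 1.0],
              [2.0, -1.0, 1.0],
              [1.0, 2.0, 3.0]])
b = np.array([6.0, 3.0, 14.0])

print(np.linalg.det(A))        # about -5, so A is non-singular
print(np.linalg.inv(A) @ b)    # [1. 2. 3.]  ->  the planes meet at (1, 2, 3)
```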

What if the determinant is 0?

If the coefficient matrix has a determinant of 0, the simultaneous equations have either no solutions or an infinite number of solutions. When there are no solutions, we say that the set of simultaneous equations is inconsistent. (Note that when there are infinitely many solutions, the equations are still consistent, so a coefficient matrix with a determinant of 0 does not necessarily imply that the set of simultaneous equations is inconsistent.) For example, \(2x + 3y = 5\) and \(4x + 6y = 10\) describe the same line, so they have infinitely many solutions, whereas \(2x + 3y = 5\) and \(4x + 6y = 7\) describe parallel lines, so they have no solutions; in both cases the coefficient matrix has a determinant of 0. The following applets graphically illustrate the scenarios (in 2D and 3D) in which a coefficient matrix can have a determinant of 0:

2\(\times\)2 matrices
[Interactive applet: determinant-zero scenarios for 2\(\times\)2 coefficient matrices]
3\(\times\)3 matrices

Click each scenario to see a graphical illustration. Note that you can click and drag in the applet to rotate your view.

[Interactive applet: determinant-zero scenarios for 3\(\times\)3 coefficient matrices]
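Finally, here is a short sketch (again Python/NumPy, purely as an illustration) of the two determinant-zero scenarios for a \(2 \times 2\) coefficient matrix, using the example lines mentioned above. Comparing the rank of the coefficient matrix with the rank of the augmented matrix separates the inconsistent case (no solutions) from the consistent case with infinitely many solutions:

```python
import numpy as np

# Coefficient matrix with determinant 0: the second row is twice the first
A = np.array([[2.0, 3.0],
              [4.0, 6.0]])
print(np.linalg.det(A))                    # 0.0 (to floating-point precision)

for b in (np.array([5.0, 10.0]),           # coincident lines
          np.array([5.0, 7.0])):           # parallel lines
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rank_A == rank_aug:
        print(b, "-> consistent: infinitely many solutions")
    else:
        print(b, "-> inconsistent: no solutions")
```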