A set of simultaneous equations may be written as

$$\sum_{j=1}^{n} a_{ij}\, x_j \;=\; y_i , \qquad i = 1, \dots, n \tag{1}$$

or, in matrix notation,

$$\mathbf{A}\,\mathbf{x} \;=\; \mathbf{y} \tag{2}$$

Another set of simultaneous equations which arises frequently in practice is the so-called homogeneous equations

$$\mathbf{A}\,\mathbf{x} \;=\; \mathbf{0} \tag{3}$$

This set always has the solution $\mathbf{x} = \mathbf{0}$, which is
often called the trivial solution.
For (3) to have nontrivial
solution values for $\mathbf{x}$, the determinant of $\mathbf{A}$ should vanish,
meaning that
the columns of $\mathbf{A}$ do not span an *n*-dimensional space.
We will return later
to the subject of actually solving sets of simultaneous equations.
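As a quick illustration (a Python sketch with an invented 2 × 2 example, not from the text), a matrix whose determinant vanishes admits a nontrivial solution to the homogeneous equations:

```python
# Homogeneous equations A x = 0: a nontrivial solution exists
# exactly when the determinant of A vanishes.

def det2(a):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return a[0][0] * a[1][1] - a[0][1] * a[1][0]

def matvec2(a, x):
    """Product of a 2x2 matrix with a 2-vector."""
    return [a[0][0] * x[0] + a[0][1] * x[1],
            a[1][0] * x[0] + a[1][1] * x[1]]

# Singular matrix: the second row is twice the first, so the columns
# are linearly dependent and the determinant vanishes.
a = [[1.0, 2.0],
     [2.0, 4.0]]
assert det2(a) == 0.0

# A nontrivial null vector (any scalar multiple also works).
x = [2.0, -1.0]
assert matvec2(a, x) == [0.0, 0.0]
```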

A most useful feature of matrices is that their elements may be not only numbers but also other matrices. Viewed differently, a big matrix may be partitioned into smaller submatrices. A surprising thing is that the product of two matrices is the same whether or not the factors are partitioned, provided the partitions are conformable. Study the identity

$$(\mathbf{A}\mathbf{B})_{ij} \;=\; \sum_k a_{ik}\, b_{kj} \tag{4}$$

$$\mathbf{A} \;=\; \begin{bmatrix} \mathbf{A}_{11} & \mathbf{A}_{12} \\ \mathbf{A}_{21} & \mathbf{A}_{22} \end{bmatrix}, \qquad \mathbf{B} \;=\; \begin{bmatrix} \mathbf{B}_{11} & \mathbf{B}_{12} \\ \mathbf{B}_{21} & \mathbf{B}_{22} \end{bmatrix} \tag{5}$$

$$\mathbf{A}\,\mathbf{B} \;=\; \begin{bmatrix} \mathbf{A}_{11}\mathbf{B}_{11} + \mathbf{A}_{12}\mathbf{B}_{21} & \mathbf{A}_{11}\mathbf{B}_{12} + \mathbf{A}_{12}\mathbf{B}_{22} \\ \mathbf{A}_{21}\mathbf{B}_{11} + \mathbf{A}_{22}\mathbf{B}_{21} & \mathbf{A}_{21}\mathbf{B}_{12} + \mathbf{A}_{22}\mathbf{B}_{22} \end{bmatrix} \tag{6}$$
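The block-multiplication rule can be spot-checked with a short script (a Python sketch; the matrices and partition are arbitrary examples, not from the text):

```python
# Check that multiplying partitioned matrices block by block gives the
# same answer as multiplying the full matrices.

def matmul(a, b):
    """Ordinary matrix product of two lists-of-rows."""
    n, m, p = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def madd(a, b):
    """Elementwise sum of two matrices."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def block(a, rows, cols):
    """Extract the submatrix with the given row and column index lists."""
    return [[a[i][j] for j in cols] for i in rows]

a = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
b = [[1, 0, 2], [0, 1, 1], [3, 1, 0]]

# Partition both matrices conformably: rows {0,1}|{2}, columns {0,1}|{2}.
p1, p2 = [0, 1], [2]
a11, a12 = block(a, p1, p1), block(a, p1, p2)
b11, b21 = block(b, p1, p1), block(b, p2, p1)

# Top-left block of the product, computed from the partitions (eq. 6 pattern):
topleft = madd(matmul(a11, b11), matmul(a12, b21))

full = matmul(a, b)
assert topleft == [row[:2] for row in full[:2]]
```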

We now utilize matrix partitioning
to develop the bordering method of matrix inversion.
The bordering method is not the fastest or the most accurate method,
but it is quite simple,
even for nonsymmetric complex-valued matrices,
and it also gives the determinant and works for homogeneous equations.
The bordering
method proceeds by recursion.
Given the inverse $\mathbf{B} = \mathbf{A}^{-1}$ of an $n \times n$ matrix $\mathbf{A}$,
the method shows how to find the inverse of an $(n+1) \times (n+1)$ matrix,
which is the same old matrix with an additional row and column
attached to its borders.
Specifically, $\mathbf{A}$, $\mathbf{b}$, $\mathbf{c}^T$, $g$, and $\mathbf{B}$
are taken to be known in (7).
The task is to find $\mathbf{E}$, $\mathbf{r}$, $\mathbf{s}^T$, and *z*.

$$\begin{bmatrix} \mathbf{A} & \mathbf{b} \\ \mathbf{c}^T & g \end{bmatrix} \begin{bmatrix} \mathbf{E} & \mathbf{r} \\ \mathbf{s}^T & z \end{bmatrix} \;=\; \begin{bmatrix} \mathbf{I} & \mathbf{0} \\ \mathbf{0}^T & 1 \end{bmatrix} \tag{7}$$

The first thing to do is multiply the partitions in (7) together. For the first column of the product we obtain

$$\mathbf{A}\,\mathbf{E} + \mathbf{b}\,\mathbf{s}^T \;=\; \mathbf{I} \tag{8}$$

$$\mathbf{c}^T\mathbf{E} + g\,\mathbf{s}^T \;=\; \mathbf{0}^T \tag{9}$$

Since $\mathbf{B} = \mathbf{A}^{-1}$ is known, (8) gives

$$\mathbf{E} \;=\; \mathbf{B} - \mathbf{B}\,\mathbf{b}\,\mathbf{s}^T \tag{10}$$

Substituting (10) into (9) and solving for $\mathbf{s}^T$ gives

$$\mathbf{s}^T \;=\; -\,\mathbf{c}^T\mathbf{B}\,(g - \mathbf{c}^T\mathbf{B}\,\mathbf{b})^{-1} \tag{11}$$

For the second column of the product we obtain

$$\mathbf{A}\,\mathbf{r} + \mathbf{b}\,z \;=\; \mathbf{0} \tag{12}$$

$$\mathbf{c}^T\mathbf{r} + g\,z \;=\; 1 \tag{13}$$

From (12),

$$\mathbf{r} \;=\; -\,\mathbf{B}\,\mathbf{b}\,z \tag{14}$$

and substituting (14) into (13) gives

$$z \;=\; (g - \mathbf{c}^T\mathbf{B}\,\mathbf{b})^{-1} \tag{15}$$

Let us summarize the recursion:
One begins with the upper left-hand corner of a matrix.
The corner is a scalar and its inverse is trivial.
Then it is
considered to be bordered by a row and a column as shown in (7).
Next, we find the inverse of this matrix.
The process is continued as long as one likes.
A typical step is first to compute *z* by (15)
and then to compute the inverse of one larger size by

$$\begin{bmatrix} \mathbf{E} & \mathbf{r} \\ \mathbf{s}^T & z \end{bmatrix} \;=\; \begin{bmatrix} \mathbf{B} & \mathbf{0} \\ \mathbf{0}^T & 0 \end{bmatrix} + z \begin{bmatrix} \mathbf{B}\,\mathbf{b} \\ -1 \end{bmatrix} \begin{bmatrix} \mathbf{c}^T\mathbf{B} & -1 \end{bmatrix} \tag{16}$$
      SUBROUTINE CMAINE (N,B,A)
C     A = MATRIX INVERSE OF B
      COMPLEX B,A,C,R,DEL
      DIMENSION A(N,N),B(N,N),R(100),C(100)
      DO 10 I=1,N
      DO 10 J=1,N
   10 A(I,J)=0.
      DO 40 L=1,N
      DEL=B(L,L)
      DO 30 I=1,L
      C(I)=0.
      R(I)=0.
      DO 20 J=1,L
      C(I)=C(I)+A(I,J)*B(J,L)
   20 R(I)=R(I)+B(L,J)*A(J,I)
   30 DEL=DEL-B(L,I)*C(I)
      C(L)=-1.
      R(L)=-1.
      DO 40 I=1,L
      C(I)=C(I)/DEL
      DO 40 J=1,L
   40 A(I,J)=A(I,J)+C(I)*R(J)
      RETURN
      END
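The recursion can also be sketched in Python (an illustrative translation of CMAINE, not part of the original text). Each `delta` plays the role of $z^{-1}$ in (15), and the product of the successive `delta` values is the determinant:

```python
# Matrix inversion by the bordering method, mirroring CMAINE.
# Works for real or complex square matrices given as lists of rows.

def border_invert(b):
    n = len(b)
    a = [[0.0] * n for _ in range(n)]      # grows into the inverse of b
    det = 1.0                              # accumulates the determinant
    for l in range(n):                     # invert the (l+1) x (l+1) corner
        # c = (current inverse) * (new border column), eq. (14) numerator
        c = [sum(a[i][j] * b[j][l] for j in range(l)) for i in range(l)]
        # r = (new border row) * (current inverse), eq. (11) numerator
        r = [sum(b[l][j] * a[j][i] for j in range(l)) for i in range(l)]
        # delta = g - c^T B b = z**-1, eq. (15)
        delta = b[l][l] - sum(b[l][i] * c[i] for i in range(l))
        det *= delta
        c.append(-1.0)
        r.append(-1.0)
        for i in range(l + 1):             # rank-one update, eq. (16)
            for j in range(l + 1):
                a[i][j] += (c[i] / delta) * r[j]
    return a, det

# Small usage check on an invented 2x2 example.
inv, det = border_invert([[4.0, 3.0], [6.0, 3.0]])
assert abs(det + 6.0) < 1e-9               # det of [[4,3],[6,3]] is -6
```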

It is instructive to see what becomes of $\mathbf{A}^{-1}$ if **A** is perturbed
steadily in such a way that **A** becomes singular.
If the element *g* in the matrix of (7) is moved closer and
closer to $\mathbf{c}^T\mathbf{B}\,\mathbf{b}$, then we see from (15) that *z*
tends to infinity.
What is interesting is that the second term in
(16) comes to dominate the first,
and the inverse tends to infinity
times the product of a column **c** with a row $\mathbf{r}^T$.
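Numerically (a Python sketch with hypothetical numbers, not from the text), one can watch the elements of a 2 × 2 inverse blow up while its rows become proportional, i.e., the inverse approaches a column times a row:

```python
# As a 2x2 matrix is perturbed toward singularity, its inverse grows
# without bound and tends to a rank-one (column times row) matrix.

def inv2(a):
    """Inverse of a 2x2 matrix via the adjugate."""
    d = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [[ a[1][1] / d, -a[0][1] / d],
            [-a[1][0] / d,  a[0][0] / d]]

# For [[1, 1], [2, g]] the determinant is g - 2, vanishing as g -> 2.
b = inv2([[1.0, 1.0], [2.0, 2.001]])

# The elements have blown up roughly like 1/(g - 2) ...
assert abs(b[0][0]) > 1000.0
# ... and the two rows are very nearly proportional (rank one):
assert abs(b[0][0] / b[1][0] - b[0][1] / b[1][1]) < 1e-2
```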

The usual expressions $\mathbf{A}\,\mathbf{x} = \mathbf{y}$
and $\mathbf{x} = \mathbf{A}^{-1}\,\mathbf{y}$ in the limit of small $z^{-1}$ tend to

$$\mathbf{x} \;=\; \mathbf{A}^{-1}\,\mathbf{y} \;\approx\; z\,\mathbf{c}\,(\mathbf{r}^T\mathbf{y}) \tag{17}$$

so that the solution is proportional to the column **c** regardless of the right-hand side **y**. Writing $\mathbf{A}\,\mathbf{A}^{-1} = \mathbf{I}$ in this limit as

$$\mathbf{A}\,(z\,\mathbf{c}\,\mathbf{r}^T) \;\approx\; \mathbf{I} \tag{18}$$

and dividing through by *z*, the right-hand side tends to zero, leaving the column homogeneous equations

$$\mathbf{A}\,\mathbf{c} \;=\; \mathbf{0} \tag{19}$$

and the row homogeneous equations

$$\mathbf{r}^T\mathbf{A} \;=\; \mathbf{0}^T \tag{20}$$

In summary, then,
to solve an ordinary set of simultaneous equations like (1),
one may compute the matrix inverse of **A** by the bordering method
and then multiply (1) by $\mathbf{A}^{-1}$, obtaining

$$\mathbf{x} \;=\; \mathbf{A}^{-1}\,\mathbf{y} \tag{21}$$

The row homogeneous equations of (20) were introduced because such a
set arises naturally in solving for the row eigenvectors of a
nonsymmetric matrix.
In the next section, we will go into some detailed properties of eigenvectors.
A column eigenvector **c** of a matrix **A**
is defined by the solution to

$$\mathbf{A}\,\mathbf{c} \;=\; \lambda\,\mathbf{c} \tag{22}$$

or, equivalently, to the homogeneous equations

$$(\mathbf{A} - \lambda\,\mathbf{I})\,\mathbf{c} \;=\; \mathbf{0} \tag{23}$$

Likewise, a row eigenvector $\mathbf{r}^T$ is defined by the solution to

$$\mathbf{r}^T\mathbf{A} \;=\; \lambda\,\mathbf{r}^T \tag{24}$$

or

$$\mathbf{r}^T(\mathbf{A} - \lambda\,\mathbf{I}) \;=\; \mathbf{0}^T \tag{25}$$
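For a concrete check (a Python sketch; the matrix and eigenvalue are invented for illustration), the column and row eigenvectors of a nonsymmetric matrix belonging to the same eigenvalue generally differ:

```python
# Column and row eigenvectors of a nonsymmetric 2x2 matrix.

a = [[2.0, 1.0],
     [0.0, 3.0]]
lam = 3.0                          # an eigenvalue of this triangular matrix

# Column eigenvector:  A c = lam c
c = [1.0, 1.0]
ac = [a[0][0] * c[0] + a[0][1] * c[1],
      a[1][0] * c[0] + a[1][1] * c[1]]
assert ac == [lam * c[0], lam * c[1]]

# Row eigenvector:  r^T A = lam r^T  -- a different vector for the same lam
r = [0.0, 1.0]
ra = [r[0] * a[0][0] + r[1] * a[1][0],
      r[0] * a[0][1] + r[1] * a[1][1]]
assert ra == [lam * r[0], lam * r[1]]
```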

- Indicate the sizes of all the matrices in equations (8) to (15).
- Show how (16) follows from (10), (11), (14), and (15).

10/30/1997