Inverses of 2 × 2 Block Matrices

T.-T. Lu and S.-H. Shiou

Computers and Mathematics with Applications 43 (2002) 119-129
www.elsevier.com/locate/camwa
Abstract. In this paper, the authors give explicit inverse formulae for 2 × 2 block matrices with three different partitions. Then these results are applied to obtain inverses of block triangular matrices and various structured matrices such as Hamiltonian, per-Hermitian, and centro-Hermitian matrices. © 2001 Elsevier Science Ltd. All rights reserved.
1. INTRODUCTION
This paper is devoted to the inverses of 2 × 2 block matrices. First, we give explicit inverse formulae for a 2 × 2 block matrix
$$R = \begin{bmatrix} A & B \\ C & D \end{bmatrix} \tag{1.1}$$
with three different partitions. Then these results are applied to obtain inverses of block triangular matrices and various structured matrices such as bisymmetric, Hamiltonian, per-Hermitian, and centro-Hermitian matrices. In the end, we briefly discuss the completion problems of a 2 × 2 block matrix and its inverse, which generalize the inverse problem.
The inverse of the 2 × 2 block matrix (1.1) appears frequently in many subjects and has long been studied. Its inverse in terms of $A^{-1}$ or $D^{-1}$ can be found in standard textbooks on linear algebra, e.g., [1-3]. However, we give a complete treatment here. Some papers, e.g., [4,5], deal with its inverse in terms of the generalized inverse of A. Needless to say, a lot of research is devoted to the generalized inverse of the 2 × 2 block matrix, e.g., [6-8], but this paper is not in that direction.
There are many related papers on the 2 × 2 block matrix. The Schur complement $D - CA^{-1}B$ of A in (1.1) has been studied by several mathematicians, e.g., [9-11]. Lazutkin [12] studies the signature of a symmetric 2 × 2 block matrix. Bapat and Kwong [13] obtain an inequality for the Schur product of positive definite 2 × 2 block matrices.
This work was supported by the National Science Council of the Republic of China under Contract NSC86-2815-C-110-022. The authors would like to thank the referee for his helpful comments in revising this paper.
This paper has three objectives. First, we completely list all the relevant formulae for a 2 × 2 block matrix and its inverse. Although it is nothing but a mechanical exercise, some of the results do not appear in the literature. Second, we explore matrices with symmetric structures related to a 2 × 2 block matrix.
We present this paper in the traditional way, though our formulae can also be proved using computer algebra systems. In fact, the package NCAlgebra [14] for noncommutative algebra has been used to solve certain 2 × 2 and 3 × 3 block matrix completion problems. The methodology of the solution procedure is explained in [15]. So our final objective is to indicate this fact and encourage further study of these techniques.
Only complex matrices will be considered in this paper, but most of our results can be extended to matrices with elements in an arbitrary field. This paper is organized as follows. In Section 2, we derive several formulae for the inverse of a 2 × 2 block matrix with three different partitions. In Section 3, we apply these results to get the inverses of 2 × 2 block triangular matrices. In Section 4, we apply our formulae to matrices with certain structures. In the last section, we indicate the related completion problems of a 2 × 2 block matrix and its inverse, and the possible computer theorem proving of matrix theory.
2. INVERSE FORMULAE
A nonsingular square matrix R and its inverse $R^{-1}$ can be partitioned into 2 × 2 blocks as
$$R = \begin{bmatrix} A & B \\ C & D \end{bmatrix} \quad\text{and}\quad R^{-1} = \begin{bmatrix} E & F \\ G & H \end{bmatrix}. \tag{2.1}$$
To make the multiplication of R by $R^{-1}$ and $R^{-1}$ by R possible, the sizes of all blocks cannot be arbitrary. Assume A, B, C, and D have sizes k × m, k × n, l × m, and l × n, respectively, with k + l = m + n; then the sizes of E, F, G, and H must be m × k, m × l, n × k, and n × l, respectively. In other words, $R^{-1}$ is in the transposed partition of R.
In this section, we shall write down the formulae for E, F, G, and H in terms of A, B, C, and D. We assume one of the blocks A, B, C, or D is a nonsingular square matrix to avoid generalized inverses. Thus, we have only three possible partitions:
• square diagonal partition: k = m and l = n,
• square off-diagonal partition: k = n and l = m,
• all-square partition: k = l = m = n.
The original matrix R and its inverse $R^{-1}$, of course, must have even dimension in the all-square partition.
First, we consider the square diagonal partition of R and $R^{-1}$. In this case, A, D, E, H are square matrices, A and E have the same size, and so do D and H. The following theorem is well known and can be found in [3, Problem 1.6.7].
THEOREM 2.1.
(i) Assume A is nonsingular; then the matrix R in (2.1) is invertible if and only if the Schur complement $D - CA^{-1}B$ of A is invertible, and
$$R^{-1} = \begin{bmatrix} A^{-1} + A^{-1}B(D - CA^{-1}B)^{-1}CA^{-1} & -A^{-1}B(D - CA^{-1}B)^{-1} \\ -(D - CA^{-1}B)^{-1}CA^{-1} & (D - CA^{-1}B)^{-1} \end{bmatrix}. \tag{2.2}$$
(ii) Assume D is nonsingular; then the matrix R is invertible if and only if the Schur complement $A - BD^{-1}C$ of D is invertible, and
$$R^{-1} = \begin{bmatrix} (A - BD^{-1}C)^{-1} & -(A - BD^{-1}C)^{-1}BD^{-1} \\ -D^{-1}C(A - BD^{-1}C)^{-1} & D^{-1} + D^{-1}C(A - BD^{-1}C)^{-1}BD^{-1} \end{bmatrix}. \tag{2.3}$$
It is clear that these two sets of formulae are used in different situations, and they are equivalent if both A and D are nonsingular. References [1, Theorem 8.2.1] and [2, 0.7.3] give mixed expressions for $R^{-1}$ by combining these two sets of formulae.
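As a quick numerical check of (2.2), the following NumPy sketch may help (it is ours, not part of the original paper; the helper name `inv22_diag` and the random test matrix are arbitrary choices):

```python
import numpy as np

def inv22_diag(A, B, C, D):
    """Inverse of R = [[A, B], [C, D]] via formula (2.2); assumes A and
    the Schur complement D - C A^{-1} B are nonsingular."""
    Ai = np.linalg.inv(A)
    Si = np.linalg.inv(D - C @ Ai @ B)   # (D - C A^{-1} B)^{-1}
    return np.block([[Ai + Ai @ B @ Si @ C @ Ai, -Ai @ B @ Si],
                     [-Si @ C @ Ai,               Si]])

rng = np.random.default_rng(0)
k, l = 3, 2                              # square diagonal partition: A is k x k, D is l x l
R = rng.standard_normal((k + l, k + l))
A, B, C, D = R[:k, :k], R[:k, k:], R[k:, :k], R[k:, k:]
assert np.allclose(inv22_diag(A, B, C, D), np.linalg.inv(R))
```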
There are several ways to prove this theorem, but it is rather trivial in view of the block Gaussian elimination of R [16, Exercise 2.6.15]. More precisely,
$$R = \begin{bmatrix} I & 0 \\ CA^{-1} & I \end{bmatrix}\begin{bmatrix} A & 0 \\ 0 & D - CA^{-1}B \end{bmatrix}\begin{bmatrix} I & A^{-1}B \\ 0 & I \end{bmatrix}.$$
Next, we consider the square off-diagonal partition, in which B, C, F, and G are square. Let J denote the exchange matrix, with ones on the secondary diagonal and zeros elsewhere, so that $J^2 = I$. Then RJ is in the square diagonal partition, and Theorem 2.1 can be applied to RJ to get
$$R^{-1} = J(RJ)^{-1}.$$
But for completeness and later use, we include the resulting formulae here.
THEOREM 2.2.
(i) Assume B is nonsingular; then the matrix R in (2.1) is invertible if and only if the Schur complement $C - DB^{-1}A$ of B is invertible, and
$$R^{-1} = \begin{bmatrix} -(C - DB^{-1}A)^{-1}DB^{-1} & (C - DB^{-1}A)^{-1} \\ B^{-1} + B^{-1}A(C - DB^{-1}A)^{-1}DB^{-1} & -B^{-1}A(C - DB^{-1}A)^{-1} \end{bmatrix}. \tag{2.4}$$
(ii) Assume C is nonsingular; then the matrix R is invertible if and only if the Schur complement $B - AC^{-1}D$ of C is invertible, and
$$R^{-1} = \begin{bmatrix} -C^{-1}D(B - AC^{-1}D)^{-1} & C^{-1} + C^{-1}D(B - AC^{-1}D)^{-1}AC^{-1} \\ (B - AC^{-1}D)^{-1} & -(B - AC^{-1}D)^{-1}AC^{-1} \end{bmatrix}. \tag{2.5}$$
Similarly, these two sets of formulae are used in different situations, and they are equivalent if both B and C are nonsingular.
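An analogous sketch for (2.4) follows (again ours; note the rectangular A and D, and that the result lands in the transposed partition):

```python
import numpy as np

def inv22_offdiag(A, B, C, D):
    """Inverse of R = [[A, B], [C, D]] via formula (2.4); assumes B and
    the Schur complement C - D B^{-1} A are nonsingular."""
    Bi = np.linalg.inv(B)
    Si = np.linalg.inv(C - D @ Bi @ A)   # (C - D B^{-1} A)^{-1}
    return np.block([[-Si @ D @ Bi,               Si],
                     [Bi + Bi @ A @ Si @ D @ Bi, -Bi @ A @ Si]])

rng = np.random.default_rng(1)
k, l = 3, 2                              # off-diagonal partition: B is k x k, C is l x l
R = rng.standard_normal((k + l, k + l))
A, B = R[:k, :l], R[:k, l:]              # A is k x l (rectangular), B is k x k
C, D = R[k:, :l], R[k:, l:]              # C is l x l, D is l x k
assert np.allclose(inv22_offdiag(A, B, C, D), np.linalg.inv(R))
```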
Finally, we consider the all-square partition; i.e., all blocks in R and $R^{-1}$ are square. In this case, R and $R^{-1}$ must be of even size. Since this partition can be regarded as both the square diagonal partition and the square off-diagonal partition, the previous two theorems are applicable, but under different assumptions. In some special cases, these formulae are identical:
• (2.2) and (2.4) are equivalent if A and B are invertible;
• (2.2) and (2.5) are equivalent if A and C are invertible;
• (2.3) and (2.4) are equivalent if B and D are invertible;
• (2.3) and (2.5) are equivalent if C and D are invertible.
We remark that for a nonsingular matrix R, it is possible that all its blocks A, B, C, and D are singular, so all the above formulae fail to compute $R^{-1}$. A typical example is the all-square partition of the permutation matrix
$$\begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix}.$$
In fact, all of its square diagonal partitions fail to have nonsingular A or D. However, the square off-diagonal partition with m = 1 and n = 3 can be applied to get its inverse. Therefore, different partitions give us more choices to find the desired inverse when other methods break down, and each one has its own value, as we shall see in the next two sections.
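This remark is easy to test numerically (our own illustration, reusing formula (2.4) for the off-diagonal partition):

```python
import numpy as np

P = np.array([[0., 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]])

# Every square diagonal partition fails: the leading k x k blocks (and the
# trailing diagonal blocks) are singular for k = 1, 2, 3.
assert all(abs(np.linalg.det(P[:k, :k])) < 1e-12 for k in (1, 2, 3))

# The square off-diagonal partition with m = 1, n = 3 works: B = I (3 x 3)
# and C = [1] (1 x 1) are invertible, so formula (2.4) applies.
A, B, C, D = P[:3, :1], P[:3, 1:], P[3:, :1], P[3:, 1:]
Bi = np.linalg.inv(B)
Si = np.linalg.inv(C - D @ Bi @ A)       # Schur complement of B
Pinv = np.block([[-Si @ D @ Bi,               Si],
                 [Bi + Bi @ A @ Si @ D @ Bi, -Bi @ A @ Si]])
assert np.allclose(Pinv, np.linalg.inv(P))
```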
3. BLOCK TRIANGULAR MATRICES
In this section, we apply our main theorems to 2 × 2 block diagonal and block triangular
matrices. First, we have the trivial consequence for the inverses of block diagonal and block
secondary diagonal matrices.
COROLLARY 3.1.
(i) For the square diagonal partition, $\begin{bmatrix} A & 0 \\ 0 & D \end{bmatrix}$ is invertible if and only if A and D are invertible, and
$$\begin{bmatrix} A & 0 \\ 0 & D \end{bmatrix}^{-1} = \begin{bmatrix} A^{-1} & 0 \\ 0 & D^{-1} \end{bmatrix}.$$
(ii) For the square off-diagonal partition, $\begin{bmatrix} 0 & B \\ C & 0 \end{bmatrix}$ is invertible if and only if B and C are invertible, and
$$\begin{bmatrix} 0 & B \\ C & 0 \end{bmatrix}^{-1} = \begin{bmatrix} 0 & C^{-1} \\ B^{-1} & 0 \end{bmatrix}.$$
Notice that the inverse of a block diagonal matrix is also block diagonal. Similarly, the inverse of a block secondary diagonal matrix is block secondary diagonal too, but in the transposed partition, so that there is a switch between B and C. This corollary is also easy to extend to n × n block diagonal and block secondary diagonal matrices.
In the rest of this section, we will study the inverses of block triangular matrices. By Theorems 2.1 and 2.2, we have the following corollary.

COROLLARY 3.2. Consider the block upper triangular matrix $\begin{bmatrix} A & B \\ 0 & D \end{bmatrix}$.
(i) For the square diagonal partition, it is invertible if and only if A and D are invertible, and it has inverse
$$\begin{bmatrix} A^{-1} & -A^{-1}BD^{-1} \\ 0 & D^{-1} \end{bmatrix}. \tag{3.1}$$
(ii) For the square off-diagonal partition with B nonsingular, it is invertible if and only if $DB^{-1}A$ is invertible, and it has inverse
$$\begin{bmatrix} (DB^{-1}A)^{-1}DB^{-1} & -(DB^{-1}A)^{-1} \\ B^{-1} - B^{-1}A(DB^{-1}A)^{-1}DB^{-1} & B^{-1}A(DB^{-1}A)^{-1} \end{bmatrix}. \tag{3.2}$$
Clearly, the inverse of a block upper triangular matrix is block upper triangular only in the
square diagonal partition. In general this is not true for the square off-diagonal partition. More-
over, if the partition is in fact an all-square partition and A, B, and D are all invertible, then (3.2)
is equivalent to (3.1).
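A minimal numerical check of (3.1) (ours; random blocks of arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, D = (rng.standard_normal(s) for s in [(3, 3), (3, 2), (2, 2)])
R = np.block([[A, B], [np.zeros((2, 3)), D]])

Ai, Di = np.linalg.inv(A), np.linalg.inv(D)
Rinv = np.block([[Ai, -Ai @ B @ Di],        # formula (3.1)
                 [np.zeros((2, 3)), Di]])
assert np.allclose(Rinv, np.linalg.inv(R))  # block upper triangular again
```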
Similarly, for the block lower triangular matrix in the square diagonal partition,
$$\begin{bmatrix} A & 0 \\ C & D \end{bmatrix}^{-1} = \begin{bmatrix} A^{-1} & 0 \\ -D^{-1}CA^{-1} & D^{-1} \end{bmatrix}.$$
For the square off-diagonal partition,
$$\begin{bmatrix} A & 0 \\ C & D \end{bmatrix}^{-1} = \begin{bmatrix} C^{-1}D(AC^{-1}D)^{-1} & C^{-1} - C^{-1}D(AC^{-1}D)^{-1}AC^{-1} \\ -(AC^{-1}D)^{-1} & (AC^{-1}D)^{-1}AC^{-1} \end{bmatrix}.$$
There are two more possibilities, namely, A = 0 or D = 0. For the former case, Theorems 2.1 and 2.2 are reduced to the following corollary.

COROLLARY 3.3. Consider the matrix $\begin{bmatrix} 0 & B \\ C & D \end{bmatrix}$.
(i) For the square off-diagonal partition, it is invertible if and only if B and C are also invertible, and it has inverse
$$\begin{bmatrix} -C^{-1}DB^{-1} & C^{-1} \\ B^{-1} & 0 \end{bmatrix}. \tag{3.3}$$
(ii) For the square diagonal partition with D nonsingular, it is invertible if and only if $BD^{-1}C$ is also invertible, and it has inverse
$$\begin{bmatrix} -(BD^{-1}C)^{-1} & (BD^{-1}C)^{-1}BD^{-1} \\ D^{-1}C(BD^{-1}C)^{-1} & D^{-1} - D^{-1}C(BD^{-1}C)^{-1}BD^{-1} \end{bmatrix}. \tag{3.4}$$
In the first part of the corollary, the inverse is still sparse, but with the zero block in the transposed position. In the second part, the sparsity is destroyed in general. Moreover, if the partition is in fact an all-square partition and B, C, and D are all invertible, then (3.4) is equivalent to (3.3).
Similar results hold for D = 0. For the square off-diagonal partition,
$$\begin{bmatrix} A & B \\ C & 0 \end{bmatrix}^{-1} = \begin{bmatrix} 0 & C^{-1} \\ B^{-1} & -B^{-1}AC^{-1} \end{bmatrix},$$
and for the square diagonal partition with A nonsingular,
$$\begin{bmatrix} A & B \\ C & 0 \end{bmatrix}^{-1} = \begin{bmatrix} A^{-1} - A^{-1}B(CA^{-1}B)^{-1}CA^{-1} & A^{-1}B(CA^{-1}B)^{-1} \\ (CA^{-1}B)^{-1}CA^{-1} & -(CA^{-1}B)^{-1} \end{bmatrix}.$$
4. STRUCTURED MATRICES
In this section, we will apply our main theorems to structured matrices, which include bisymmetric, Hamiltonian, Hankel, Toeplitz, circulant, Hermitian, per-Hermitian, and centro-Hermitian matrices, together with their skew counterparts.
The natural partition for a Hermitian or symmetric matrix is the square diagonal partition,
which preserves the symmetry of the diagonal blocks. On the contrary, the square off-diagonal
partition will, in general, spoil the symmetry of Hermitian matrices. However, Theorem 2.1 or
Theorem 2.2 is still applicable for a Hermitian matrix of even size in the all-square partition. In
summary, we have the following corollary.
COROLLARY 4.1.
(i) A matrix is Hermitian if and only if it has the form $\begin{bmatrix} A & B \\ B^* & D \end{bmatrix}$ in the square diagonal partition, where A and D are Hermitian. Its inverse can be computed by
$$\begin{bmatrix} A^{-1} + A^{-1}B(D - B^*A^{-1}B)^{-1}B^*A^{-1} & -A^{-1}B(D - B^*A^{-1}B)^{-1} \\ -(D - B^*A^{-1}B)^{-1}B^*A^{-1} & (D - B^*A^{-1}B)^{-1} \end{bmatrix} \tag{4.1}$$
or
$$\begin{bmatrix} (A - BD^{-1}B^*)^{-1} & -(A - BD^{-1}B^*)^{-1}BD^{-1} \\ -D^{-1}B^*(A - BD^{-1}B^*)^{-1} & D^{-1} + D^{-1}B^*(A - BD^{-1}B^*)^{-1}BD^{-1} \end{bmatrix}, \tag{4.2}$$
if the required inverses exist.
(ii) For a Hermitian matrix of even size in the all-square partition, the inverse can also be computed by (2.4) and (2.5) with $C = B^*$; we refer to the resulting two formulae as (4.3) and (4.4).

Similarly, the inverses $A^{-1}$, $(D - B^*A^{-1}B)^{-1}$, $D^{-1}$, and $(A - BD^{-1}B^*)^{-1}$ in Part (i) can also be calculated by the same partition method. Thus, we have a recursive method to obtain the inverses of Hermitian matrices, which is useful in practical and parallel computing.
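One way to realize this recursion is the following sketch (ours; the even split point is an arbitrary choice, and the recursion assumes every leading block and Schur complement encountered is nonsingular, as is the case for positive definite matrices):

```python
import numpy as np

def herm_inv(R):
    """Recursive inverse of a Hermitian matrix via formula (4.1)."""
    n = R.shape[0]
    if n == 1:
        return 1.0 / R
    k = n // 2
    A, B, D = R[:k, :k], R[:k, k:], R[k:, k:]
    Ai = herm_inv(A)                          # A is Hermitian: recurse
    Si = herm_inv(D - B.conj().T @ Ai @ B)    # so is the Schur complement
    F = -Ai @ B @ Si                          # upper-right block of the inverse
    return np.block([[Ai + Ai @ B @ Si @ B.conj().T @ Ai, F],
                     [F.conj().T,                          Si]])

rng = np.random.default_rng(3)
X = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))
R = X @ X.conj().T + 6 * np.eye(6)            # Hermitian positive definite
assert np.allclose(herm_inv(R), np.linalg.inv(R))
```

Note that only F is formed explicitly; the lower-left block is obtained as $F^*$, in line with the remark below.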
It is well known that the inverse $\begin{bmatrix} E & F \\ G & H \end{bmatrix}$ of a Hermitian matrix is also Hermitian. Hence, for the square diagonal partition, E and H are Hermitian, and $G = F^*$. This fact can also be checked easily from the above inverse formulae. So it suffices to compute only one of G and F, and half of the entries of E and H.
As a special case, we consider the positive definite matrix, which is automatically Hermitian [2, p. 397]. Let it be partitioned as $\begin{bmatrix} A & B \\ B^* & D \end{bmatrix}$; then its principal submatrices A and D are positive definite too, so both A and D are nonsingular. The inverse of such a matrix can be computed using the same formulae (4.1)-(4.4). Notice that its inverse matrix is positive definite, and so are the principal submatrices of this inverse. Therefore, the diagonal blocks in (4.1)-(4.4) are all positive definite. Horn and Johnson [2, p. 472] point out the same fact, but only for $(A - BD^{-1}B^*)^{-1}$ and $(D - B^*A^{-1}B)^{-1}$.
Now we turn to the skew-Hermitian or skew-symmetric matrix. The square diagonal partition is the right choice in order to preserve skew-symmetry. In fact, we have the following corollary.

COROLLARY 4.2. A matrix is skew-Hermitian if and only if it has the representation $\begin{bmatrix} A & B \\ -B^* & D \end{bmatrix}$ in the square diagonal partition, with A and D skew-Hermitian. Its inverse can be computed by (2.2) and (2.3). For the all-square partition, (2.4) and (2.5) can be used in addition.
It is easy to see that the derived inverse matrix is skew-Hermitian too. We remark that it
suffices to consider the inverse of a skew-symmetric matrix of even order, since a skew-symmetric
matrix of odd order must be singular and has no inverse.
A bisymmetric matrix is a real matrix of the form
$$\begin{bmatrix} A & B \\ -B^T & D \end{bmatrix},$$
such that its diagonal blocks A and D are symmetric negative semidefinite of the same size, and the remaining matrix $\begin{bmatrix} 0 & B \\ -B^T & 0 \end{bmatrix}$ is skew-symmetric. It is straightforward to show that a bisymmetric matrix is negative semidefinite as well. Such matrices occur in the linear complementarity problems of quadratic programming; for example, see [17].
Since a bisymmetric matrix is in the all-square partition, we can get its inverse $\begin{bmatrix} E & F \\ G & H \end{bmatrix}$ by (2.2)-(2.5). It is easy to show from these formulae that $G = -F^T$, and that E and H are symmetric. Since this bisymmetric matrix has an inverse, it must be negative definite, and so must its inverse and the corresponding principal submatrices E and H.
A matrix $R = (r_{ij})$ is called per-Hermitian if $r_{ij} = \bar{r}_{n+1-j,\,n+1-i}$ for all i and j [18]; in short, $R = JR^*J$, where J is the exchange matrix defined in Section 2. A real per-Hermitian matrix is called persymmetric, or secondary symmetric in [19,20]; its elements are symmetric with respect to the secondary diagonal. The natural partition for a per-Hermitian matrix is the square off-diagonal partition, which preserves the symmetry of the off-diagonal blocks. On the contrary, the square diagonal partition, except the all-square one, will in general spoil the symmetry of a per-Hermitian matrix.
COROLLARY 4.4. A matrix is per-Hermitian if and only if it has the form $\begin{bmatrix} A & B \\ C & JA^*J \end{bmatrix}$ in the square off-diagonal partition, where B and C are per-Hermitian. Its inverse can be computed by using (2.4) and (2.5). In particular, if its partition is the all-square one, then in addition both (2.2) and (2.3) can be used.
Similarly, the inverses
$$B^{-1}, \quad (C - JA^*JB^{-1}A)^{-1}, \quad C^{-1}, \quad\text{and}\quad (B - AC^{-1}JA^*J)^{-1}$$
in (2.4) and (2.5) can be calculated recursively by the same partition method.
It is trivial to see that the inverse $\begin{bmatrix} E & F \\ G & H \end{bmatrix}$ of a per-Hermitian matrix R is also per-Hermitian, since
$$R^{-1} = (JR^*J)^{-1} = J(R^{-1})^*J.$$
Hence, for the square off-diagonal partition, F and G are per-Hermitian, and $H = JE^*J$. This fact can also be checked easily from all its inverse formulae. So it suffices to compute only one of E and H, and half of the entries of F and G.
Similarly, a matrix R is called skew-per-Hermitian if $R = -JR^*J$ [18]. As with per-Hermitian matrices, the square off-diagonal partition is the right choice in order to preserve the skew-per-Hermitian structure of such matrices. It is also easy to see that the derived inverse matrix is skew-per-Hermitian too.
COROLLARY 4.5. A matrix is skew-per-Hermitian if and only if it has the form $\begin{bmatrix} A & B \\ C & -JA^*J \end{bmatrix}$ in the square off-diagonal partition, with B and C skew-per-Hermitian. Its inverse can be computed by (2.4) and (2.5). For such a matrix in the all-square partition, (2.2) and (2.3) can also be used.
A real skew-per-Hermitian matrix is usually called skew-persymmetric, or secondary skew-symmetric in [19,20]. We remark that every skew-persymmetric matrix of odd order is singular and has no inverse. This can be easily verified as follows. We first notice that the determinant of the matrix J is either 1 or -1. Then an n × n skew-persymmetric matrix R satisfies
$$\det R = (-1)^n \det J \cdot \det R^T \cdot \det J = (-1)^n \det R.$$
Hence, det R vanishes when n is odd.
A Hamiltonian matrix is a matrix of the form
$$\begin{bmatrix} A & B \\ C & -A^T \end{bmatrix}, \tag{4.6}$$
where B and C are symmetric of the same size. Such a matrix is related to the algebraic Riccati equation in control theory [21]. Since it is in the all-square partition, both Theorems 2.1 and 2.2 are applicable.
COROLLARY 4.6. The inverse of the Hamiltonian matrix (4.6) can be computed by (2.2)-(2.5).

Notice that (2.4) and (2.5) hold even when B and C are of different sizes. Let its inverse be $\begin{bmatrix} E & F \\ G & H \end{bmatrix}$. It is easy to show, from our inverse formulae, that F and G are symmetric as well and that $H = -E^T$; that is, the inverse is Hamiltonian too.
A matrix R is called centro-Hermitian if $R = J\bar{R}J$ [22,23]. For such matrices, we have the following corollary.

COROLLARY 4.7. A matrix is centro-Hermitian of even order if and only if it has the form $\begin{bmatrix} A & BJ \\ J\bar{B} & J\bar{A}J \end{bmatrix}$ in the all-square partition. Therefore, all formulae (2.2)-(2.5) can be used to compute its inverse.

Since
$$R^{-1} = J^{-1}\bar{R}^{-1}J^{-1} = J\,\overline{R^{-1}}\,J,$$
the inverse is centro-Hermitian too, and only two blocks of it need to be computed.

A real centro-Hermitian matrix is called centrosymmetric. An even-order centrosymmetric matrix $R = \begin{bmatrix} A & BJ \\ JB & JAJ \end{bmatrix}$ has the decomposition
$$R = K^{-1}\begin{bmatrix} A + B & 0 \\ 0 & A - B \end{bmatrix}K, \qquad K = \frac{1}{2}\begin{bmatrix} I & J \\ I & -J \end{bmatrix},$$
so that
$$R^{-1} = K^{-1}\begin{bmatrix} (A + B)^{-1} & 0 \\ 0 & (A - B)^{-1} \end{bmatrix}K,$$
in view of Corollary 3.1(i). This leads to the result obtained by Good [26].
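In NumPy, this reduction to two half-size inverses might look as follows (a sketch of ours realizing the decomposition above; J is the exchange matrix):

```python
import numpy as np

def centro_inv(R):
    """Inverse of an even-order centrosymmetric R = [[A, BJ], [JB, JAJ]]
    via R^{-1} = K^{-1} diag((A+B)^{-1}, (A-B)^{-1}) K."""
    m = R.shape[0] // 2
    J, I = np.fliplr(np.eye(m)), np.eye(m)
    A = R[:m, :m]
    B = R[:m, m:] @ J                           # upper-right block is B J
    K = 0.5 * np.block([[I, J], [I, -J]])
    Kinv = np.block([[I, I], [J, -J]])          # closed-form inverse of K
    Z = np.zeros((m, m))
    return Kinv @ np.block([[np.linalg.inv(A + B), Z],
                            [Z, np.linalg.inv(A - B)]]) @ K

rng = np.random.default_rng(4)
m = 3
J = np.fliplr(np.eye(m))
A, B = rng.standard_normal((m, m)), rng.standard_normal((m, m))
R = np.block([[A, B @ J], [J @ B, J @ A @ J]])  # centrosymmetric: J R J = R
assert np.allclose(centro_inv(R), np.linalg.inv(R))
```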
For a centrosymmetric matrix of odd order, we have a similar result [24,25]. Such a matrix can be represented as
$$R = \begin{bmatrix} A & x & BJ \\ y^T & r & y^TJ \\ JB & Jx & JAJ \end{bmatrix},$$
where A and B are square matrices of the same size, x and y are column vectors, and r is a scalar. Let
$$K = \frac{1}{2}\begin{bmatrix} I & 0 & J \\ 0 & 1 & 0 \\ I & 0 & -J \end{bmatrix};$$
then
$$R = K^{-1}\begin{bmatrix} A + B & 2x & 0 \\ y^T & r & 0 \\ 0 & 0 & A - B \end{bmatrix}K,$$
and, again by Corollary 3.1(i),
$$R^{-1} = K^{-1}\begin{bmatrix} \begin{bmatrix} A + B & 2x \\ y^T & r \end{bmatrix}^{-1} & 0 \\ 0 & (A - B)^{-1} \end{bmatrix}K.$$
The upper left inverse of the 2 × 2 block matrix can be calculated as before. To be more precise, according to Theorem 2.1(i), if $M = A + B$ is invertible and
$$t = r - 2y^TM^{-1}x \neq 0,$$
then
$$\begin{bmatrix} A + B & 2x \\ y^T & r \end{bmatrix}^{-1} = \frac{1}{t}\begin{bmatrix} tM^{-1} + 2M^{-1}xy^TM^{-1} & -2M^{-1}x \\ -y^TM^{-1} & 1 \end{bmatrix}.$$
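A corresponding sketch for the odd order (ours; K and its closed-form inverse are taken from the displays above):

```python
import numpy as np

def centro_inv_odd(R):
    """Inverse of an odd-order centrosymmetric matrix via the similarity
    R = K^{-1} M K with block diagonal M as displayed above."""
    m = R.shape[0] // 2
    J, I = np.fliplr(np.eye(m)), np.eye(m)
    A, x = R[:m, :m], R[:m, m:m + 1]
    yT, r = R[m:m + 1, :m], R[m:m + 1, m:m + 1]
    B = R[:m, m + 1:] @ J                        # upper-right block is B J
    top = np.block([[A + B, 2 * x], [yT, r]])    # (m+1) x (m+1) block of M
    Minv = np.block([[np.linalg.inv(top), np.zeros((m + 1, m))],
                     [np.zeros((m, m + 1)), np.linalg.inv(A - B)]])
    z, one = np.zeros((m, 1)), np.array([[1.0]])
    K = 0.5 * np.block([[I, z, J], [z.T, one, z.T], [I, z, -J]])
    Kinv = np.block([[I, z, I], [z.T, 2 * one, z.T], [J, z, -J]])
    return Kinv @ Minv @ K

rng = np.random.default_rng(5)
m = 3
J = np.fliplr(np.eye(m))
A, B = rng.standard_normal((m, m)), rng.standard_normal((m, m))
x, y = rng.standard_normal((m, 1)), rng.standard_normal((m, 1))
r = np.array([[rng.standard_normal()]])
R = np.block([[A, x, B @ J], [y.T, r, y.T @ J], [J @ B, J @ x, J @ A @ J]])
assert np.allclose(centro_inv_odd(R), np.linalg.inv(R))
```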
Similarly, a matrix R is skew-centro-Hermitian if $R = -J\bar{R}J$ [22,23]. For such matrices, we have the following corollary.
COROLLARY 4.8. A matrix is skew-centro-Hermitian of even order if and only if it has the form $\begin{bmatrix} A & BJ \\ -J\bar{B} & -J\bar{A}J \end{bmatrix}$ in the all-square partition. Therefore, (2.2)-(2.5) can be used to compute its inverse.

Since its inverse is skew-centro-Hermitian too, only two blocks of its inverse need to be computed.
A real skew-centro-Hermitian matrix is also called skew-centrosymmetric. An even-order skew-centrosymmetric matrix $R = \begin{bmatrix} A & BJ \\ -JB & -JAJ \end{bmatrix}$ has the following decomposition [24]:
$$R = K^{-1}\begin{bmatrix} 0 & A - B \\ A + B & 0 \end{bmatrix}K, \qquad\text{so that}\qquad R^{-1} = K^{-1}\begin{bmatrix} 0 & (A + B)^{-1} \\ (A - B)^{-1} & 0 \end{bmatrix}K,$$
with K as above. A skew-centrosymmetric matrix R of odd order n is always singular and has no inverse. This can be seen from
$$\det R = (-1)^n \det J \cdot \det R \cdot \det J = (-1)^n \det R,$$
which forces det R = 0 when n is odd.
Our formulae are also useful for other structured matrices. For example, all Hankel matrices are symmetric, and it is natural to use the square diagonal partition and Corollary 4.1 to compute their inverses. For a Hankel matrix of even order, the all-square partition is the best choice, with respect to which it has the form $\begin{bmatrix} A & B \\ B & D \end{bmatrix}$. In this case, all of (2.2)-(2.5) can be used. Even in the square off-diagonal partition, the off-diagonal blocks of every Hankel matrix are strongly related; in fact, one is a submatrix of the other.

Toeplitz and circulant matrices are automatically persymmetric, so the square off-diagonal partition and Corollary 4.4 are the first choice. For these two types of matrices of even order, the all-square partition is the best to use. In this case, every Toeplitz matrix has the form $\begin{bmatrix} A & B \\ C & A \end{bmatrix}$, and every circulant matrix has the form $\begin{bmatrix} A & B \\ B & A \end{bmatrix}$; see also Ray [27] for the inverse of a finite Toeplitz matrix.
5. COMPLETION PROBLEMS
In this final section, we give a brief introduction to a more general problem, i.e., the completion problem of a 2 × 2 block matrix and its inverse. This problem determines whether there exists a nonsingular matrix with some known entries such that its inverse has specified elements. To be more precise, supposing
$$\begin{bmatrix} A & B \\ C & D \end{bmatrix}^{-1} = \begin{bmatrix} E & F \\ G & H \end{bmatrix} \tag{5.1}$$
and four of the blocks A through H are given, our goal is to find all the other blocks in (5.1). When the four given blocks are all on the same side of (5.1), it becomes the simple problem of finding the inverse matrix, which we have done in Section 2. Thus, it is more interesting when the four given blocks are on different sides of (5.1): either three blocks on one side and one on the other, or exactly two blocks known on each side.
Many works can be found on this type of matrix completion problem. Fiedler and Markham [28] study the block completion problem (5.1) where A, B, C, and H are known. For the general transposed partition, they give necessary and sufficient conditions such that the problem has a solution. Hua [29] completely solves the same completion problem but with a symmetric assumption: given A, B, C, and H, find D, E, F, and G satisfying (5.1) such that both block matrices in (5.1) are symmetric. As a special case, he solves the same completion problem with the symmetric positive definite constraint.
Barrett et al. [30] consider (5.1) in a general transposed partition with A, D, F, and G known, and give several necessary and sufficient conditions such that it is solvable. This problem is more difficult since it is related to a quadratic matrix equation. Helton et al. solve this exact problem using the noncommutative software package NCAlgebra [14]. Assisted by the same package, Kronewitter solves a specific 3 × 3 block matrix completion problem and amazingly obtains 31,000 new theorems. The methodology, programs, and other applications can be found in [15] as well as on their website http://math.ucsd.edu/~ncalg.
It is very probable that all the formulae in this paper can be proved by computer techniques, independent of traditional human manipulation. One can assert that proper use of a modern computer will dramatically increase the power of theorem proving. With this final remark, we would like to encourage further exploration of this subject.
REFERENCES
1. F.A. Graybill, Matrices with Applications in Statistics, 2nd Edition, Wadsworth, Belmont, CA, (1983).
2. R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, (1985).
3. B. Noble and J.W. Daniel, Applied Linear Algebra, 3rd Edition, Prentice-Hall, Englewood Cliffs, NJ, (1988).
4. J.W. Blattner, Border matrices, J. Soc. Indust. Appl. Math. 10, 528-536, (1962).
5. K. Nomakuchi, On the characterization of generalized inverses by bordered matrices, Linear Algebra Appl. 33, 1-8, (1980).
6. F.J. Hall, The Moore-Penrose inverse of particular bordered matrices, J. Austral. Math. Soc. Ser. A 27, 467-478, (1979).
7. J.M. Miao, General expressions for the Moore-Penrose inverse of a 2 × 2 block matrix, Linear Algebra Appl.
151, 1-15, (1991).
8. C.R. Rao and H. Yanai, Generalized inverses of partitioned matrices useful in statistical applications, Linear
Algebra Appl. 70, 105-113, (1985).
9. D.H. Carlson, What are Schur complements, anyway?, Linear Algebra Appl. 74, 257-275, (1986).
10. I.N. Imam, The Schur complement and the inverse M-matrix problem, Linear Algebra Appl. 62, 235-240,
(1984).
11. D.V. Ouellette, Schur complements and statistics, Linear Algebra Appl. 36, 187-295, (1981).
12. V.F. Lazutkin, The signature of invertible symmetric matrices, Math. Notes 44, 592-595, (1988).
13. R.B. Bapat and M.K. Kwong, A generalization of A ∘ A⁻¹ ≥ I, Linear Algebra Appl. 93, 107-112, (1987).
14. J.W. Helton, M. Stankus and D. Kronewitter, NCAlgebra, Noncommuting Algebra Software, http://math.ucsd.edu/~ncalg, University of California at San Diego.
15. J.W. Helton and M. Stankus, Computer assistance for "discovering" formulas in system engineering and
operator theory, Journal of Functional Analysis 161 (2), 289-363, (1999).
16. P. Lancaster and M. Tismenetsky, The Theory of Matrices, 2nd Edition, Academic Press, San Diego, (1985).
17. E. Klafszky and T. Terlaky, Some generalizations of the criss-cross method for quadratic programming,
Optimization 24, 127-139, (1992).
18. R.D. Hill, R.G. Bates and S.R. Waters, On perhermitian matrices, SIAM J. Matrix Anal. Appl. 11, 173-179,
(1990).
19. A. Lee, Secondary symmetric, skewsymmetric and orthogonal matrices, Periodica Mathematica Hungarica
7, 63-70, (1976).
20. A. Lee, On S-symmetric, S-skewsymmetric and S-orthogonal matrices, Periodica Mathematica Hungarica 7,
71-76, (1976).
21. P. Benner and H. Faßbender, An implicitly restarted symplectic Lanczos method for the Hamiltonian eigenvalue problem, Linear Algebra Appl. 268, 75-111, (1997).
22. R.D. Hill, R.G. Bates and S.R. Waters, On centrohermitian matrices, SIAM J. Matrix Anal. Appl. 11,
128-133, (1990).
23. A. Lee, Centrohermitian and skew-centrohermitian matrix, Linear Algebra Appl. 29, 205-210, (1980).
24. A.R. Collar, On centrosymmetric and centroskew matrices, Quart. J. Mech. and Applied Math. XV, 265-281, (1962).
25. J.R. Weaver, Centrosymmetric (cross-symmetric) matrices, their basic properties, eigenvalues and eigenvec-
tors, Amer. Math. Monthly 92, 711-717, (1985).
26. I.J. Good, The inverse of a centrosymmetric matrix, Technometrics 12, 925-928, (1970).
27. W.D. Ray, The inverse of a finite Toeplitz matrix, Technometrics 12, 153-156, (1970).
28. M. Fiedler and T.L. Markham, Completing a matrix when certain entries of its inverse are specified, Linear Algebra Appl. 74, 225-237, (1986).
29. D. Hua, Completing a symmetric 2 x 2 block matrix and its inverse, Linear Algebra Appl. 235, 235-245,
(1996).
30. W.W. Barrett, M.E. Lundquist, C.R. Johnson and H.J. Woerdeman, Completing a block diagonal matrix
with a partial prescribed inverse, Linear Algebra Appl. 223/224, 73-87, (1995).