
Orthogonal, Antiorthogonal and Self-Orthogonal

Matrices and their Codes


James L. Massey
Signal & Information Processing Laboratory
Swiss Federal Institute of Technology
CH-8092 Zurich, Switzerland
Abstract

Orthogonal matrices over arbitrary fields are defined together with their non-square analogs, which are termed row-orthogonal matrices. Antiorthogonal and self-orthogonal square matrices are introduced together with their non-square analogs. The relationships of these matrices to such codes as self-dual codes and linear codes with complementary duals are given. These relationships are used to obtain some constructions of linear codes with complementary duals.

1 Introduction

The aim of this paper is to define a number of different types of matrices over an arbitrary field that are similar to the familiar orthogonal matrices over the real field, or to natural extensions of orthogonal matrices, and then to show the relationships between these matrices and some familiar linear codes. In particular, these relationships are used to obtain some new constructions of linear codes with complementary duals.

2 Orthogonal Matrices

Let F^n denote the vector space of n-tuples (or row vectors) with components in an arbitrary field F. The scalar product of the vectors u and v is the field element uv^T, where here and hereafter the superscript T denotes transposition. The vectors u and v are said to be orthogonal when uv^T = 0.
A square matrix A over F is said to be orthogonal if

    AA^T = I,

where here and hereafter I denotes an identity matrix of appropriate dimension. Equivalently, A is orthogonal just when each row of A is orthogonal to every other row of A but has a scalar product of 1 with itself.

Note that A is orthogonal just when A^T = A^{-1}, and hence A^T A = I, so that A^T is also orthogonal. An orthogonal matrix is not only nonsingular but always has a determinant that is either +1 or -1 because 1 = det(I) = det(AA^T) = det(A) det(A^T) = (det(A))^2. An example of an orthogonal matrix over the finite field GF(2) is

        [ 0 1 1 1 ]
    A = [ 1 0 1 1 ]
        [ 1 1 0 1 ]
        [ 1 1 1 0 ].
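As a sanity check (our own illustration, not part of the paper), this orthogonality is easy to verify mechanically; the sketch below assumes Python with numpy for the matrix arithmetic.

```python
import numpy as np

# The 4x4 example matrix over GF(2): ones everywhere except the diagonal.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]])

# Orthogonality over GF(2): form A A^T and reduce every entry mod 2.
# Each diagonal entry is 3 = 1 (mod 2); each off-diagonal entry is 2 = 0.
gram = (A @ A.T) % 2
print(gram)  # prints the 4x4 identity matrix
```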
Orthogonal matrices over the field R of real numbers are of great importance in the theory of isometries of R^n, cf. [1].
It seems natural, for an in general non-square matrix A, to say that A is row-orthogonal if

    AA^T = I,

as this is equivalent to the condition that each row of A is orthogonal to every other row of A but has a scalar product of 1 with itself. A row-orthogonal matrix always has full row rank and thus must have at least as many columns as rows. If A is row-orthogonal but nonsquare, then A^T cannot have full row rank and thus cannot also be row-orthogonal. Deleting rows of an orthogonal matrix gives a row-orthogonal matrix, but not every row-orthogonal matrix can be so constructed. For instance, over the field GF(2), the matrix A = [1 1 1] is trivially row-orthogonal but there is no orthogonal matrix having [1 1 1] as a row.
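The row-orthogonality condition can likewise be tested with a small helper (our own sketch; the function name is our invention):

```python
import numpy as np

def is_row_orthogonal(M, q):
    """Check M M^T == I over GF(q), for a prime q (a sketch; assumes q prime)."""
    M = np.asarray(M)
    return ((M @ M.T) % q == np.eye(M.shape[0], dtype=int)).all()

# [1 1 1] over GF(2): the single row has self scalar product 3 = 1 (mod 2).
assert is_row_orthogonal([[1, 1, 1]], 2)

# Deleting a row of the orthogonal 4x4 matrix A of this section leaves a
# row-orthogonal 3x4 matrix, as the text observes.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]])
assert is_row_orthogonal(A[:3], 2)
```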

3 Antiorthogonal Matrices

One of the most interesting concepts introduced by P. G. Farrell, in whose honor this paper is written, is that of an anticode [2]. A "code" is usually designed to have a large minimum distance between its codewords. Because the opposite of "large minimum distance" is surely "small maximum distance," it was natural for Farrell to use the term anticode to describe a set of n-tuples designed to have small maximum distance between its "codewords". [We note that sometimes anticodes are defined in such a manner that their "codewords" are all the n-tuples formed by linear combinations of the rows of some matrix, which need not have full row rank, so that the "codewords" need not all be different.] The concept of an anticode has found numerous applications both in coding theory and in combinatorics, cf. pp. 548-556 in [3].
Inspired by Farrell's creative terminology, we seek to define an "antiorthogonal matrix" in an appropriate way. Because the opposite of I is surely -I [at least if we overlook fields of characteristic 2, for which I = -I], it seems natural to call a square matrix B antiorthogonal if

    BB^T = -I,

i.e., if the rows of B are pairwise orthogonal but each row has a scalar product of -1 with itself. It follows that B is antiorthogonal if and only

if B^{-1} = -B^T, and thus B^T B = -I, so that B^T is also antiorthogonal. An example of an antiorthogonal matrix over GF(3) is

    B = [ 1 1 ]
        [ 1 2 ].

In a field of characteristic 2, and only in such a field, -1 = 1, so that an antiorthogonal matrix is also an orthogonal matrix. Over the real field, the scalar product of a vector with itself is nonnegative, which implies that no antiorthogonal real matrices exist. However, if A is an orthogonal matrix and i is the imaginary unit, then the complex matrix B = iA is antiorthogonal.
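Again as our own sketch, the antiorthogonality of the GF(3) example can be checked in a few lines, recalling that -1 = 2 in GF(3):

```python
import numpy as np

# The 2x2 example over GF(3).
B = np.array([[1, 1],
              [1, 2]])

# Antiorthogonality: B B^T = -I, which over GF(3) is 2*I.
# Row self products: 1+1 = 2 and 1+4 = 5 = 2 (mod 3); cross product: 1+2 = 0.
gram = (B @ B.T) % 3
print(gram)  # 2*I, i.e. -I over GF(3)
```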
We now relate antiorthogonal matrices to codes. We first recall that a q-ary code [i.e., a code in which the components of codewords lie in GF(q)] with q^k codewords is systematic if it possesses an information set, i.e., if there is a set of k coordinates such that no two distinct codewords have components that agree in all k of these coordinates. By a permutation of coordinates, which does not affect the Hamming distance between codewords, one obtains an equivalent code for which the first k coordinates are an information set; we shall call such a code a leading-systematic code. Every linear code is systematic and hence equivalent to a leading-systematic linear code. Moreover, a linear code is leading-systematic if and only if it has a generator matrix of the form G = [I : P]; this generator matrix is easily seen to be unique and is called the systematic generator matrix of the code. We recall further that a linear code V is said to be self-dual if V = V^⊥, where V^⊥ is the dual code of V. If the code length is n, then the dimension of a self-dual code must be k = n/2, so that n must be even. We can now give a very simple, but apparently not previously stated, characterization of self-dual codes.
Proposition 1 A leading-systematic linear code V is self-dual if and only if, in its systematic generator matrix

    G = [I : P],

the matrix P is antiorthogonal.

Proof: Because the code length of a self-dual code satisfies n = 2k, where k is the code dimension, the matrix P must be square. Moreover, V will be self-dual just when G is also a parity-check matrix of the code, i.e., when GG^T = 0. But GG^T = I + PP^T, so that V is self-dual just when PP^T = -I, as was to be shown.
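Proposition 1 can be made concrete (the specific [4, 2] ternary example is our own, built from the antiorthogonal GF(3) matrix of Section 3):

```python
import numpy as np

q = 3
P = np.array([[1, 1],
              [1, 2]])                      # antiorthogonal over GF(3)
G = np.hstack([np.eye(2, dtype=int), P])    # systematic generator [I : P]

# P antiorthogonal: P P^T = -I, i.e. 2*I over GF(3) ...
assert ((P @ P.T) % q == (q - 1) * np.eye(2, dtype=int)).all()
# ... hence G G^T = I + P P^T = 0, i.e. the [4, 2] ternary code is self-dual.
assert ((G @ G.T) % q == 0).all()
```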
It seems entirely natural, for an in general nonsquare matrix B, to say that B is row-antiorthogonal if

    BB^T = -I,

as this is equivalent to the condition that each row of B is orthogonal to every other row of B but has a scalar product of -1 with itself. A row-antiorthogonal matrix always has full row rank and thus must have at least as many columns as rows. If B is row-antiorthogonal but nonsquare, B^T cannot also be row-antiorthogonal. In a field of characteristic 2, and only in such a field, a row-antiorthogonal matrix is also a row-orthogonal matrix. Deleting rows of an antiorthogonal matrix gives a row-antiorthogonal matrix, but not every row-antiorthogonal matrix can be so constructed, as our previous example of the binary matrix B = [1 1 1] demonstrates.
Recalling that a linear code V is said to be weakly self-dual if V ⊆ V^⊥, we obtain a simple generalization of Proposition 1.

Proposition 2 A leading-systematic linear code V is weakly self-dual if and only if, in its systematic generator matrix

    G = [I : P],

the matrix P is row-antiorthogonal.

Proof: The code V will be weakly self-dual just when the row space of G is a subset of V^⊥, i.e., when GG^T = 0. But GG^T = I + PP^T, so that V is weakly self-dual just when PP^T = -I, i.e., when P is row-antiorthogonal.
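Proposition 2 admits a minimal illustration (our own): over GF(2) the 1 × 3 matrix P = [1 1 1] is row-antiorthogonal, and [I : P] generates the length-4 binary repetition code, which is weakly self-dual.

```python
import numpy as np

q = 2
P = np.array([[1, 1, 1]])                   # row-antiorthogonal: 3 = -1 (mod 2)
G = np.hstack([np.eye(1, dtype=int), P])    # [I : P] = [1 1 1 1]

# G G^T = I + P P^T = 1 + 3 = 4 = 0 (mod 2): the length-4 repetition code
# is contained in its own dual, i.e. it is weakly self-dual.
assert ((G @ G.T) % q == 0).all()
```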

4 Self-Orthogonal Matrices

It seems a natural extension of terminology to say that a square matrix C over an arbitrary field F is self-orthogonal if

    CC^T = O,

where here and hereafter O denotes a zero matrix of appropriate dimension. Equivalently, C is self-orthogonal just when each row of C is orthogonal to every row of C, including itself. It follows from CC^T = O that det(C) = 0, and hence that a self-orthogonal matrix is always singular. An example of a self-orthogonal matrix over the field GF(2) is

        [ 1 1 1 1 ]
    C = [ 1 1 1 1 ]
        [ 1 1 0 0 ]
        [ 0 0 1 1 ].

Note that

            [ 1 1 0 0 ]
    C^T C = [ 1 1 0 0 ]
            [ 0 0 1 1 ]
            [ 0 0 1 1 ],

so that C^T is not self-orthogonal in this example.
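Both facts, CC^T = O and C^T C ≠ O, can be checked mechanically (our own sketch, with the example matrix as we read it):

```python
import numpy as np

C = np.array([[1, 1, 1, 1],
              [1, 1, 1, 1],
              [1, 1, 0, 0],
              [0, 0, 1, 1]])

# C is self-orthogonal over GF(2): every pair of rows, including a row
# with itself, has an even scalar product.
assert ((C @ C.T) % 2 == 0).all()
# ... but C^T is not self-orthogonal: C^T C is nonzero mod 2.
assert ((C.T @ C) % 2 != 0).any()
```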
We are now virtually forced to say, for an in general non-square matrix C, that C is row-self-orthogonal if

    CC^T = O,

as this is equivalent to the condition that each row of C is orthogonal to every row of C, including itself. A row-self-orthogonal matrix which is not square (and hence not also a self-orthogonal matrix) can have full row rank. Indeed, any matrix obtained by deleting rows from a self-orthogonal matrix is row-self-orthogonal, so that deleting the second row from the above-displayed self-orthogonal matrix C gives a 3 × 4 matrix that is row-self-orthogonal and has full row rank.
The proofs of Propositions 1 and 2 imply the following alternative characterization of self-dual and weakly self-dual codes.

Proposition 3 A linear code V with generator matrix G is self-dual or weakly self-dual if and only if G is self-orthogonal or row-self-orthogonal, respectively.

5 Applications to LCD Codes

We now show some connections between the above-defined matrices and linear codes with complementary duals (or LCD codes for short). An LCD code is a linear code V such that V ∩ V^⊥ = {0}. The reader is referred to [4] for proofs of the basic properties of LCD codes, including the fact that if G is a generator matrix for a linear code V, then V is an LCD code if and only if GG^T is a nonsingular matrix.
We now show a first connection between LCD codes and the above-defined matrices.

Proposition 4 A leading-systematic linear code V is an LCD code if (but not only if), in its systematic generator matrix

    G = [I : P],

the matrix P is row-self-orthogonal or, equivalently, if G is row-orthogonal.

Proof: Because GG^T = I + PP^T, it follows that G is row-orthogonal just when P is row-self-orthogonal. Moreover, if P is row-self-orthogonal, then GG^T = I, so that V is indeed an LCD code.
As an application of Proposition 4, we first note that, for any k × m matrix Q over a field of characteristic 2, the k × 2m matrix P = [Q : Q] is row-self-orthogonal. Thus G = [I : Q : Q] generates a leading-systematic LCD code of length n = k + 2m and dimension k. In fact, these are the codes used in Proposition 2 of [4] to establish the asymptotic goodness of LCD codes over a finite field of characteristic 2.
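A sketch of this characteristic-2 construction (the particular 2 × 3 matrix Q is our own arbitrary choice):

```python
import numpy as np

q, k = 2, 2
Q = np.array([[1, 0, 1],
              [0, 1, 1]])                   # arbitrary k x m matrix, m = 3

P = np.hstack([Q, Q])                       # P = [Q : Q]: P P^T = 2 Q Q^T = O mod 2
G = np.hstack([np.eye(k, dtype=int), P])    # G = [I : Q : Q]

# G G^T = I + P P^T = I is nonsingular, so the code is LCD.
assert ((G @ G.T) % q == np.eye(k, dtype=int)).all()
```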
More generally, if Q is any k × m matrix over a field of characteristic p such that -1 is a quadratic residue modulo p, i.e., such that there exists an α in GF(p) for which α^2 = -1, then P = [Q : αQ] is row-self-orthogonal and hence G = [I : Q : αQ] generates a leading-systematic LCD code of length n = k + 2m and dimension k. A theorem of Lagrange (cf. p. 302 in [5]) implies that, for any prime p, one can find elements α, β, γ and δ in GF(p), not all zero, such that α^2 + β^2 + γ^2 + δ^2 = 0 [one writes the prime p itself as a sum of four integer squares and reduces modulo p]. The corresponding matrix P = [αQ : βQ : γQ : δQ] is thus row-self-orthogonal. Hence G = [I : αQ : βQ : γQ : δQ] generates a leading-systematic LCD code of length n = k + 4m and dimension k. These are the codes used in [4] to establish the asymptotic goodness of LCD codes over an arbitrary finite field.
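A sketch of the quadratic-residue construction over GF(5), where α = 2 satisfies α^2 = 4 = -1 (the particular Q is again our own arbitrary choice):

```python
import numpy as np

p, alpha, k = 5, 2, 2
Q = np.array([[1, 2, 3],
              [4, 0, 1]])                   # arbitrary k x m matrix, m = 3

P = np.hstack([Q, alpha * Q]) % p           # P = [Q : alpha*Q]
G = np.hstack([np.eye(k, dtype=int), P])

# P P^T = (1 + alpha^2) Q Q^T = 5 Q Q^T = O (mod 5), so G G^T = I: LCD.
assert ((P @ P.T) % p == 0).all()
assert ((G @ G.T) % p == np.eye(k, dtype=int)).all()
```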
A stronger consequence of Proposition 4, in the same vein as the previous examples, is the following.

Proposition 5 If B is any m × m antiorthogonal matrix and Q is any k × m matrix, then

    G = [I : Q : QB]

is the generator matrix of a leading-systematic LCD code of length n = k + 2m and dimension k.

Proof: The proposition follows immediately from Proposition 4 upon noting that P = [Q : QB] satisfies PP^T = QQ^T + QBB^T Q^T = QQ^T - QQ^T = O, so that P is indeed a row-self-orthogonal matrix.
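Proposition 5 in action (our own sketch; B is the antiorthogonal GF(3) matrix from Section 3 and Q is an arbitrary choice):

```python
import numpy as np

p, k, m = 3, 2, 2
B = np.array([[1, 1],
              [1, 2]])                      # antiorthogonal over GF(3)
Q = np.array([[1, 2],
              [0, 1]])                      # arbitrary k x m matrix

# G = [I : Q : QB]
G = np.hstack([np.eye(k, dtype=int), Q, (Q @ B) % p])

# P = [Q : QB] satisfies P P^T = Q Q^T + Q B B^T Q^T = Q Q^T - Q Q^T = O,
# so G G^T = I and the [k + 2m, k] code is LCD.
assert ((G @ G.T) % p == np.eye(k, dtype=int)).all()
```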
The class of codes defined in Proposition 5 is rich enough to meet the asymptotic Varshamov-Gilbert bound, as even a crude lower bound on the number of orthogonal matrices suffices to establish, but we omit the details of this argument here.
The following is another consequence of Proposition 4.

Proposition 6 If Q is any k × k matrix, C is any k × m row-self-orthogonal matrix, and A is any m × m orthogonal matrix, then

    G = [I : QCA]

is the generator matrix of a leading-systematic LCD code of length n = k + m and dimension k. The same holds true if A is any m × m antiorthogonal matrix.

Proof: Letting P = QCA, we have PP^T = QCAA^T C^T Q^T = QCC^T Q^T = O, so that P is indeed row-self-orthogonal.
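Proposition 6 can likewise be exercised (our own sketch: C is the Section 4 example with its second row deleted, A is the orthogonal GF(2) matrix from Section 2, and Q is an arbitrary choice):

```python
import numpy as np

p, k, m = 2, 3, 4
Q = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])                   # arbitrary k x k matrix
C = np.array([[1, 1, 1, 1],                 # k x m row-self-orthogonal matrix
              [1, 1, 0, 0],
              [0, 0, 1, 1]])
A = np.array([[0, 1, 1, 1],                 # m x m orthogonal matrix over GF(2)
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]])

G = np.hstack([np.eye(k, dtype=int), (Q @ C @ A) % p])  # G = [I : QCA]

# P = QCA gives P P^T = Q C A A^T C^T Q^T = Q C C^T Q^T = O (mod 2),
# so G G^T = I and the [k + m, k] code is LCD.
assert ((G @ G.T) % p == np.eye(k, dtype=int)).all()
```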

6 Closing Remarks

It surpised us somewhat in the course of the work described here that we


were able to characterize both self-dual codes and LCD codes so cleanly in
terms of row-orthogonal matrices, row-antiorthogonal matrices and rowself-orthogonal matrices. Whether this formulation will prove to be of
real use remains to be seen. In [4], it was shown that the complexity of
nearest-codeword decoding of an LCD codes is essentially the complexity
of the problem, when given a codeword in the dual code V ? , of nding
the nearest codeword in the code V . Our suspicion is that the matrix
formulations of LCD codes given above may be useful in attacking this
problem, and we hope that dexterous coding theorists such as P. G. Farrell will take a crack at con rming our suspicion or showing that it is
unfounded..

References

[1] G. A. Jones, "Symmetry," in Handbook of Applicable Mathematics, Vol. 5, Combinatorics and Geometry (Eds. W. Ledermann and S. Vajda). Chichester and New York: Wiley, 1985, pp. 329-422.
[2] P. G. Farrell, "Linear Binary Anticodes," Electronics Letters, Vol. 6, pp. 419-421, 1970.
[3] F. J. MacWilliams and N. J. A. Sloane, The Theory of Error-Correcting Codes. Amsterdam: North-Holland, 1977.
[4] J. L. Massey, "Linear Codes with Complementary Duals," Discrete Math., Vol. 106/107, pp. 337-342, 1992. [Also appears as pp. 337-342 in A Collection of Contributions in Honor of Jack van Lint (Eds. P. J. Cameron and H. C. A. van Tilborg), Topics in Discrete Math. 7. Amsterdam: Elsevier, 1992.]
[5] G. H. Hardy and E. M. Wright, An Introduction to the Theory of Numbers, 4th Ed. London: Oxford Univ. Press, 1965.
