Singleton bound

Short description: Upper bound in coding theory

In coding theory, the Singleton bound, named after Richard Collom Singleton, is a relatively crude upper bound on the size of an arbitrary block code [math]\displaystyle{ C }[/math] with block length [math]\displaystyle{ n }[/math], size [math]\displaystyle{ M }[/math] and minimum distance [math]\displaystyle{ d }[/math]. It is also known as the Joshibound,[1] having been proved by (Joshi 1958) and even earlier by (Komamiya 1953).

Statement of the bound

The minimum distance of a set [math]\displaystyle{ C }[/math] of codewords of length [math]\displaystyle{ n }[/math] is defined as [math]\displaystyle{ d = \min_{\{x,y \in C : x \neq y\}} d(x,y) }[/math] where [math]\displaystyle{ d(x,y) }[/math] is the Hamming distance between [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math]. The expression [math]\displaystyle{ A_{q}(n,d) }[/math] represents the maximum number of possible codewords in a [math]\displaystyle{ q }[/math]-ary block code of length [math]\displaystyle{ n }[/math] and minimum distance [math]\displaystyle{ d }[/math].

Then the Singleton bound states that [math]\displaystyle{ A_q(n,d) \leq q^{n-d+1}. }[/math]
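
For a concrete sense of how crude the bound can be, the following Python sketch (a minimal illustration; the function name singleton_bound is not standard) evaluates [math]\displaystyle{ q^{n-d+1} }[/math] for binary codes of length 7 and minimum distance 3, and compares the result with the [7,4,3] Hamming code, which has only [math]\displaystyle{ 2^4 = 16 }[/math] codewords.

    def singleton_bound(q, n, d):
        """Upper bound q^(n-d+1) on A_q(n, d), the maximum size of a
        q-ary code of length n with minimum distance d."""
        return q ** (n - d + 1)

    # Binary codes of length 7 and minimum distance 3: the bound gives 32,
    # while the [7,4,3] Hamming code has only 2^4 = 16 codewords.
    print(singleton_bound(2, 7, 3))  # 32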

Proof

First observe that the number of [math]\displaystyle{ q }[/math]-ary words of length [math]\displaystyle{ n }[/math] is [math]\displaystyle{ q^n }[/math], since each letter in such a word may take one of [math]\displaystyle{ q }[/math] different values, independently of the remaining letters.

Now let [math]\displaystyle{ C }[/math] be an arbitrary [math]\displaystyle{ q }[/math]-ary block code of minimum distance [math]\displaystyle{ d }[/math]. Clearly, all codewords [math]\displaystyle{ c \in C }[/math] are distinct. If we puncture the code by deleting the first [math]\displaystyle{ d-1 }[/math] letters of each codeword, then all resulting codewords must still be pairwise distinct: two codewords that agreed on the remaining [math]\displaystyle{ n-d+1 }[/math] positions would differ in at most [math]\displaystyle{ d-1 }[/math] positions, contradicting the fact that the original codewords in [math]\displaystyle{ C }[/math] have Hamming distance at least [math]\displaystyle{ d }[/math] from each other. Thus the size of the altered code is the same as that of the original code.

The newly obtained codewords each have length [math]\displaystyle{ n-(d-1) = n-d+1, }[/math] and thus, there can be at most [math]\displaystyle{ q^{n-d+1} }[/math] of them. Since [math]\displaystyle{ C }[/math] was arbitrary, this bound must hold for the largest possible code with these parameters, thus:[2] [math]\displaystyle{ |C| \le A_q(n,d) \leq q^{n-d+1}. }[/math]
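
The puncturing argument can be illustrated on a toy example. The sketch below (the particular code and helper names are chosen only for illustration) takes the binary single parity check code of length 3, which has minimum distance 2, deletes the first [math]\displaystyle{ d-1 = 1 }[/math] letter of every codeword, and confirms that the punctured words remain pairwise distinct.

    def hamming_distance(x, y):
        return sum(a != b for a, b in zip(x, y))

    def min_distance(code):
        return min(hamming_distance(x, y) for x in code for y in code if x != y)

    # Toy example: the binary single parity check code of length 3.
    code = ['000', '011', '101', '110']
    n, d = len(code[0]), min_distance(code)      # n = 3, d = 2

    # Puncture: delete the first d - 1 letters of every codeword.
    punctured = {c[d - 1:] for c in code}

    # The punctured words are still pairwise distinct, hence |C| <= q^(n-d+1).
    assert len(punctured) == len(code)
    print(len(code), '<=', 2 ** (n - d + 1))     # 4 <= 4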

Linear codes

If [math]\displaystyle{ C }[/math] is a linear code with block length [math]\displaystyle{ n }[/math], dimension [math]\displaystyle{ k }[/math] and minimum distance [math]\displaystyle{ d }[/math] over the finite field with [math]\displaystyle{ q }[/math] elements, then the maximum number of codewords is [math]\displaystyle{ q^k }[/math] and the Singleton bound implies: [math]\displaystyle{ q^k \leq q^{n-d+1}, }[/math] so that [math]\displaystyle{ k \leq n - d + 1, }[/math] which is usually written as[3] [math]\displaystyle{ d \leq n - k + 1. }[/math]

In the linear code case a different proof of the Singleton bound can be obtained by observing that the rank of the parity check matrix is [math]\displaystyle{ n - k }[/math].[4] Another simple proof follows from observing that the rows of any generator matrix in standard form have weight at most [math]\displaystyle{ n - k + 1 }[/math].
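
As a small check of [math]\displaystyle{ d \leq n - k + 1 }[/math], the following brute-force sketch (using one standard-form generator matrix of the [7,4,3] binary Hamming code purely as an example) enumerates all codewords of a small binary linear code, relies on the fact that the minimum distance of a linear code equals its minimum nonzero weight, and compares the result with [math]\displaystyle{ n - k + 1 }[/math].

    from itertools import product

    # A standard-form generator matrix (I | A) of the [7,4,3] binary Hamming code.
    G = [[1, 0, 0, 0, 0, 1, 1],
         [0, 1, 0, 0, 1, 0, 1],
         [0, 0, 1, 0, 1, 1, 0],
         [0, 0, 0, 1, 1, 1, 1]]
    n, k = len(G[0]), len(G)

    def encode(msg):
        # Codeword = msg * G over GF(2).
        return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

    # For a linear code, the minimum distance equals the minimum nonzero weight.
    d = min(sum(encode(m)) for m in product([0, 1], repeat=k) if any(m))
    print(d, '<=', n - k + 1)   # 3 <= 4, consistent with the Singleton bound

    # Note also that each row of G has weight at most n - k + 1 = 4,
    # matching the generator-matrix argument above.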

History

The usual citation given for this result is (Singleton 1964), but it was proven earlier by (Joshi 1958). Joshi notes that the result was obtained earlier by (Komamiya 1953) using a more complex proof. (Welsh 1988) also notes the same regarding (Komamiya 1953).

MDS codes

Linear block codes that achieve equality in the Singleton bound are called MDS (maximum distance separable) codes. Examples of such codes include codes that have only [math]\displaystyle{ q }[/math] codewords (the all-[math]\displaystyle{ x }[/math] words for [math]\displaystyle{ x\in\mathbb{F}_q }[/math], thus having minimum distance [math]\displaystyle{ n }[/math]), codes that use the whole of [math]\displaystyle{ (\mathbb{F}_{q})^{n} }[/math] (minimum distance 1), codes with a single parity symbol (minimum distance 2), and their dual codes. These are often called trivial MDS codes.

In the case of binary alphabets, only trivial MDS codes exist.[5][6]

Examples of non-trivial MDS codes include Reed-Solomon codes and their extended versions.[7][8]

MDS codes are an important class of block codes since, for a fixed [math]\displaystyle{ n }[/math] and [math]\displaystyle{ k }[/math], they have the greatest error-correcting and error-detecting capabilities. There are several ways to characterize MDS codes (the second of these is checked computationally in the sketch following the theorem):[9]

Theorem — Let [math]\displaystyle{ C }[/math] be a linear [[math]\displaystyle{ n,k,d }[/math]] code over [math]\displaystyle{ \mathbb{F}_q }[/math]. The following are equivalent:

  • [math]\displaystyle{ C }[/math] is an MDS code.
  • Any [math]\displaystyle{ k }[/math] columns of a generator matrix for [math]\displaystyle{ C }[/math] are linearly independent.
  • Any [math]\displaystyle{ n-k }[/math] columns of a parity check matrix for [math]\displaystyle{ C }[/math] are linearly independent.
  • [math]\displaystyle{ C^{\perp} }[/math] is an MDS code.
  • If [math]\displaystyle{ G = (I|A) }[/math] is a generator matrix for [math]\displaystyle{ C }[/math] in standard form, then every square submatrix of [math]\displaystyle{ A }[/math] is nonsingular.
  • Given any [math]\displaystyle{ d }[/math] coordinate positions, there is a (minimum weight) codeword whose support is precisely these positions.
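
As an illustration of the second characterization, the sketch below (the generator matrix is chosen only as an example) takes a [4,2,3] Reed–Solomon code over [math]\displaystyle{ \mathbb{F}_5 }[/math] and verifies that every choice of [math]\displaystyle{ k = 2 }[/math] columns of its generator matrix is linearly independent, i.e. has nonzero determinant modulo 5.

    from itertools import combinations

    q = 5
    # Generator matrix of a [4, 2, 3] Reed-Solomon code over GF(5):
    # row i holds the evaluations of x^i at the points 0, 1, 2, 3.
    G = [[1, 1, 1, 1],
         [0, 1, 2, 3]]
    k, n = len(G), len(G[0])

    def det2(col1, col2):
        # Determinant of the 2x2 matrix whose columns are the given columns of G.
        (a, c), (b, d) = col1, col2
        return a * d - b * c

    columns = list(zip(*G))
    # MDS characterization: every choice of k columns is linearly independent.
    assert all(det2(u, v) % q != 0 for u, v in combinations(columns, k))
    print('every', k, 'columns are independent, so d = n - k + 1 =', n - k + 1)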

The last of these characterizations permits, by using the MacWilliams identities, an explicit formula for the complete weight distribution of an MDS code.[10]

Theorem — Let [math]\displaystyle{ C }[/math] be a linear [[math]\displaystyle{ n,k,d }[/math]] MDS code over [math]\displaystyle{ \mathbb{F}_q }[/math]. If [math]\displaystyle{ A_w }[/math] denotes the number of codewords in [math]\displaystyle{ C }[/math] of weight [math]\displaystyle{ w }[/math], then [math]\displaystyle{ A_w = \binom{n}{w} \sum_{j=0}^{w-d} (-1)^j \binom{w}{j} (q^{w-d+1-j} -1) = \binom{n}{w}(q-1)\sum_{j=0}^{w-d} (-1)^j \binom{w-1}{j}q^{w-d-j}. }[/math]
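
The formula can be checked directly on a small example. The sketch below (the helper name mds_weight_count is hypothetical) evaluates the first expression for a [4,2,3] MDS code over [math]\displaystyle{ \mathbb{F}_5 }[/math] and confirms that the weights of all [math]\displaystyle{ q^k = 25 }[/math] codewords are accounted for.

    from math import comb

    def mds_weight_count(n, k, q, w):
        # Number of codewords of weight w in an [n, k, d] MDS code, d = n - k + 1.
        d = n - k + 1
        if w == 0:
            return 1
        if w < d:
            return 0
        return comb(n, w) * sum((-1) ** j * comb(w, j) * (q ** (w - d + 1 - j) - 1)
                                for j in range(w - d + 1))

    # Example: a [4, 2, 3] MDS code over GF(5), e.g. a short Reed-Solomon code.
    n, k, q = 4, 2, 5
    distribution = [mds_weight_count(n, k, q, w) for w in range(n + 1)]
    print(distribution)                  # [1, 0, 0, 16, 8]
    assert sum(distribution) == q ** k   # all 25 codewords are accounted for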

Arcs in projective geometry

The linear independence of the columns of a generator matrix of an MDS code permits a construction of MDS codes from objects in finite projective geometry. Let [math]\displaystyle{ PG(N,q) }[/math] be the finite projective space of (geometric) dimension [math]\displaystyle{ N }[/math] over the finite field [math]\displaystyle{ \mathbb{F}_q }[/math]. Let [math]\displaystyle{ K = \{P_1,P_2,\dots,P_m \} }[/math] be a set of points in this projective space represented with homogeneous coordinates. Form the [math]\displaystyle{ (N+1) \times m }[/math] matrix [math]\displaystyle{ G }[/math] whose columns are the homogeneous coordinates of these points. Then,[11]

Theorem — [math]\displaystyle{ K }[/math] is a (spatial) [math]\displaystyle{ m }[/math]-arc if and only if [math]\displaystyle{ G }[/math] is the generator matrix of an [math]\displaystyle{ [m,N+1,m-N] }[/math] MDS code over [math]\displaystyle{ \mathbb{F}_q }[/math].
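
For a small prime [math]\displaystyle{ q }[/math] this theorem can be checked computationally. The sketch below (the choice [math]\displaystyle{ q = 5 }[/math] and the helper det3 are illustrative assumptions) takes the [math]\displaystyle{ q+1 }[/math] points [math]\displaystyle{ (1, t, t^2) }[/math] together with [math]\displaystyle{ (0, 0, 1) }[/math], a classical [math]\displaystyle{ (q+1) }[/math]-arc in [math]\displaystyle{ PG(2,q) }[/math], and verifies that every three of the corresponding columns of [math]\displaystyle{ G }[/math] are linearly independent, so that [math]\displaystyle{ G }[/math] generates a [math]\displaystyle{ [q+1, 3, q-1] }[/math] MDS (extended Reed–Solomon) code.

    from itertools import combinations

    def det3(vectors):
        # 3x3 determinant by cofactor expansion (exact integer arithmetic).
        (a, b, c), (d, e, f), (g, h, i) = vectors
        return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

    q = 5   # a small prime, chosen only for illustration
    # The conic {(1, t, t^2)} plus (0, 0, 1): a (q+1)-arc in PG(2, q).
    points = [(1, t, (t * t) % q) for t in range(q)] + [(0, 0, 1)]

    # The columns of G are these homogeneous coordinates; the arc condition
    # "no three points collinear" says every 3 of these vectors are independent,
    # i.e. every 3 columns of G are linearly independent over GF(q).
    assert all(det3(triple) % q != 0 for triple in combinations(points, 3))
    print(f'[{q + 1}, 3, {q - 1}] MDS code over GF({q})')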

Notes

  1. Keedwell, A. Donald; Dénes, József (24 January 1991). Latin Squares: New Developments in the Theory and Applications. Amsterdam: Elsevier. p. 270. ISBN 0-444-88899-3. https://books.google.com/books?id=Ey8iXKkQpDkC&pg=PA270. 
  2. Ling & Xing 2004, p. 93
  3. Roman 1992, p. 175
  4. Pless 1998, p. 26
  5. Vermani 1996, Proposition 9.2
  6. Ling & Xing 2004, p. 94 Remark 5.4.7
  7. MacWilliams & Sloane 1977, Ch. 11
  8. Ling & Xing 2004, p. 94
  9. Roman 1992, p. 237, Theorem 5.3.7
  10. Roman 1992, p. 240
  11. Bruen, A.A.; Thas, J.A.; Blokhuis, A. (1988), "On M.D.S. codes, arcs in PG(n,q), with q even, and a solution of three fundamental problems of B. Segre", Invent. Math. 92 (3): 441–459, doi:10.1007/bf01393742, Bibcode: 1988InMat..92..441B
