Zyablov bound


The Zyablov bound is a lower bound on the rate R and relative distance \delta achievable by concatenated codes.

Statement of the bound

Let \delta be the relative distance of the concatenated code. Then its rate \mathcal{R} satisfies the following bound.

\mathcal{R} \geqslant \max\limits_{0 \leqslant r \leqslant 1 - H_q(\delta + \varepsilon)} r \left (1 - {\delta \over {H_q ^{-1}(1 - r) - \varepsilon}} \right )

where r is the rate of the inner code C_{in}.
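The bound can be evaluated numerically. The following is a minimal Python sketch (the function names, the bisection inversion of H_q, and the grid search over r are illustrative choices, not part of the original statement): it inverts the q-ary entropy function by bisection and maximizes the right-hand side over the allowed range of r.

```python
import math

def entropy_q(x, q):
    """q-ary entropy function H_q(x) for 0 <= x <= 1 - 1/q."""
    if x == 0:
        return 0.0
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def entropy_q_inv(y, q, tol=1e-12):
    """Invert H_q on [0, 1 - 1/q] by bisection (H_q is increasing there)."""
    lo, hi = 0.0, 1.0 - 1.0 / q
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if entropy_q(mid, q) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def zyablov_rate(delta, q=2, eps=1e-6, steps=2000):
    """Maximize r * (1 - delta / (H_q^{-1}(1 - r) - eps))
    over 0 <= r <= 1 - H_q(delta + eps), on a grid of `steps` points."""
    r_max = 1.0 - entropy_q(delta + eps, q)
    best = 0.0
    for i in range(1, steps):
        r = r_max * i / steps
        d_in = entropy_q_inv(1.0 - r, q) - eps
        if d_in > delta:
            best = max(best, r * (1.0 - delta / d_in))
    return best
```

As a sanity check, for any 0 < \delta < 1 - 1/q the returned rate is positive but strictly below the Gilbert–Varshamov rate 1 - H_q(\delta), since concatenation pays for the inner code's rate.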

Description

Let C_{out} be the outer code, C_{in} be the inner code.

Suppose C_{out} meets the Singleton bound with rate R, i.e. C_{out} has relative distance \delta_{out} = 1 - R. In order for C_{out} \circ C_{in} to be an asymptotically good code, C_{in} also needs to be an asymptotically good code, which means C_{in} needs to have rate r>0 and relative distance \delta_{in}>0.

Suppose C_{in} meets the Gilbert–Varshamov bound with rate r, and thus with relative distance

\delta_{in} \geqslant H_q ^{-1}(1 - r) - \varepsilon, \qquad \varepsilon>0,

then C_{out} \circ C_{in} has rate rR and relative distance \delta = (1 - R) \left (H_q^{-1}(1 - r) - \varepsilon \right ).

Expressing R as a function of \delta and r,

R = 1 - \frac{\delta}{H_q^{-1}(1-r) - \varepsilon}

Then, optimizing over the choice of r, we get that the rate of the concatenated code satisfies

\mathcal{R} \geqslant \max\limits_{0\leqslant r\leqslant {1- H_q(\delta + \varepsilon)}} r \left ( 1 - {\delta \over {H_q ^{-1}(1 - r) - \varepsilon}} \right )

This lower bound is called the Zyablov bound (the restriction r < 1 - H_q (\delta + \varepsilon) is necessary to ensure \mathcal{R}>0). See Figure 2 for a plot of this bound.

Note that the Zyablov bound implies that for every \delta>0, there exists a (concatenated) code with rate \mathcal{R}>0.

Remarks

We can construct a code that achieves the Zyablov bound in polynomial time. In particular, we can construct explicit asymptotically good codes (over some alphabets) in polynomial time.

Linear codes will help us complete the proof of the above statement, since linear codes have a polynomial-size representation. Let C_{out} be an [N, K]_{Q} Reed–Solomon error correction code where N = Q-1 (the evaluation points being \mathbb{F}_{Q}^*), with Q = q^k; then k = \Theta(\log N).
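As an illustration of the outer code, here is a minimal Reed–Solomon encoder sketch in Python. For simplicity it works over a prime field \mathbb{F}_Q, whereas the construction above uses an extension field Q = q^k; this restriction is only to keep the example dependency-free. The message is read as the coefficients of a polynomial of degree less than K, which is evaluated at all N = Q - 1 nonzero field elements.

```python
def rs_encode(msg, Q):
    """Encode `msg` (K coefficients in F_Q, Q prime) into an [N, K]_Q
    Reed-Solomon codeword with N = Q - 1, by evaluating the message
    polynomial at every nonzero element of F_Q."""
    return [sum(c * pow(x, i, Q) for i, c in enumerate(msg)) % Q
            for x in range(1, Q)]
```

Since two distinct polynomials of degree less than K agree on at most K - 1 points, any two codewords differ in at least N - K + 1 positions, so the code meets the Singleton bound.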

We need to construct an inner code that lies on the Gilbert–Varshamov bound. This can be done in one of two ways:

  1. To perform an exhaustive search over all generator matrices until one with the required property for C_{in} is found. This works because the Varshamov bound states that there exists a linear code that lies on the Gilbert–Varshamov bound, but the search takes q^{O(kn)} time. Using k=rn we get q^{O(kn)}=q^{O(k^{2})}=N^{O(\log N)}, which is upper bounded by (nN)^{O(\log nN)}, a quasi-polynomial time bound.
  2. To construct C_{in} in q^{O(n)} time and use (nN)^{O(1)} time overall. This can be achieved by applying the method of conditional expectations to the proof that a random linear code lies on the bound with high probability.

Thus we can construct a code that achieves the Zyablov bound in polynomial time.
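A toy version of the exhaustive search in item 1 can be written down directly. The sketch below handles the binary case only, and the parameter choices in the usage are illustrative: rows of a generator matrix are represented as n-bit integers, and all k × n matrices are scanned for one whose code has minimum distance at least d.

```python
from itertools import product

def min_distance(rows, n):
    """Minimum Hamming weight over nonzero messages of the binary code
    generated by `rows` (each row an n-bit integer); 0 if the rows are
    linearly dependent (i.e. the code has dimension < len(rows))."""
    best = n
    for msg in range(1, 1 << len(rows)):
        cw = 0
        for i, row in enumerate(rows):
            if (msg >> i) & 1:
                cw ^= row  # codeword = XOR of the selected rows
        if cw == 0:
            return 0  # dependent rows: not a full-dimension code
        best = min(best, bin(cw).count("1"))
    return best

def search_code(n, k, d):
    """Brute-force search over all k x n binary generator matrices for an
    [n, k, >= d] code -- the q^{O(kn)}-time route described above."""
    for rows in product(range(1, 1 << n), repeat=k):
        if min_distance(rows, n) >= d:
            return rows
    return None
```

For instance, the Varshamov bound guarantees a binary [5, 2, 3] code exists (1 + \binom{4}{1} = 5 < 2^{5-2} = 8), and `search_code(5, 2, 3)` finds one; the exponential blow-up in k and n is exactly why this route is only quasi-polynomial.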
