Error Correcting Codes
Author: Prof. Adnan Aziz
ECC: widely used in communication and storage
2 types of ECC:
block coding
convolutional coding
info seq divided into blocks of k bits
u = (u0, u1, u2, ..., u_{k-1}) "message"
2^k possible messages
BLOCK CODES:
each u independently transformed into an n-tuple v = (v0, v1, v2, ..., v_{n-1}) "codeword"
- combinational function
(n, k) block code
rate R = k/n
each n-tuple codeword carries (n-k) redundant bits -> used to fight channel noise
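To make the (n, k) block code idea concrete, here is a minimal sketch in Python of a (3, 2) single-parity-check code; the code choice and the names `encode` and `codebook` are illustrative, not from the notes.

```python
# Toy (3, 2) single-parity-check block code: each k = 2 bit message u is
# mapped combinationally to an n = 3 bit codeword v by appending the
# parity bit u0 XOR u1. Rate R = k/n = 2/3; the single redundant
# (n - k = 1) bit lets the receiver detect any one-bit error.

def encode(u):
    """Map a 2-bit message tuple to a 3-bit codeword tuple."""
    u0, u1 = u
    return (u0, u1, u0 ^ u1)

# The full codebook: 2^k = 4 messages, each with its codeword.
codebook = {u: encode(u) for u in [(0, 0), (0, 1), (1, 0), (1, 1)]}
```

Every codeword has even parity, so flipping any single bit is detectable.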
Table
conv code:
also takes k-bit blocks and produces n-bit blocks
however its output is in "sequential" format: the encoder is an FSM, so each output block depends on previous inputs
same rate concept: R = k/n
e.g. Fig. 1.3
We will focus on block codes
Given a block code, what is the best possible decoding?
Decode? Intuitively, we want to choose the v that maximizes P(r|v)
Rationale
Input: received word r
want to recover u from r
u depends only on v (coding is injective)
so this is equivalent to recovering v from r (easier conceptually)
let v̂ be the decoder's estimate; an error E occurs when v̂ ≠ v
P(E|r) = P(v̂ ≠ v | r)
P(E) = Σ_r P(E|r)P(r) = Σ_r P(E ∩ r)
to minimize P(E), minimize each P(E|r) independently
since P(E|r) = P(v̂ ≠ v | r) = 1 - P(v̂ = v | r)
min P(E|r) = max P(v̂ = v | r)
choose as v̂ the v maximizing P(v|r) = P(r|v)P(v)/P(r) (MAP decoding)
if all information sequences are equally likely, P(v) is constant (and P(r) does not depend on v)
so we want to maximize P(r|v) (MLD: maximum likelihood decoding)
for a memoryless channel: P(r|v) = Π_i P(ri|vi)
max x <-> max log x (log is monotone)
log P(r|v) = Σ_i log P(ri|vi)
MLD for BSC: intuitively want minimum distance
Here's a rigorous argument why minimum-distance decoding is exactly MLD on the BSC:
let d(r, v) = Hamming distance from r to v, i.e. ri ≠ vi in exactly d(r, v) positions
ri ≠ vi -> P(ri|vi) = p
ri = vi -> P(ri|vi) = 1 - p
log P(r|v) = d(r, v) log p + [n - d(r, v)] log(1 - p)
           = d(r, v) log(p/(1 - p)) + n log(1 - p)
for p < 1/2, log(p/(1 - p)) < 0, so maximizing log P(r|v) means minimizing d(r, v)
-> want a minimum-distance decoder
Shannon: every channel has a capacity C
for any rate R < C, there exists a code of rate R which, under MLD,
has "arbitrarily small error probability" -> what does this mean??
holds for both block and conv codes
BSC: C = 1 - H(p), where H(p) = -[p log p + (1 - p) log(1 - p)]
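A quick sketch of the BSC capacity formula (function names are mine):

```python
import math

def binary_entropy(p):
    """H(p) = -[p log2 p + (1 - p) log2 (1 - p)], with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H(p)."""
    return 1 - binary_entropy(p)

c_noiseless = bsc_capacity(0.0)  # 1 bit per channel use
c_useless = bsc_capacity(0.5)    # 0: output is independent of input
```

Note the endpoints: a noiseless channel (p = 0) carries a full bit per use, while p = 1/2 carries nothing.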
argument: pick a "random function" mapping B^k -> B^n
Proof:
use simple bounds, e.g. the union bound (P(A ∪ B) <= P(A) + P(B))
Bad: in practice we can never pick the function at random:
huge k, n (the codebook has 2^k entries and no structure to exploit)
MLD decoding then requires comparing r against all 2^k codewords
Error types: random-error channels, e.g. the Binary Symmetric Channel (BSC)
space communication, wireless line-of-sight | memoryless
channels with memory: e.g. a good state / bad state model
cellular multipath
magnetic recording
simple solution: shuffle the bits, i.e. interleave them (Can you think of a good way to do this?)
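One standard way to shuffle the bits is a block interleaver. A sketch, assuming a simple write-by-row / read-by-column layout (names are illustrative):

```python
# Block interleaver: write bits into a rows x cols array row by row,
# read them out column by column. A burst of channel errors then lands
# on bits that were far apart originally, so after de-interleaving the
# burst is spread across many codewords as scattered single errors.

def interleave(bits, rows, cols):
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    # Reading columns of a rows x cols array is the same as writing
    # rows of a cols x rows array, so the inverse swaps the dimensions.
    return interleave(bits, cols, rows)

data = list(range(6))              # stand-in for 6 bits
shuffled = interleave(data, 2, 3)  # [0, 3, 1, 4, 2, 5]
```

Two adjacent channel positions now carry bits that were `cols` apart in the original stream.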
Focus: "F.E.C." (forward error correction)
Block diagram has 1-way communication -> must use F.E.C.
2-way alternative ->
Use ARQ (automatic repeat request):
· add parity checks (not just one bit)
to detect errors; very good schemes exist for this
· have rx tell tx to retransmit
stop-and-wait ARQ
continuous ARQ
Advantage of ARQ: cheap HW, adaptive
Disadvantage: at high error rates -> lots of retransmits, latency
FEC and ARQ are often combined
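A toy simulation of stop-and-wait ARQ, with the channel and the receiver's error detection modeled abstractly (all names and the error model are hypothetical, for illustration only):

```python
import random

# Stop-and-wait ARQ sketch: the transmitter sends one block, waits for
# the receiver's verdict, and retransmits until the parity check passes.
# The channel is modeled as corrupting a block with fixed probability.

def send_with_arq(block, error_rate, rng, max_tries=100):
    """Return how many transmissions were needed to deliver `block`."""
    for attempt in range(1, max_tries + 1):
        corrupted = rng.random() < error_rate  # channel corrupts block?
        if not corrupted:                      # rx parity check passes
            return attempt                     # rx ACKs; tx moves on
        # otherwise rx detects the error and asks tx to retransmit
    raise RuntimeError("channel too noisy: gave up after max_tries")

tries = send_with_arq(b"data", error_rate=0.3, rng=random.Random(0))
```

The adaptivity is visible here: on a clean channel almost every block takes one transmission; as `error_rate` grows, retransmissions (and latency) pile up, which is the disadvantage noted above.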
Code Performance
block/word error rate vs. bit error rate (BER)
coding gain: for a given BER, the difference (in dB) between the coded & uncoded systems; 3 dB coding gain is typical
-> ½ the power!! (Tx power)
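The "3 dB is half the power" claim is just decibel arithmetic:

```python
import math

# dB = 10 * log10(P1 / P2). A power ratio of 2 is therefore
# 10 * log10(2) ≈ 3.01 dB, so a 3 dB coding gain means the coded
# system reaches the same BER with about half the transmit power.

gain_db = 10 * math.log10(2)   # power ratio 2 expressed in dB
power_ratio = 10 ** (3 / 10)   # 3 dB converted back to a power ratio
```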
Block Codes: rather than an arbitrary function, we would like a coding function that is easier to implement (and has good error-correcting capability)
Linear Block Codes (focus on the set of codewords)
Definition: a block code of length n with 2^k codewords is called a linear (n, k) code iff the codewords form a k-dimensional vector subspace of the n-tuples over GF(2)
101101 ⊕ 100011 = 001110
1·(101101) = (101101)
0·(101101) = (000000)
Definition:
a1, a2, ..., al are linearly independent iff c1a1 ⊕ c2a2 ⊕ ... ⊕ clal = 0 -> ci = 0 for all i
scalar multiplication distributes over vector addition: c·(a ⊕ b) = c·a ⊕ c·b
Fact: a code is linear <-> the sum of any two codewords is a codeword
for a scalar a and vectors b, c:
a·(b ⊕ c) = (a·b) ⊕ (a·c)
Fact: we can find k linearly independent codewords g0, g1, ..., g_{k-1} in C such that
v = u0g0 ⊕ u1g1 ⊕ ... ⊕ u_{k-1}g_{k-1} for every v in C
v = u·G, the encoding function (G is the k x n "generator matrix" whose rows are the gi)
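A sketch of the encoding map v = u·G in code; the generator matrix shown is a standard systematic G for the (7, 4) Hamming code, chosen purely as an example.

```python
# Encoding a linear block code: v = u . G over GF(2), where addition
# is XOR. G below is a systematic generator matrix [I4 | P] for the
# (7, 4) Hamming code, so the first k bits of v echo the message.

G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(u, G):
    """v = u0*g0 + u1*g1 + ... + u_{k-1}*g_{k-1}, all arithmetic mod 2."""
    n = len(G[0])
    return [sum(ui * gi[j] for ui, gi in zip(u, G)) % 2 for j in range(n)]

v = encode([1, 0, 1, 1], G)  # systematic: v[:4] is the message itself
```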
Systematic code: codeword = redundant check bits | message bits
row operations on G preserve the code, e.g. replacing g0 by g0 ⊕ g1:
u0(g0 ⊕ g1) ⊕ u1g1 = u0g0 ⊕ u1g1 ⊕ u0g1 = u0g0 ⊕ (u0 ⊕ u1)g1
the same set of codewords results, so G can be brought to systematic form
Fact: for any k x n G with k linearly independent rows [the gi are linearly independent]
there is an (n-k) x n matrix H with n-k linearly independent rows, the "Parity Check Matrix",
such that every row hi of H is orthogonal to every row gj of G:
hi·gj = 0 (mod 2)
so any codeword is orthogonal to all rows of H
i.e. vH^T = 0 for all v in C
the converse is also true: any n-tuple v with vH^T = 0 is a codeword
if we receive r when v was sent, the error pattern is e = r ⊕ v (BSC)
the decoder computes the syndrome r·H^T, which is 0 iff r is a codeword
if the parity check fails, flip as few bits as possible in r to make it pass (minimum-distance decoding again)