Binary symmetric channel

A binary symmetric channel (or BSC) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver receives a bit. It is assumed that the bit is usually transmitted correctly, but that it will be "flipped" with a small probability (the "crossover probability"). This channel is used frequently in information theory because it is one of the simplest channels to analyze.

Description

The BSC is a binary channel; that is, it can transmit only one of two symbols (usually called 0 and 1). (A non-binary channel would be capable of transmitting more than two symbols, possibly even an infinite number of choices.) The transmission is not perfect, and occasionally the receiver gets the wrong bit.

This channel is often used by theorists because it is one of the simplest noisy channels to analyze. Many problems in communication theory can be reduced to a BSC. Conversely, being able to transmit effectively over the BSC can give rise to solutions for more complicated channels.

Definition

A binary symmetric channel with crossover probability p is a channel with binary input and binary output and probability of error p; that is, if X is the transmitted random variable and Y the received variable, then the channel is characterized by the conditional probabilities:

Pr(Y = 0 | X = 0) = 1 - p
Pr(Y = 0 | X = 1) = p
Pr(Y = 1 | X = 0) = p
Pr(Y = 1 | X = 1) = 1 - p

It is assumed that 0 ≤ p ≤ 1/2. If p > 1/2, then the receiver can swap the output (interpret 1 when it sees 0, and vice versa) and obtain an equivalent channel with crossover probability 1 - p ≤ 1/2.
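As an illustration of this definition, the short Python sketch below simulates passing a bit sequence through a BSC by flipping each bit independently with crossover probability p. The function name bsc_transmit and the example values are chosen here purely for illustration.

    import random

    def bsc_transmit(bits, p, rng=random):
        # Flip each bit independently with crossover probability p.
        return [b ^ (1 if rng.random() < p else 0) for b in bits]

    # Example: send ten bits through a BSC with crossover probability 0.1.
    sent = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
    received = bsc_transmit(sent, 0.1)
    errors = sum(s != r for s, r in zip(sent, received))
    print("received:", received, "bit errors:", errors)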

Capacity of the BSC

The capacity of the channel is 1 - H(p), where H(p) is the binary entropy function.
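For a concrete numerical illustration, the capacity formula 1 - H(p) can be evaluated directly; in the sketch below the helper names binary_entropy and bsc_capacity are illustrative rather than standard library functions.

    from math import log2

    def binary_entropy(p):
        # Binary entropy function H(p) in bits, with H(0) = H(1) = 0 by convention.
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p):
        # Capacity of a BSC with crossover probability p, in bits per channel use.
        return 1 - binary_entropy(p)

    print(bsc_capacity(0.0))   # 1.0 (noiseless channel)
    print(bsc_capacity(0.11))  # roughly 0.5
    print(bsc_capacity(0.5))   # 0.0 (output is independent of the input)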

The converse can be shown by a sphere-packing argument. Given a codeword, there are roughly 2^(nH(p)) typical output sequences. There are 2^n possible outputs in total, and the input chooses from a codebook of size 2^(nR). Therefore, the receiver would partition the output space into "spheres" of 2^n / 2^(nR) = 2^(n(1 - R)) potential outputs each. If R > 1 - H(p), then each sphere is smaller than the typical set around its codeword, so asymptotically the spheres are packed too tightly and the receiver cannot identify the correct codeword with vanishing probability of error.
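The counting behind this argument can be summarized in one line; the display below (written in LaTeX) is an illustrative restatement of the packing condition, not a formal proof.

    \[
    \underbrace{2^{nR}}_{\text{codewords}} \cdot
    \underbrace{2^{nH(p)}}_{\text{typical outputs per codeword}}
    \;\lesssim\;
    \underbrace{2^{n}}_{\text{all length-}n\text{ outputs}}
    \quad\Longrightarrow\quad
    R \le 1 - H(p).
    \]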
