Talk:56 kbit/s line
The article currently states: "The figure is derived from the bandwidth of 4 kHz allocated for such a channel and the 16-bit encoding (4000 times 16 = 64000) used to change analogue signals to digital, minus the 8000 bit/s used for signalling and supervision."
I'm not sure how the 56 kbps figure is actually derived, but I do know (and the DS0 article agrees with me) that POTS voice signals are sampled at 8 bits/sample and an 8 kHz sample rate, rather than 16 bits and 4 kHz as this article says.
I'm not sure, but I believe an argument could be made that 4 kHz is indeed the analogue bandwidth (which, by Nyquist, is exactly what the 8 kHz sample rate accommodates), but that would mess up my math, and I am at least pretty darned sure about the 8 bits/sample thing. :-)
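For reference, here is the arithmetic worked out with the standard DS0 parameters mentioned above (8 kHz sample rate, 8 bits/sample); I believe this, rather than the 16-bit/4 kHz figures in the article, is where the 56k number comes from:

\[
8000\ \tfrac{\text{samples}}{\text{s}} \times 8\ \tfrac{\text{bits}}{\text{sample}} = 64\,000\ \text{bit/s} \quad \text{(full DS0)}
\]
\[
8000\ \tfrac{\text{samples}}{\text{s}} \times 7\ \tfrac{\text{bits}}{\text{sample}} = 56\,000\ \text{bit/s} \quad \text{(top 7 bits of each sample only)}
\]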
Robbed bit signaling
This definition should also include a reference to robbed-bit signaling (RBS), as it too plays a part in the bandwidth limit.
Article rewritten
I have now substantially rewritten the article. My understanding is that 56k lines used only the most significant 7 bits of each sample period, to avoid interference from robbed-bit signaling in the least-significant bit of every sixth sample. A more modern system might have tried to phase-lock to the RBS, or to use error-correcting codes to mitigate its effects (as well as fixing any residual BER on the line), and thus could have achieved a rate of 64000 * 47/48 = 62666.6+ bps. However, that was way too complex a solution to the problem given the 1960s-era engineering available at the time 56k lines were created, and simply dropping the bottom bit was easier. -- The Anome 10:32, 12 October 2005 (UTC)
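Spelling out the arithmetic in the comment above (assuming the usual RBS framing, in which the least-significant bit of every sixth sample is robbed, i.e. 1 bit out of every 48):

\[
6\ \text{samples} \times 8\ \tfrac{\text{bits}}{\text{sample}} = 48\ \text{bits per RBS cycle}, \quad 48 - 1\ \text{robbed bit} = 47\ \text{usable bits}
\]
\[
64\,000 \times \tfrac{47}{48} \approx 62\,666.7\ \text{bit/s} \qquad \text{versus} \qquad 8000 \times 7 = 56\,000\ \text{bit/s}
\]

So tracking the RBS position would in principle recover about 6.7 kbit/s over the simpler approach of discarding the bottom bit of every sample.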