AES/EBU

The digital audio standard frequently called AES/EBU, officially known as AES3, is used for carrying digital audio signals between various devices. It was developed by the Audio Engineering Society (AES) and the European Broadcasting Union (EBU) and first published in 1992, later revised in 1995, 1998, and 2003. Several different physical connectors are also defined as part of the overall group of standards. A related system, S/PDIF, was developed essentially as a consumer version of AES/EBU, using connectors more commonly found in the consumer market. These are now part of the expanded AES3 standard as well.

Hardware Connections

The AES3 standard parallels part 4 of the international standard IEC 60958. Of the physical interconnection types defined by IEC 60958, three are in common use:

  • Type I Balanced: 3-conductor, 110-ohm twisted-pair cable with XLR connectors, used in professional installations
  • Type II Unbalanced: 2-conductor, 75-ohm coaxial cable with RCA connectors, used in consumer audio
  • Type II Optical: optical fibre, usually plastic but occasionally glass, with F05 connectors, also used in consumer audio (TOSLINK)

More recently, professional equipment (notably by Sony) has been fitted with BNC connectors for use with 2-conductor, 75-ohm coaxial cable. This uses the same cabling, patching and infrastructure as analogue or digital video.

F05 connectors, 5mm connectors for plastic optical fiber, are more commonly known by their Toshiba brand name, TOSLINK. The precursor of the IEC 60958 Type II specification was the Sony/Philips Digital Interface, or S/PDIF. For details on the format of AES/EBU data, see the article on S/PDIF. Note that the electrical levels differ between AES/EBU and S/PDIF.

For information on the synchronization of digital audio structures, see the AES11 standard. The ability to insert unique identifiers into an AES3 bit stream is covered by the AES52 standard.

Other AES3 transport structures

AES3 digital audio can also be carried over an Asynchronous Transfer Mode (ATM) network. The standard for packing AES3 frames into ATM cells is AES47, which is also published as IEC 62365.

The Protocol

Simple representation of the protocol for both AES/EBU and S/PDIF

The low-level protocol for data transmission in AES/EBU and S/PDIF is largely identical, and the following discussion applies to S/PDIF as well unless otherwise noted.

AES/EBU was designed primarily to support PCM-encoded audio in either DAT format at 48 kHz or CD format at 44.1 kHz. No attempt was made to use a carrier able to support both rates; instead, AES/EBU allows the data to be run at any rate, encoding it with biphase mark code (BMC) so that the clock can be recovered from the data stream itself.
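
As an illustration of how biphase mark coding makes the stream self-clocking, the following Python sketch (illustrative only; the function name and bit representation are not from the standard) encodes data bits as half-cell line levels: the level toggles at every cell boundary, and toggles again mid-cell for a logical 1.

    def bmc_encode(bits, initial_level=0):
        """Biphase mark coding: two half-cell levels per data bit.

        The level always inverts at the start of a bit cell; for a '1'
        it inverts again in the middle of the cell, for a '0' it does not.
        """
        level = initial_level
        half_cells = []
        for bit in bits:
            level ^= 1              # transition at every cell boundary
            half_cells.append(level)
            if bit:
                level ^= 1          # extra mid-cell transition encodes a 1
            half_cells.append(level)
        return half_cells

    # Example: 1, 0, 1 starting from level 0
    print(bmc_encode([1, 0, 1]))    # [1, 0, 1, 1, 0, 1]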

The bit stream consists of the PCM audio data broken down into samples and inserted into a larger structure that also carries various status and information data. The highest level of organization is the audio block. Each block is broken into 192 frames numbered 0 to 191, and each frame is further divided into two subframes (or channels): A (left) and B (right). Each subframe carries one sample of PCM audio for one channel and is organized into 32 time slots numbered 0 to 31, each of which corresponds roughly to a single bit.

Not all of the time slots are used to send actual audio data; a number of them are set aside for signalling use, and others for transmitting data about the channels. In normal use only 20 time slots carry audio, providing 20-bit sound quality (compare with a CD at 16 bits per sample). A complete audio block therefore contains 192 samples from each of the two audio channels plus the associated status and user data, 12,288 bits in total.
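
As a sanity check on these figures, a short illustrative calculation (the constant names are not from the standard):

    # Sizes of the AES3 transport hierarchy described above.
    SLOTS_PER_SUBFRAME = 32      # time slots 0..31
    SUBFRAMES_PER_FRAME = 2      # subframe A (left) and B (right)
    FRAMES_PER_BLOCK = 192       # frames 0..191

    bits_per_frame = SLOTS_PER_SUBFRAME * SUBFRAMES_PER_FRAME   # 64
    bits_per_block = bits_per_frame * FRAMES_PER_BLOCK          # 12288

    # One frame is sent per sample period, so at 48 kHz the raw data
    # rate is 64 * 48000 = 3.072 Mbit/s (before biphase mark coding).
    line_rate_48k = bits_per_frame * 48_000
    print(bits_per_block, line_rate_48k)   # 12288 3072000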

The 32 time slots of each subframe are used as follows:

Time slots 0 to 3

These time slots do not carry any audio data; they facilitate clock recovery and subframe identification. They are not BMC encoded, so they are unique in the data stream and easier to recognize, but they do not represent real bits. Their structure minimizes the DC component on the transmission line. Three preambles are possible:

  • X (or M) : 11100010 if previous state was "0", 00011101 if it was "1"
  • Y (or W) : 11100100 if previous state was "0", 00011011 if it was "1"
  • Z (or B) : 11101000 if previous state was "0", 00010111 if it was "1"

They are called X, Y and Z in the AES standard, and M, W and B in IEC 958 (now IEC 60958).
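
The preamble patterns can be summarized as a simple lookup keyed on the previous line state, as in the following illustrative sketch (the names and representation are not from the standard):

    # Preamble patterns as eight half-cell levels, selected by the level
    # of the line at the end of the previous subframe.
    PREAMBLES = {
        # name: (pattern if previous state was 0, pattern if it was 1)
        "X": ("11100010", "00011101"),   # M: start of subframe A
        "Y": ("11100100", "00011011"),   # W: start of subframe B
        "Z": ("11101000", "00010111"),   # B: subframe A at the start of a block
    }

    def preamble_for(name, previous_state):
        """Return the preamble pattern for a given previous line state."""
        low, high = PREAMBLES[name]
        return low if previous_state == 0 else high

    print(preamble_for("Z", 1))   # '00010111'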

Time slots 4 to 7

These time slots can carry auxiliary information such as a low-quality auxiliary audio channel for producer talkback or studio-to-studio communication. Alternatively, they can be used to extend the audio word length to 24 bits, although the devices at both ends of the link must support this format.

Time slots 8 to 27

These time slots carry 20 bits of audio information, starting with the LSB and ending with the MSB. If the source provides fewer than 20 bits, the unused LSBs are set to logical "0" (for example, for 16-bit audio read from CDs, time slots 8 to 11 are set to 0).
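
The following illustrative sketch shows how a receiver might assemble the sample value from a subframe, represented here simply as a list of 32 bits (this representation and the function name are assumptions, not part of the standard); it also covers the 24-bit case in which the auxiliary slots 4 to 7 act as the four least significant bits:

    def audio_sample(subframe_bits, word_length=20):
        """Assemble the audio sample from a subframe given as 32 bits."""
        if word_length == 24:
            slots = subframe_bits[4:28]    # aux slots become the 4 LSBs
        else:
            slots = subframe_bits[8:28]    # standard 20-bit audio field
        value = 0
        for i, bit in enumerate(slots):    # slots are transmitted LSB first
            value |= bit << i
        return value

    # 16-bit audio from a CD: time slots 8-11 are zero, slots 12-27
    # carry the 16 significant bits.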

Time slots 28 to 31

These time slots carry associated bits as follows:

  • V (28) Validity bit: it is set to zero if the audio sample word data are correct and suitable for D/A conversion. Otherwise, the receiving equipment is instructed to mute its output during the presence of defective samples. It is used by players when they have problems reading a sample.
  • U (29) User bit: carries any kind of user data, such as running time, song or track number. One bit per audio channel per frame forms a serial data stream, so each audio block carries a 192-bit user data word per channel.
  • C (30) Channel status bit: its structure depends on whether AES/EBU or S/PDIF is used.
  • P (31) Parity bit: for error detection. The parity bit is set so that time slots 4 to 31 carry an even number of ones, permitting the detection of an odd number of errors resulting from malfunctions in the interface (a minimal check is sketched below).
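
A minimal illustrative check of this parity rule (function names and the list-of-bits representation are assumptions, not from the standard):

    def parity_ok(subframe_bits):
        """Check even parity over time slots 4..31 (preamble excluded)."""
        return sum(subframe_bits[4:32]) % 2 == 0

    def set_parity(subframe_bits):
        """Set slot 31 so that slots 4..31 carry an even number of ones."""
        subframe_bits[31] = sum(subframe_bits[4:31]) % 2
        return subframe_bits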

The Channel Status Bit in AES/EBU

As stated above, there is one channel status bit in each subframe, giving a 192-bit word per channel in each audio block. This means that there are 192/8 = 24 bytes available in each audio block. The contents of the channel status word are completely different between AES/EBU and S/PDIF. For AES/EBU, the standard describes in detail how most of the bits are to be used; what follows is only a high-level overview of the purpose of each of the 24 bytes (an illustrative packing sketch follows the list):

  • byte 0: basic control data: whether the data is compressed or not, whether any kind of emphasis is applied, and the sampling rate.
  • byte 1: indicates whether the audio stream is stereo, mono or some other combination.
  • byte 2: it indicates the audio word length.
  • byte 3: it is used only for multichannel applications.
  • byte 4: it indicates the suitability of the signal as a sampling rate reference.
  • byte 5: reserved.
  • bytes 6 - 9 and 10 - 13: these two slots of four bytes each are used to transmit ASCII characters.
  • bytes 14 - 17: these four bytes form a 32-bit sample address that increments every frame and is used to number the frames.
  • bytes 18 - 21: as above, but in a different format, relative to the time of day (counted from zero at midnight).
  • byte 22: it contains information about the reliability of the channel status data.
  • byte 23: the final byte is a CRC (cyclic redundancy check) over the channel status word. If it is not received, the transmission was interrupted before the end of the audio block, and the block is ignored.
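
As a rough illustration of how the 192 channel status bits of one channel accumulate into these 24 bytes, the following sketch packs them into a byte array (the bit-within-byte ordering shown is a representational assumption, not mandated here):

    def channel_status_bytes(c_bits):
        """Pack the 192 channel status bits of one channel into 24 bytes.

        One C bit arrives per subframe; the bit carried in frame 0 of the
        audio block becomes the first bit of byte 0.
        """
        assert len(c_bits) == 192
        out = bytearray(24)
        for i, bit in enumerate(c_bits):
            out[i // 8] |= bit << (i % 8)
        return bytes(out)

    # Example with dummy data: byte 0 is the basic control byte,
    # byte 23 the trailing CRC.
    status = channel_status_bytes([0] * 192)
    basic_control, crc = status[0], status[23]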
