Musical Instrument Digital Interface

From Wikipedia, the free encyclopedia

MIDI (Musical Instrument Digital Interface, IPA: /ˈmɪdi/) is an industry-standard protocol that enables electronic musical instruments, computers, and other equipment to communicate, control, and synchronize with each other. MIDI allows computers, synthesizers, MIDI controllers, sound cards, samplers and drum machines to control one another, and to exchange system data.

Note names and MIDI note numbers.

MIDI does not transmit an audio signal or media — it transmits digital data "event messages" such as the pitch and intensity of musical notes to play, control signals for parameters such as volume, vibrato and panning, cues, and clock signals to set the tempo. As an electronic protocol, it is notable for its widespread adoption throughout the industry, and for continuing in use since its introduction in 1983.

History

By the end of the 1970s, electronic musical devices were becoming increasingly common and affordable. However, devices from different manufacturers were generally not compatible with each other and could not be interconnected. Interfacing schemes included analog control voltages at various standards (such as 1 volt per octave, or the logarithmic "hertz per volt"); analog clock, trigger and "gate" signals (in both positive "V-trig" and negative "S-trig" varieties, ranging from −15 V to +15 V); and proprietary digital interfaces such as Roland Corporation's DCB (Digital Control Bus), the Oberheim system, and Yamaha's "keycode" system. In 1981, audio engineer and synthesizer designer Dave Smith of Sequential Circuits, Inc. proposed a digital standard for musical instruments in a paper for the Audio Engineering Society. The MIDI Specification 1.0 was published in August 1983.

Since then, MIDI technology has been standardized and is maintained by the MIDI Manufacturers Association (MMA). All official MIDI standards are jointly developed and published by the MMA in Los Angeles, California, USA (http://www.midi.org) and, for Japan, by the MIDI Committee of the Association of Musical Electronics Industry (AMEI) in Tokyo (http://www.amei.or.jp). The primary reference for MIDI is The Complete MIDI 1.0 Detailed Specification, document version 96.1, available only from the MMA in English or from AMEI in Japanese.

In 1991, the MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) was ratified by the MIDI Manufacturers Association. The MSC protocol is an industry standard which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media — it simply transmits digital data providing information such as the type, timing and numbering of technical cues called during a multimedia or live theatre performance.

In the early 1980s, MIDI was a major factor in bringing an end to the "wall of synthesizers" phenomenon in progressive rock band concerts, when keyboard performers were often hidden behind huge banks of analog synthesizers and electric pianos. Following the advent of MIDI, many synthesizers were released in rack-mount versions, which meant that keyboardists could control many different instruments (e.g., synthesizers) from a single keyboard.

In the 1980s, MIDI facilitated the development of hardware and computer-based sequencers, which can be used to record, edit and play back performances. In the years immediately after the 1983 ratification of the MIDI specification, MIDI interfaces were released for the Apple Macintosh, Commodore 64, and PC-DOS platforms, creating a market for powerful, inexpensive, and now-widespread computer-based MIDI sequencers. The Atari ST came equipped with MIDI ports as standard and was commonly used in recording studios for this reason. Synchronization of MIDI sequences is made possible by MIDI timecode, an implementation of the SMPTE time code standard using MIDI messages; MIDI timecode has become the standard for digital music synchronization.

A number of music file formats have been based on the MIDI bytestream. These formats are very compact; a file as small as 10 KB can produce a full minute of music or more, because the file stores instructions for recreating the sound on a MIDI synthesizer rather than an exact waveform to be reproduced. The MIDI synthesizer may be built into an operating system, a sound card, or an embedded device (e.g. a hardware-based synthesizer), or may be implemented entirely in software. The file format stores information on what note to play and when, along with other important information such as pitch-bend during the envelope of the note and the note's velocity.

This is advantageous for applications such as mobile phone ringtones and some video games. For other applications it may be a disadvantage, because the file cannot guarantee that the intended listener will hear an accurate waveform: each MIDI synthesizer has its own method of producing sound from the MIDI instructions it receives. For example, any MIDI file played back through the Microsoft MIDI Synthesizer (included in every Windows operating system) should sound the same or similar, but when the same MIDI bytestream is sent to the synthesizer on a generic sound card, or to a MIDI synthesizer on another operating system, the rendered result may vary. One sound card's synthesizer might not reproduce the exact sounds of another.

As such, MIDI-based mobile phone ring tones sound different on a handset than when previewed on a PC. In the same way, most modern software synthesizers can handle MIDI files but may render them completely differently from one another, especially since most modern software synthesizers, such as VST instruments, allow different patches to be loaded and modified to create different sounds for each MIDI input. The term "MIDI sound" has acquired a poor reputation among some critics, which may be the result of the poor-quality sound synthesis provided by many early sound cards, which relied on FM synthesis instead of wavetables to produce audio.

Almost all music recordings in the 2000s are produced using MIDI-compatible devices at some stage. In addition, MIDI is used to control hardware including recording devices and sound effects modules, as well as live performance equipment such as stage lights and some types of digital effects pedals.

Interfaces

MIDI connector diagram

All MIDI In and MIDI Out connectors are part of a MIDI interface. A MIDI interface converts internal binary data into MIDI messages for transmission from the MIDI Out connector to another device's MIDI In connector, and converts incoming MIDI messages arriving on the MIDI In connector (from another device's MIDI Out connector) back into internal binary data. Many MIDI-compatible instruments also have a MIDI Thru connector, which passes along the MIDI data received at the instrument's MIDI In connector so that a second instrument can be connected. As an alternative to chaining instruments via MIDI Thru ports, a MIDI "patch bay", "mult" or "Thru" box, consisting of one MIDI In connector and multiple MIDI Out connectors, can feed several instruments directly. Physically, MIDI connectors are 5-pin DIN connectors with a 180° pin arrangement.

All MIDI-compatible instruments have a built-in MIDI interface. Some computer sound cards have a built-in MIDI interface, whereas others require an external MIDI interface connected to the computer via the game port (a DA-15 connector), the newer USB connector, FireWire, or Ethernet.

Messages

All MIDI-compatible controllers, musical instruments, and MIDI-compatible software follow the same MIDI 1.0 specification, and thus interpret any given MIDI message the same way; as a result, they can communicate with and understand each other. For example, if a note is played on a MIDI controller, it will sound at the right pitch on any MIDI instrument whose MIDI In connector is connected to the controller's MIDI Out connector.

When a musical performance is played on a MIDI instrument (or controller) it transmits MIDI channel messages from its MIDI Out connector. A typical MIDI channel message sequence corresponding to a key being struck and released on a keyboard is:

  1. The user presses the middle C key with a specific velocity (usually translated into the volume of the note, but also usable by the synthesizer to set characteristics of the timbre). ---> The instrument sends one Note-On message.
  2. The user changes the pressure applied on the key while holding it down - a technique called Aftertouch (can be repeated, optional). ---> The instrument sends one or more Aftertouch messages.
  3. The user releases the middle C key, again with the possibility of velocity of release controlling some parameters. ---> The instrument sends one Note-Off message.

Note-On, Aftertouch, and Note-Off are all channel messages. For the Note-On and Note-Off messages, the MIDI specification defines a number (from 0 to 127) for every possible note pitch (C, C♯, D, etc.), and this number is included in the message.

Other performance parameters can be transmitted with channel messages, too. For example, if the user turns the pitch wheel on the instrument, that gesture is transmitted over MIDI using a series of Pitch Bend messages (also a channel message). The musical instrument generates the messages autonomously; all the musician has to do is play the notes (or make some other gesture that produces MIDI messages). This consistent, automated abstraction of the musical gesture could be considered the core of the MIDI standard.

Composition

MIDI composition takes advantage of the MIDI interface to allow musical data files to be shared among various electronic instruments by using a standard list of commands and parameters known as General MIDI (GM). Because the music is simply data rather than recorded waveforms, the files are small. Several computer programs allow manipulation of the data, making it possible to compose for an entire orchestra of synthesized instrument sounds. The data can be reproduced by any electronic instrument that adheres to the GM standard. Many websites offer downloads of popular songs as well as classical music, and there are also websites where MIDI composers can share their works.

File formats

Standard MIDI File (SMF) Format

MIDI messages (along with timing information) can be collected and stored in a computer file system, in what is commonly called a MIDI file, or more formally, a Standard MIDI File (SMF). The SMF specification was developed by, and is maintained by, the MIDI Manufacturers Association (MMA). MIDI files are typically created using computer-based sequencing software (or sometimes a hardware-based MIDI instrument or workstation) that organizes MIDI messages into one or more parallel "tracks" for independent recording and editing. In most sequencers, each track is assigned to a specific MIDI channel and/or a specific General MIDI instrument patch. Although most current MIDI sequencer software uses proprietary "session file" formats rather than SMF, almost all sequencers provide export or "Save As..." support for the SMF format.

An SMF consists of one header chunk and one or more track chunks. There exist three different SMF formats; the format of a given SMF is specified in its file header. A Format 0 file contains a single track and represents a single song performance. Format 1 may contain any number of tracks, enabling preservation of the sequencer track structure, and also represents a single song performance. Format 2 may have any number of tracks, each representing a separate song performance. Sequencers do not commonly support Format 2.
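The header chunk that declares the SMF format has a fixed layout: the four ASCII bytes "MThd", a 32-bit big-endian length (always 6), and three 16-bit fields giving the format (0, 1, or 2), the number of track chunks, and the time division. The sketch below (the function name is illustrative) parses it:

```python
import struct

# Sketch: parse the fixed-size header chunk of a Standard MIDI File.
def parse_smf_header(data):
    chunk_id, length = struct.unpack(">4sI", data[:8])
    if chunk_id != b"MThd" or length != 6:
        raise ValueError("not a Standard MIDI File")
    # format (0, 1, or 2), number of track chunks, time division
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntracks, division

# A minimal Format 0 header: one track, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 480)
print(parse_smf_header(header))  # (0, 1, 480)
```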

Large collections of SMFs can be found on the web, most commonly with the extension .mid. These files are most frequently authored with the assumption that they will be played on General MIDI players.

MIDI Karaoke File (.KAR) Format

MIDI-Karaoke files (which use the ".kar" file extension) are an "unofficial" extension of MIDI files, used to add synchronized lyrics to standard MIDI files. SMF players play the music as they would a .mid file, but do not display the lyrics unless they have specific support for .kar messages; players with such support often display the lyrics synchronized with the music in "follow-the-bouncing-ball" fashion, essentially turning any PC into a karaoke machine.

MIDI-Karaoke file formats are not maintained by any standardization body.

XMF File Formats

The MMA has also defined (and AMEI has approved) a new family of file formats, XMF (eXtensible Music File), some of which package SMF chunks with instrument data in DLS format (Downloadable Sounds, also an MMA/AMEI specification), to much the same effect as the MOD file format. The XMF container is a binary format (not XML-based, although the file extensions are similar). See the main article Extensible Music Format (XMF).

RIFF-RMID File Format

On Microsoft Windows, the system itself uses RIFF-based MIDI files with the .rmi extension. Note that Standard MIDI Files are not themselves RIFF-compliant; a RIFF-RMID file, however, is simply a Standard MIDI File wrapped in a RIFF chunk. Extracting the data part of the RIFF-RMID chunk therefore yields a regular Standard MIDI File.
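
The unwrapping described above can be sketched in a few lines. The layout assumed here is a "RIFF" header with form type "RMID", followed by sub-chunks with little-endian sizes, the "data" sub-chunk holding the SMF bytes verbatim; the function name is illustrative.

```python
import struct

# Sketch: unwrap a RIFF-RMID (.rmi) file into a plain Standard MIDI File.
def rmid_to_smf(rmi_bytes):
    if rmi_bytes[:4] != b"RIFF" or rmi_bytes[8:12] != b"RMID":
        raise ValueError("not a RIFF-RMID file")
    pos = 12
    while pos + 8 <= len(rmi_bytes):
        # Each sub-chunk: 4-byte ID, little-endian 32-bit size, payload.
        chunk_id, size = struct.unpack("<4sI", rmi_bytes[pos:pos + 8])
        if chunk_id == b"data":
            return rmi_bytes[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)   # RIFF chunks are word-aligned
    raise ValueError("no data chunk found")
```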

In recommended practice RP-29, the MMA defined a method for bundling one Standard MIDI File (SMF) image with one Downloadable Sounds (DLS) image; however, this method was superseded by the introduction of the Extensible Music Format (XMF), which should be used for this purpose.

Usage and applications

Extensions of the MIDI standard

Many extensions of the original official MIDI 1.0 spec have been standardized by MMA/AMEI. Only a few of them are described here; for more comprehensive information, see the MMA web site.

General MIDI

The General MIDI (hereafter "GM") standard addresses the indeterminacy of the MIDI standard regarding the meaning of program change and controller messages and other synthesizer features: early synthesizers could, and actually did, sound completely different in response to the same MIDI messages, and required different controller messages for similar purposes. The GM standard mandates an assignment of specific instruments to program change settings (for example, program 1 is "Acoustic Grand Piano"), the mapping of several controller numbers to important effects, the use of channel 10 for percussion (a specific unpitched sound in place of each note), and various minimum specifications. Currently, only very old, very low-end or very specialized synthesizers do not implement the General MIDI standard or one of its successors; General MIDI compatibility is almost universal for music distributed in SMF formats, which rely on this standard for portability. Although dependent on the basic MIDI 1.0 specification, the GM and GM2 specifications are each separate from it. As such, it is not generally safe to assume that any given MIDI message stream or MIDI file is intended to drive GM-compliant or GM2-compliant MIDI instruments. General MIDI 1 was introduced in 1991.
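
A GM instrument is selected with a Program Change channel message. One detail worth sketching is the off-by-one convention: the GM instrument list is numbered 1-128 (1 = Acoustic Grand Piano), while the message carries a 0-127 data byte. The function name below is illustrative.

```python
# Sketch: selecting a General MIDI instrument with a Program Change
# message (status 0xCn, one data byte). The GM list is 1-indexed,
# the wire format 0-indexed.

def program_change(channel, gm_program):
    return bytes([0xC0 | (channel & 0x0F), (gm_program - 1) & 0x7F])

# Select GM instrument 1 (Acoustic Grand Piano) on channel 0:
msg = program_change(0, 1)   # -> 0xC0 0x00
```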

GS and XG

To improve on the General MIDI standard and match the capabilities of newer synthesizers, both Roland, with its GS specification, and Yamaha, with its XG specification, introduced stricter requirements while maintaining compatibility with GM commands. Adoption of these two standards has been generally limited to each manufacturer's own products.

General MIDI Level 2

Later, companies in Japan's Association of Musical Electronics Industry (AMEI) developed General MIDI Level 2 (GM2), incorporating aspects of the Yamaha XG and Roland GS formats, further extending the instrument palette, specifying more message responses in detail, and defining new messages for custom tuning scales and more. The GM2 specifications are maintained and published by the MMA and AMEI. General MIDI 2 was introduced in 1999 and is commonly implemented in newer synthesizers.

SP-MIDI

Later still, GM2 became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.

GM, GM2, and SP-MIDI are also the basis for selecting player-provided instruments in several of the MMA/AMEI XMF file formats (XMF Type 0, Type 1, and Mobile XMF), which allow extending the instrument palette with custom instruments in the Downloadable Sound (DLS) formats, addressing another major GM shortcoming.

Alternative Tunings

By convention, instruments that receive MIDI use the conventional 12-pitch-per-octave equal temperament tuning system. This makes many types of music inaccessible, because the music depends on a different intonation system. To address the issue in a standardized manner, in 1992 the MMA ratified the MIDI Tuning Standard (MTS). MTS allows MIDI instruments that support it to be tuned in any way desired, through the use of a MIDI Non-Real Time System Exclusive message.

MTS uses three bytes, which can be thought of as a three-digit number in base 128, to specify a pitch in logarithmic form. The pitch value p encoded for a frequency f in hertz is given by:

p = 69 + 12 × log₂(f / 440)

For a note in A440 equal temperament, this formula delivers the standard MIDI note number; all other frequencies fall proportionally in between. While support for MTS is not particularly widespread in commercial hardware instruments, it is nonetheless supported by some instruments and software, for example the free software programs TiMidity and Scala, as well as other microtuners.
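
The formula above can be turned into the three MTS data bytes. The sketch below assumes the usual MTS layout: the first byte is the whole semitone (the standard MIDI note number), and the remaining two bytes carry a 14-bit fraction of a semitone as two base-128 digits; the function name is illustrative.

```python
import math

# Sketch: encode a frequency in Hz as three MTS data bytes, assuming
# byte 1 = whole semitone, bytes 2-3 = 14-bit fraction of a semitone.

def frequency_to_mts_bytes(freq_hz):
    p = 69 + 12 * math.log2(freq_hz / 440.0)
    semitone = int(p)                         # standard MIDI note number
    frac = round((p - semitone) * (1 << 14))  # fraction in 1/2^14 semitone
    return semitone, (frac >> 7) & 0x7F, frac & 0x7F

print(frequency_to_mts_bytes(440.0))  # (69, 0, 0): A440 is note 69 exactly
```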

Alternate Hardware Transports

In addition to the original 31.25 kbaud current-loop transport on 5-pin DIN connectors, transmission of MIDI streams over USB, IEEE 1394 (a.k.a. FireWire), and Ethernet is now common (see below).

Over Ethernet

Compared to USB or FireWire, the Ethernet implementation of MIDI provides network routing capabilities, which are extremely useful in studio or stage environments (USB and FireWire are restricted to connections between one computer and a set of devices, and do not provide any routing capabilities).

Ethernet is moreover capable of providing the high-bandwidth channel that earlier alternatives to MIDI (such as ZIPI) were intended to bring.

After initial competition among different protocols (IEEE-P1639, MIDI-LAN, IETF RTP-MIDI), the IETF's RTP-MIDI specification for transport of MIDI streams over Ethernet and the Internet is now spreading quickly, as more and more manufacturers (Apple, CME, Kiss-Box, etc.) integrate RTP-MIDI into their products. Mac OS X, Windows and Linux drivers are also available to make RTP-MIDI devices appear as standard MIDI devices within these operating systems.

IEEE-P1639 is now a dead project. The other proprietary MIDI-over-IP protocols are slowly disappearing one after the other, since most of them require expensive licensing to implement (while RTP-MIDI is completely open) or bring no real advantage (apart from speed) over the original MIDI protocol.

RTP-MIDI Transport Protocol

The RTP-MIDI protocol was officially released by the IETF in December 2006 (IETF RFC 4695).[1] RTP-MIDI relies on the well-known RTP (Real-time Transport Protocol) layer (most often running over UDP, but also compatible with TCP), widely used for real-time audio and video streaming over networks. The RTP layer is easy to implement and requires very little power from the microprocessor, while providing very useful information to the receiver (network latency, dropped-packet detection, reordered packets, etc.). RTP-MIDI defines a specific payload type that allows the receiver to identify MIDI streams.

RTP-MIDI does not alter the MIDI messages in any way (all messages defined in the MIDI specification are transported transparently over the network), but it adds features such as timestamping and sysex fragmentation. RTP-MIDI also adds a powerful "journalling" mechanism that allows the receiver to detect and correct dropped MIDI messages. The first part of the RTP-MIDI specification is mandatory for implementors and describes how MIDI messages are encapsulated within the RTP telegram; it also describes how the journalling system works. Journalling itself is not mandatory (it is not very useful for LAN applications, but it is very important for WAN applications).

The second part of RTP-MIDI specification describes the session control mechanisms that allow multiple stations to synchronize across the network to exchange RTP-MIDI telegrams. This part is informational only, and it is not required.

RTP-MIDI is included in Apple's Mac OS X as standard MIDI ports: RTP-MIDI ports appear in Macintosh applications like any other USB or FireWire port, so any MIDI application running on Mac OS X can use the RTP-MIDI capabilities transparently. However, Apple's developers considered the session control protocol described in the IETF specification too complex, and created their own session control protocol. Since the session protocol uses a different UDP port from the main RTP-MIDI stream port, the two protocols do not interfere (so the RTP-MIDI implementation in Mac OS X fully complies with the IETF specification).

Apple's implementation has been used as a reference by other MIDI manufacturers. The Dutch company Kiss-Box has released a Windows XP RTP-MIDI driver[2] (for its own products only), and a Linux implementation is under development by the Grame association.[3] It therefore seems probable that Apple's implementation will become the de facto standard (and could even become the MMA reference implementation).

Other applications

MIDI is also used every day as a control protocol in applications other than music, such as show control, stage lighting, and sound effects equipment.

Such non-musical applications of MIDI are possible because any device built with a standard MIDI Out connector should in theory be able to control any other device with a MIDI In port, as long as the developers of both devices share the same understanding of the meaning of all the MIDI messages the sending device emits. This agreement can come about either because both follow the published MIDI specifications or, in the case of non-standard functionality, because the message meanings are agreed upon by the two manufacturers.

Beyond MIDI 1.0

Although traditional MIDI connections work well for most purposes, a number of newer message protocols and hardware transports have been proposed over the years to try to take the idea to the next level. Some of the more notable efforts include:

OSC

The Open Sound Control (OSC) protocol was developed at CNMAT. OSC has been implemented in the well-known software synthesizer Reaktor and in other projects including SuperCollider, Pure Data, Isadora, Max/MSP, Csound, vvvv and ChucK. The Lemur Input Device, a customizable touch panel with MIDI controller-type functions, also uses OSC. OSC differs from MIDI over traditional 5-pin DIN in that it can run at broadband speeds when sent over Ethernet connections. However, few mainstream musical applications and no standalone instruments support the protocol so far, making whole-studio interoperability problematic. OSC is not owned by any private company, but neither is it maintained by any standards organization. Since September 2007, there has been a proposal for a standardized namespace within OSC for communication between controllers, synthesizers and hosts.

mLAN

Yamaha has its mLAN protocol, which is based on the IEEE 1394 transport (also known as FireWire) and carries multiple MIDI message channels and multiple audio channels. mLAN is not maintained by a standards organization, as it is a proprietary protocol; it is open for licensing, although covered by patents owned by Yamaha.

HD-MIDI

Development of a major modernization of MIDI is now under discussion in the MMA. Tentatively called "High-Definition MIDI" (HD-MIDI), this new standard would support modern high-speed transports, provide greater range and/or resolution in data values, increase the number of MIDI Channels, and support the future introduction of entirely new kinds of MIDI messages. Representatives from all sizes and types of companies are involved, from the smallest speciality show control operations to the largest musical equipment manufacturers. No technical details or projected completion dates have been announced.[4][5]

See also: New Interfaces for Musical Expression

MIDI software

Further information: List of MIDI editors and sequencers
Further information: MIDI Show Control#MIDI Show Control software

References

  1. ^ IETF RTP-MIDI specification
  2. ^ Windows XP RTP-MIDI driver download
  3. ^ Grame's website
  4. ^ MMA Hosts HD-MIDI Discussion at NAMM, MIDI Manufacturers Association.
  5. ^ Finally: MIDI 2.0, O'Reilly Digital Media Blog.
