Note names along with their accompanying MIDI note numbers
MIDI ( /ˈmɪdi/; short for Musical Instrument Digital Interface) is an electronic musical instrument industry specification that enables a wide variety of digital musical instruments, computers and other related devices to connect and communicate with one another.[1]
The primary functions of MIDI are to communicate event messages about musical notation, pitch, and velocity; control signals for parameters such as volume, vibrato, audio panning, and cues; and clock signals (to set and synchronize tempo) between multiple devices. Together these complete a signal chain and produce audible sound from a sound source. For users, MIDI enables a single player to sound as though they are playing two or more instruments simultaneously. As an electronic protocol, it is notable for its widespread adoption throughout the music industry.
The MIDI standard was established in the early 1980s,[2] yielding a number of significant benefits to musicians, recording artists, and hobbyists.
- Simplified connectivity – Reduced the number and complexity of connection cables required between devices.
- Fewer contributors required – Beginning in the 1980s, musical acts could perform live with as few as one or two members operating multiple MIDI-enabled devices simultaneously, delivering a performance that sounds similar to that of a much larger group of live musicians.[3]
- Increased accessibility – Enabled users to create, edit, layer and build high-quality digital music recordings at less expense; professional musicians can now do this in a home recording space (or any other environment) without renting a professional recording studio and staff.[4] It has also enabled hobbyists with little or no musical training to produce high-quality recordings using the powerful capabilities of MIDI music editing software.[5][6]
- Portability of electronic music gear – Greatly reduced the amount and variety of equipment (and wiring) that performing musicians needed to haul, pack and unpack, set up and connect in order to produce a variety of sounds.[7]
The initial MIDI specification, MIDI 1.0, was published in August 1983.[8] It provides both a software and a hardware standard for encoding, storing, synchronizing, and transmitting musical performance and control data, enabling intercommunication between a broad array of electronic equipment.
All MIDI-compatible controllers, musical instruments and MIDI-compatible software follow the same MIDI 1.0 specification, and thus interpret MIDI messages the same way. MIDI 1.0 and General MIDI (GM) eliminated compatibility issues by the use of a standard set of commands and parameters.
- A hardware standard for physically connecting electronic musical instruments and related equipment together. (MIDI Interface, MIDI Adapters, MIDI Cables).
- A software standard that includes a data encoding scheme for the storage and transmission of digital messages, sent at high speed, that describe the musical performance as well as control events of various types. Such message types include musical notation, pitch, velocity (how hard a key is struck on a keyboard, for example), control signals for parameters (such as volume, vibrato, panning, and cues), and clock signals. (MIDI messages, MIDI file)
- Communication protocols for transmitting and synchronizing musical performance and control event data. (MIDI Machine Control, MIDI Show Control, MIDI timecode)
- Instrument categorization standards to specify the sound produced by a MIDI event; terminology such as timbres, patches or programs dictates how a musical passage should sound to the human ear. For example, one passage might be programmed to trigger the sounds of a piano, while another might be built to accompany it with the sound of a violin. Drum kits and other percussion sound "families" often have their own patches, enabling percussive phrases to effectively mimic the sound of a human drummer.
MIDI allows an input device (also called a trigger device), which a user plays or touches, to sound notes from sound-source devices such as a programmable bank of instrument or other sounds. Because MIDI data is stored as a series of instructions, it is extremely compact compared with digital audio files, in which sonic vibration data for each tiny fraction of a second must be captured and stored. Thus, the size of MIDI files in computer storage, even for long, complex compositions, is almost always far smaller than that of the corresponding digital audio files.
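The difference is easy to quantify with a back-of-the-envelope calculation. The Python sketch below compares one minute of note events with one minute of uncompressed CD-quality audio; the figure of 500 notes per minute is an illustrative assumption, not part of any specification.

```python
# Back-of-the-envelope comparison: one minute of music stored as MIDI events
# versus one minute of uncompressed CD-quality audio.
# The note count below is an illustrative assumption, not part of any standard.

NOTES_PER_MINUTE = 500        # assumed: a busy, multi-part arrangement
BYTES_PER_NOTE = 2 * 3        # Note-On + Note-Off, 3 bytes each (no running status)

midi_bytes = NOTES_PER_MINUTE * BYTES_PER_NOTE   # 3,000 bytes
audio_bytes = 60 * 44100 * 2 * 2                 # 60 s x 44.1 kHz x 16-bit x stereo

print(f"MIDI:  ~{midi_bytes / 1024:.1f} kB")         # ~2.9 kB
print(f"Audio: ~{audio_bytes / 1024 ** 2:.1f} MB")   # ~10.1 MB
```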
By the end of the 1970s, electronic musical devices were becoming increasingly common and affordable in North America, Europe and Japan. Proprietary digital interfaces, such as Roland Corporation's DCB (digital control bus), the Oberheim system, and Yamaha's "keycode" system, delivered interconnectivity between devices; however, these designs only allowed connectivity between devices produced by the same manufacturer.
Audio engineer and synthesizer designer Dave Smith, then of Sequential Circuits, Inc., came up with the idea for a digital music standard that eventually became MIDI while creating a new kind of keyboard synthesizer, an evolution of the analog synthesizer that later became known as the Prophet 5. Until that point, analog synthesizers could play only one note (or voice) at a time, and the keyboard, like each knob, switch and other control on the instrument, acted directly on the sound-producing circuitry.
Smith's innovation was to create a version of the analog synthesizer that was capable of playing many notes at the same time. Such capability, known as polyphony, was not new; indeed, ordinary pianos, organs, guitars and other instruments had such capability for centuries. But the way that Smith chose to execute his design was new. His idea was to make the system digitally controllable. It would be an instrument with multiple, identical, sound-producing engines ("voices") — one for each note — but all the parameters of each voice were digitally controllable. Now, when the player turned a knob on the front panel, instead of the knob directly controlling a single element of the signal path, its setting would be digitized, and the same parameter could be simultaneously affected on all of the voices. Similarly, instead of having the keyboard control a single note, a microprocessor was used that would rapidly and continuously scan all of the keys to detect which ones were currently pressed, and convert that information to a pitch control that would be assigned to each voice. In this manner, the musician playing the instrument would experience it as if the keyboard and all of the knobs and other controls were directly controlling a multi-voice instrument.
This innovation meant two important things: since all of the controls were digitized, their settings could be remembered, and the synthesizer could be provided a memory wherein "patches" could be stored and instantly recalled. More significantly, it abstracted the keyboard, knobs, pedals, and other controls away from the sound-producing circuitry and made it necessary to develop a protocol for communication between the former and the latter.
Smith had the insight that the data connection could be made accessible with input and output jacks on the instrument, and, if the protocol were standardized between manufacturers, would provide a means for myriad instruments and other devices to interoperate, controlling and being controlled by each other at the digital level. Thus, MIDI was born.
Following several months of discussion between US and Japanese manufacturers, Smith proposed a digital standard for musical instruments at the Audio Engineering Society show in New York in November 1981. By the time of the January 1983 Winter NAMM Show, Smith was able to demonstrate a MIDI connection between his Prophet 600 (a later version of the groundbreaking Prophet 5 analog synthesizer) and a Roland JP-6. The MIDI Specification 1.0 was published in August 1983.[9][10]
MIDI brought an unprecedented state of compatibility which revolutionized the market by ridding musicians of the need for excessive hardware.[11] In the early 1980s, MIDI was a major factor in bringing an end to the "wall of synthesizers" phenomenon in progressive rock band concerts, when keyboard performers were often hidden behind huge banks of analog synthesizers and electric pianos. Following the advent of MIDI, many synthesizers were released in rack-mount versions, which meant that keyboardists could control many different instruments (e.g., synthesizers) from a single keyboard.
In the 1980s, MIDI facilitated the development of hardware and computer-based sequencers, which can be used to record, edit and play back performances. In the years immediately after the 1983 ratification of the MIDI specification, MIDI features were adapted to several early computer platforms including Apple II Plus and IIe, Apple Macintosh, Commodore 64, Commodore Amiga and PC-DOS. This allowed the development of a market for powerful, inexpensive, and now-widespread computer-based MIDI sequencers. The standard Atari ST came equipped with MIDI ports and was commonly used in recording studios for this reason. Synchronization of MIDI sequences was made possible by the use of MIDI timecode, an implementation of the SMPTE time code standard using MIDI messages, and MIDI timecode has become the standard for digital music synchronization.
In 1991, the MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) was ratified by the MIDI Manufacturers Association, allowing all types of media control devices to communicate with each other and with computers to perform show control functions in live and canned entertainment applications (see below).
Small file sizes made MIDI files a popular way of sharing music on the Internet in the early to mid 1990s, before broadband connections made it practical to share files in the MP3 format. Many Gopher and, later, web sites hosted directories of MIDI files created by fans, thus avoiding the copyright issues that would later plague other forms of online music sharing.
MIDI initially made no provision for specifying timbre. In other words, each MIDI synthesizer had its own methods for producing the sound from MIDI instructions, with no standard sounds at all. For example, a producer might want a MIDI file played back through the Microsoft MIDI Synthesizer (included in any Windows operating system) to sound the same or similar on all machines. But because the quality of synthesis hardware might vary widely between machines—one might use a generic sound card, another might use professional-quality synthesis—there was no way to assure that what the listener heard was anything like what the producer intended.
This situation was the impetus for the introduction of General MIDI in 1991, which defined a standard set of 128 familiar sound types (piano, organ, guitar, strings, and so on). While the standard still could not dictate exactly what a 'piano' sounded like, manufacturers at least had a target to aim for and a fixed program location in which to place it.
In the early decades of MIDI, computer hardware was not able to play many samples or synthesize quality sounds. Quality hardware was too expensive; sound cards kept the price down, but many relied on unsophisticated synthesis methods to produce audio. As a result, the "MIDI sound" acquired a poor reputation with some critics.
MIDI technology was standardized and is maintained by the MIDI Manufacturers Association (MMA). All official MIDI standards are jointly developed and published by the MMA in Los Angeles, California, USA, and for Japan, the MIDI Committee of the Association of Musical Electronics Industry (AMEI) in Tokyo.
The primary reference for MIDI is The Complete MIDI 1.0 Detailed Specification, document version 96.1, available only from the MMA in English or from the AMEI in Japanese. Though the MMA site formerly offered free downloads of all MIDI specifications, links to the basic and general detailed specifications have been removed; printed documents can be purchased. However, considerable ancillary material is available at no cost on the website.
- Electronic keyboards – Synthesizers and samplers which feature a built-in keyboard, MIDI keyboards (also referred to as MIDI controllers), or hardware music workstations
- Personal computers – Equipped with an internal MIDI-capable sound card
- MIDI interfaces – Used to connect MIDI devices to devices without built-in MIDI capability. A common usage scenario is enabling MIDI support on a computer via an external sound card, which connects via USB or FireWire. Other applications include interconnectivity with analog (non-digital) audio outputs, microphone inputs, and optical audio cables.
- Audio control surfaces – Often resembling mixing consoles in appearance, these enable a level of hands-on control for changing parameters such as sound levels and effects applied to individual tracks of a multitrack recording or live performance output.
- Digital effects units – Apply audio effects such as reverb, delay, and chorus to simulate the sound of the music being played in a large hall, or in a canyon, or with multiple voices all playing at once, respectively.
- Digital percussion devices – Trigger percussive or other relatively short sounds, usually via a pattern or order in which the sounds should be played, on a drum machine or rhythm machine (also referred to simply as "beat boxes").
- Other musical instruments – Non-traditional and DIY devices custom-built to accept MIDI input, or devices adapted with additional hardware to provide MIDI-compatible signals. The MIDI guitar and MIDI violin are two such instruments.
MIDI connectors and a MIDI cable
The original physical MIDI connection uses 5-pin/180° DIN connectors. Opto-isolated connections are used to prevent ground loops from occurring among connected MIDI devices.
MIDI transceivers physically and logically separate the input and output lines, meaning that MIDI messages received by a device in the network but not intended for that device must be re-transmitted on the output line (MIDI-OUT) by means of a "soft thru". This can introduce a delay long enough to become musically significant on larger MIDI chains.
MIDI-THRU ports started to be added to MIDI-compatible equipment soon after the introduction of MIDI, in order to improve performance. The MIDI-THRU port avoids the aforementioned retransmission delay by linking the MIDI-THRU port to the MIDI-IN socket almost directly. The difference between the MIDI-OUT and MIDI-THRU ports is that data coming from the MIDI-OUT port has been generated on the device containing that port. Data that comes out of a device's MIDI-THRU port, however, is an exact duplicate of the data received at the MIDI-IN port.
Such chaining together of instruments via MIDI-THRU ports is unnecessary with the use of MIDI "patch bay," "mult" or "Thru" modules consisting of a MIDI-IN connector and multiple MIDI-OUT connectors to which multiple instruments are connected. MIDI Thru Boxes also clean up any skewing of MIDI data bits that might occur at the input stage.
Some equipment can merge MIDI messages into one stream; this is a specialized function and is not universal to all equipment. Such MIDI merge boxes digitally merge all MIDI messages appearing at their inputs into a single output stream, which allows a musician to connect several MIDI controllers (e.g., two musical keyboards and a pedal keyboard) to a single synthesizer voice device such as an E-mu Proteus.
All MIDI-compatible instruments have a built-in MIDI interface. Some computers' sound cards have a built-in MIDI interface, whereas others require an external MIDI interface connected to the computer via the D-subminiature DA-15 game port, a USB connector, FireWire, Ethernet or MADI (an RME standard). MIDI connectors are defined by the MIDI standard. In the 2000s, as computer equipment increasingly used USB connectors, companies began making MIDI-to-USB data interfaces that can carry MIDI data to USB-equipped computers. As well, due to the increasing use of computers for music-making and composition, some MIDI keyboard controllers were equipped with USB jacks, so that they can be plugged into computers running "software synths" or other music software.
In popular parlance, piano-style musical keyboards are called "keyboards", regardless of their functions or type. Amongst MIDI enthusiasts, however, keyboards and other devices used to trigger musical sounds are called "controllers", because with most MIDI set-ups, the keyboard or other device does not make any sounds by itself. MIDI controllers need to be connected to a voice bank or sound module in order to produce musical tones or sounds; the keyboard or other device is "controlling" the voice bank or sound module by acting as a trigger. The most common MIDI controller is the piano-style keyboard, either with weighted or semi-weighted keys, or with unweighted synth-style keys. Keyboard-style MIDI controllers are sold with as few as 25 keys (2 octaves), with larger models such as 49 keys, 61 keys, or even the full 88 keys being available. Different models have different feature sets, the simplest being only keys, while the more extravagant have sliders, knobs, and wheels to provide more controlling options.[12] These include a variety of parameters that can be programmed within the controller, or sent to a computer to control software.
MIDI controllers are also available in a range of other forms, such as electronic drum triggers; pedal keyboards that are played with the feet (e.g., with an organ); wind controllers for performing saxophone-style music; and MIDI guitar synthesizer controllers. A wind controller is designed for performers who want to play saxophone, clarinet, oboe, bassoon, and other wind instrument sounds with a synthesizer module. When wind instruments are played using a MIDI keyboard, it is hard to reproduce the expressive control found on wind instruments, which is generated with wind pressure and embouchure. A typical wind controller has an air-pressure sensor and (usually) a bite sensor in the mouthpiece, and touch sensors or keys (commonly approximating a saxophone key arrangement) arrayed along the body. Additionally, controls such as buttons, touch sensors and pitch wheels for generating additional MIDI messages or changing the way the controller behaves (for example, note sustain or octave shifts) are typically located in positions where they can be accessed more or less easily while playing. A less common type of wind controller mimics the mechanics of valved brass instruments.
Pad controllers are used by musicians and DJs who make music through use of sampled sounds or short samples of music. Pad controllers often have banks of assignable pads and assignable faders and knobs for transmitting MIDI data or changes; the better-quality models are velocity-sensitive. More rarely, some performers use more specialized MIDI controllers, such as triggers that are affixed to their clothing or stage items (e.g., magicians Penn and Teller's stage show).[13]
A MIDI foot-controller is a pedalboard-style device with rows of switches that control banks of presets, MIDI program change commands and send MIDI note numbers (some also do MIDI merges). Another specialized type of controller is the drawbar controller; it is designed for Hammond organ players who have MIDI-equipped organ voice modules. The drawbar controller provides the keyboard player with many of the controls which are found on a vintage 1940s or 1950s Hammond organ, including harmonic drawbars, a rotating speaker speed control switch, vibrato and chorus knobs, and percussion and overdrive controls. As with all controllers, the drawbar controller does not produce any sounds by itself; it only controls a voice module or software sound device.
While most controllers do not produce sounds, there are some exceptions. Some controller keyboards called "performance controllers" have MIDI-assignable keys, sliders, and knobs, which allow the controller to be used with a range of software synthesizers or voice modules; yet at the same time, the controller also has an internal voice module which supplies keyboard instrument sounds (piano, electric piano, clavichord), sampled or synthesized voices (strings, woodwinds), and Digital Signal Processing (distortion, compression, flanging, etc.). These controller keyboards are designed to allow the performer to choose between the internal voices or external modules.
All MIDI compatible controllers, musical instruments, and MIDI-compatible software follow the same MIDI 1.0 specification, and thus interpret any given MIDI message the same way, and so can communicate with and understand each other. For example, if a note is played on a MIDI controller, it will sound at the right pitch on any MIDI instrument whose MIDI In connector is connected to the controller's MIDI Out connector.
When a musical performance is played on a MIDI instrument (or controller) it transmits MIDI channel messages from its MIDI Out connector. A typical MIDI channel message sequence corresponding to a key being struck and released on a keyboard is:
- The user presses the middle C key with a specific velocity (which is usually translated into the volume of the note, but can also be used by the synthesizer to set characteristics of the timbre). The instrument sends one Note-On message.
- The user changes the pressure applied on the key while holding it down, a technique called aftertouch (optional; may be repeated). The instrument sends one or more Aftertouch messages.
- The user releases the middle C key, again with the possibility of the release velocity controlling some parameters. The instrument sends one Note-Off message.
Note-On, Aftertouch, and Note-Off are all channel messages: embedded in each message is one of 16 channel IDs. This enables instruments to be set to respond to messages on specific channels while ignoring all others. (System messages, in contrast, are designed to be responded to by all connected devices.) For the Note-On and Note-Off messages, the MIDI specification defines a number (from 0–127) for every possible note pitch (C, C♯, D, etc.), and this number is included in the message along with the velocity value.
Other performance parameters can be transmitted with channel messages, too. For example, if the user turns the pitch wheel on the instrument, that gesture is transmitted over MIDI using a series of Pitch Bend messages (also channel messages). The musical instrument generates the messages autonomously; all the musician has to do is play the notes (or make some other gesture that produces MIDI messages). This consistent, automated abstraction of the musical gesture could be considered the core of the MIDI standard.
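For illustration, the following Python sketch constructs the raw bytes of the channel messages described above, per the MIDI 1.0 specification; the helper function names are ours, not part of any standard library.

```python
# Sketch of the raw bytes behind the channel messages described above (MIDI 1.0).
# A status byte carries the message type in its high nibble and the channel
# number (0-15) in its low nibble; data bytes are 7-bit (0-127).

def note_on(channel: int, note: int, velocity: int) -> bytes:
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int, velocity: int = 64) -> bytes:
    return bytes([0x80 | channel, note, velocity])

def pitch_bend(channel: int, value: int) -> bytes:
    # value is 14-bit (0-16383); 8192 means "no bend". Sent LSB first.
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

# Middle C (note number 60) struck fairly hard on channel 1 (index 0):
print(note_on(0, 60, 100).hex())    # 903c64
print(note_off(0, 60).hex())        # 803c40
print(pitch_bend(0, 8192).hex())    # e00040
```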
MIDI composition and arrangement typically takes place using either MIDI sequencing/editing software on PC-type computers, or using specialized hardware music workstations. Some composers may take advantage of MIDI 1.0 and General MIDI (GM) technology to allow musical data files to be shared among various electronic instruments by using a standard, portable set of commands and parameters. On the other hand, composers of complex, detailed works to be distributed as produced audio typically use MIDI to control the performance of high-quality digital audio samples and/or external hardware or software synthesizers.
Digital audio workstations (DAWs) have become among the most central and common tools in the studio, and many are specifically designed to work with MIDI as an integral component. Through the use of MIDI mapping, various MIDI controllers can be used to command the program. MIDI piano rolls have been developed in many DAWs so that the recorded MIDI messages can be extensively modified.[14] Virtual instruments created by third-party companies in one of a number of commonly used formats (for example, VST or RTAS) may be loaded as plug-ins, providing a virtually limitless supply of sounds for a musician, and are designed to be commanded by MIDI controllers, especially in the DAW environment.
MIDI data files are much smaller than recorded audio waveforms. Many computer sequencing programs allow manipulation of the musical data such that composing for an entire orchestra of sounds is possible. This ability to manipulate musical data has also introduced the concept of surrogate orchestras, in which an orchestral arrangement is made up partly of sequenced MIDI recordings and partly of live musicians; some scholars believe surrogate orchestras may affect future live performance, with live musicians in orchestral arrangements possibly being displaced entirely if composing music via MIDI recordings proves more efficient and less expensive.[15] Further, data composed via sequenced MIDI recordings can be saved as a Standard MIDI File (SMF), digitally distributed, and reproduced by any computer or electronic instrument that adheres to the same MIDI, GM, and SMF standards.
Although not a wave audio file format, the Standard MIDI File was, due to its much smaller file size, attractive to computer users as a substitute before broadband Internet became widespread. The later advent of high-quality audio compression such as the MP3 format has decreased the size advantage of MIDI-encoded music to some degree, though an MP3 file is still much larger than an SMF.
MIDI messages (along with timing information) can be collected and stored in a computer file, commonly called a MIDI file. A number of music file formats have been based on the MIDI bytestream. These formats are very compact; a file as small as 10 kB can produce a full minute of music or more, because the file stores instructions on how to recreate the sound on a MIDI synthesizer rather than an exact waveform to be reproduced. Such a synthesizer can be built into an operating system, a sound card, or an embedded device (e.g., a hardware-based synthesizer), or implemented in software. The file format stores information on what note to play and when, along with other information such as pitch bend during the envelope of a note or the note's velocity. Small MIDI file sizes have also been advantageous for applications such as mobile phone ringtones and some video games.
- Standard MIDI (.mid or .smf)
- The Standard MIDI File (SMF) specification was developed by, and is maintained by, the MIDI Manufacturers Association (MMA). MIDI files are typically created using computer-based sequencing software (or sometimes a hardware-based MIDI instrument or workstation) that organizes MIDI messages into one or more parallel "tracks" for independent recording and editing. In most sequencers, each track is assigned to a specific MIDI channel and/or a specific instrument patch; if the attached music synthesizer has a known instrument palette (for example, because it conforms to the General MIDI standard), then the instrument for each track may be selected by name. Although most current MIDI sequencer software uses proprietary "session file" formats rather than SMF, almost all sequencers provide export or "Save As..." support for the SMF format. An SMF consists of one header chunk and one or more track chunks (a minimal header-reading sketch appears after this list). There are three different SMF formats; the format of a given SMF is specified in its file header. A Format 0 file contains a single track and represents a single song performance. Format 1 may contain any number of tracks, enabling preservation of the sequencer track structure, and also represents a single song performance. Format 2 may have any number of tracks, each representing a separate song performance; sequencers do not commonly support Format 2. Large collections of SMFs can be found on the web, most commonly with the extension .mid but occasionally with .smf. These files are most frequently authored with the (rather dubious) assumption that they will only ever be played on General MIDI players.
- MIDI Karaoke (.kar)
- MIDI-Karaoke files (which use the ".kar" file extension) are an "unofficial" extension of MIDI files, used to add synchronized lyrics to standard MIDI files. SMF players play the music as they would a .mid file but do not display the lyrics unless they have specific support for .kar messages. Players with such support often display the lyrics synchronized with the music in "follow-the-bouncing-ball" fashion, progressively highlighting the lyric text and essentially turning any PC into a karaoke machine. The MIDI-Karaoke file format is not maintained by any standardization body, but it follows General MIDI standards.
- XMF
- The MMA has also defined (and AMEI has approved) a new family of file formats, XMF (Extensible Music File), some of which package SMF chunks with instrument data in DLS format (Downloadable Sounds, also an MMA/AMEI specification), to much the same effect as the MOD file format. The XMF container is a binary format (not XML-based, although the file extensions are similar).
- RIFF-RMID
- On Microsoft Windows, the system itself uses proprietary RIFF-based MIDI files with the .rmi extension. Note that Standard MIDI Files are not RIFF-compliant; a RIFF-RMID file, however, is simply a Standard MIDI File wrapped in a RIFF (Resource Interchange File Format) chunk. For compatibility reasons many digital musicians avoid this format. One solution to the incompatibility is to extract the data part of the RIFF-RMID chunk; the result is a regular Standard MIDI File. RIFF-RMID is not an official MMA/AMEI MIDI standard.
- Extended RMID
- In recommended practice RP-29, the MMA defined a method for bundling one Standard MIDI File (SMF) image with one Downloadable Sounds (DLS) image, using the RIFF container technology. However, this method was deprecated when the MMA introduced the Extensible Music Format (XMF), which, because of its many additional features, is generally preferred for MIDI-related resource bundling.
- Extended MIDI (.xmi)
- The XMI format is a proprietary extension of the SMF format introduced by the Miles Sound System, a middleware driver library targeted at PC games. XMI is not an official MMA/AMEI MIDI standard.
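As a minimal sketch of the SMF structure described above, the following Python function reads the header chunk of a MIDI file; the file name in the usage comment is hypothetical.

```python
# Minimal sketch: reading the header chunk of a Standard MIDI File.
# An SMF starts with the chunk ID "MThd", a 32-bit big-endian length (always 6),
# then three 16-bit fields: format (0, 1 or 2), track count, and time division.

import struct

def read_smf_header(path: str) -> tuple[int, int, int]:
    with open(path, "rb") as f:
        chunk_id, length = struct.unpack(">4sI", f.read(8))
        if chunk_id != b"MThd" or length != 6:
            raise ValueError("not a Standard MIDI File")
        return struct.unpack(">HHH", f.read(6))  # (format, ntracks, division)

# fmt, ntracks, division = read_smf_header("song.mid")   # "song.mid" is hypothetical
```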
Many extensions of the original MIDI 1.0 specification have been standardized by the MMA and the Japan MIDI Standards Committee (JMSC).
The General MIDI Level 1 ("GM") specification, introduced in 1991, defines the feature set important for MIDI content interoperability across multiple players. It addresses the indeterminacy of the basic MIDI 1.0 protocol standard regarding the meaning and behavior of Program Change and Control Change messages; without GM, different synthesizers can, and actually do, sound completely different in response to the same MIDI messages.
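For example, under GM a Program Change message selects a known instrument. A minimal Python sketch using GM's published program numbering (the helper name is ours):

```python
# Sketch: selecting a General MIDI instrument with a Program Change message.
# GM's tables number instruments 1-128; the wire format carries 0-127.
# The helper name is ours, not part of any standard library.

def program_change(channel: int, gm_program: int) -> bytes:
    return bytes([0xC0 | channel, gm_program - 1])

violin = program_change(0, 41)   # GM program 41 = Violin, on channel 1
piano = program_change(1, 1)     # GM program 1 = Acoustic Grand Piano, on channel 2
```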
In order to improve upon the General MIDI standard and take advantage of advancements in newer synthesizers, Roland and Yamaha introduced new, proprietary, extended MIDI specifications, dubbed "GS" (1991) and "XG" (1994) respectively, along with numerous products based on them, designed with stricter requirements, new features, and backward compatibility with the GM specification. GS and XG are not mutually compatible, nor are they official MMA/AMEI MIDI standards. Adoption of each has been limited in general to its respective manufacturer; however, most popular MIDI/music software offerings now include them as built-in selectable options.
Later, after the success of General MIDI was firmly established, member companies of Japan's AMEI developed the General MIDI Level 2 (GM2) specification. Later still, GM2 became the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for mobile applications where different players may have different numbers of musical voices. SP-MIDI is a component of the 3GPP mobile phone terminal multimedia architecture, starting from release 5.
By convention, most MIDI synthesizers generally default to the conventional Western 12-pitch-per-octave, equal temperament tuning system. This tuning system makes many types of music inaccessible, because they depend on different intonation systems. To address this issue in a standardized manner, in 1992 the MMA ratified the MIDI Tuning Standard, or MTS. Instruments that support the MTS standard can be tuned to any desired tuning system by sending the MTS System Exclusive message (a Non-Real Time Sys Ex). The MTS SysEx message uses a three-byte number format to specify a pitch in logarithmic form. This pitch number can be thought of as a three-digit number in base 128. To find the value of the pitch number p that encodes a given frequency f, use the following formula:
p = 69 + 12 × log2(f / 440 Hz)
For a note in A440 equal temperament, this formula delivers the standard MIDI note number as used in the Note On and Note Off messages. Any other frequencies fill the space evenly.
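A small Python sketch of this conversion, extended to the three-byte MTS encoding described above (the MSB-first ordering of the 14-bit semitone fraction is our reading of the MTS format and should be checked against the specification):

```python
# Sketch of the MTS frequency-to-pitch conversion given above. The three data
# bytes carry the nearest semitone (a MIDI note number) plus a 14-bit fraction
# of a semitone; verify the MSB-first fraction ordering against the MTS spec.

import math

def mts_pitch_bytes(freq_hz: float) -> bytes:
    p = 69 + 12 * math.log2(freq_hz / 440.0)               # the formula above
    semitone = int(p)                                      # integer part: MIDI note number
    fraction = min(round((p - semitone) * 16384), 16383)   # 14-bit remainder
    return bytes([semitone, (fraction >> 7) & 0x7F, fraction & 0x7F])

print(mts_pitch_bytes(440.0).hex())   # 450000 -> note 69 (A440), zero fraction
```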
The MIDI Show Control (MSC) protocol (in the Real Time System Exclusive subset) is an industry standard ratified by the MIDI Manufacturers Association in 1991 which allows all types of media control devices to talk with each other and with computers to perform show control functions in live and canned entertainment applications. Just like musical MIDI (above), MSC does not transmit the actual show media: it simply transmits digital data providing information such as the type, timing and numbering of technical cues called during a multimedia or live theatre performance.
Audio mixers can be controlled with MIDI during console automation.
In addition to the original 31.25 kbit/s current-loop signal transported on 5-pin DIN connectors (here the baud rate, the reciprocal of the shortest signalling element, equals the bit rate), other connectors have been used to carry the same electrical signal, and transmission of MIDI streams in different forms over USB, IEEE 1394 (FireWire), and Ethernet is now common.
A standard for MIDI over USB was developed in 1999 as a joint effort between IBM, Microsoft, Altec Lansing, Roland Corporation, and Philips.[16] To transmit MIDI over USB, a Cable Number and Code Index Number are added to each message, and the result is encapsulated in a USB packet; the resulting USB message can be double the size of the native MIDI message. Since USB runs at over 15,000 times the data rate of MIDI (480,000 kbit/s vs 31.25 kbit/s), USB has the potential to be much faster. However, the nature of USB introduces latency and jitter, usually in the range of 2 to 10 ms, or about 2 to 10 MIDI commands' worth of time. Comparisons done in the early 2000s showed USB to be slightly slower with higher latency,[17] and this is still the case today. Despite the latency and jitter disadvantages, MIDI over USB is increasingly common on musical instruments.
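A Python sketch of this framing, following our reading of the USB MIDI class specification (the helper name is ours):

```python
# Sketch of the USB-MIDI framing described above: each MIDI event travels in a
# 4-byte packet whose first byte holds the Cable Number (high nibble) and a
# Code Index Number, or CIN (low nibble). For channel voice messages the CIN
# equals the high nibble of the status byte; other message types differ.

def usb_midi_event(cable: int, midi_msg: bytes) -> bytes:
    cin = midi_msg[0] >> 4                  # valid for channel voice messages
    header = (cable << 4) | cin
    return bytes([header]) + midi_msg.ljust(3, b"\x00")   # pad short messages

note_on = bytes([0x90, 60, 100])            # Note-On, channel 1, middle C
print(usb_midi_event(0, note_on).hex())     # 09903c64 -> 4 bytes on the wire
```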
Some early MIDI implementations used XLR3 connectors in place of the 5-pin DIN. The use of XLR3 connectors allowed the use of standard low-impedance microphone cables as MIDI cables. As the 31.25 kbit/s current-loop requires only three conductors, there was no problem with the loss of two pins. An example of this use is the Octave-Plateau Voyetra-8 synthesizer.
Compared to USB or FireWire, the computer network implementation of MIDI provides network routing capabilities, which are extremely useful in studio or stage environments (USB and FireWire are more restrictive in the connections between computers and devices). Ethernet is moreover capable of providing the high-bandwidth channel that earlier alternatives to MIDI (such as ZIPI) were intended to bring.
After an initial competition among protocols (IEEE-P1639, MIDI-LAN, IETF RTP-MIDI), the IETF's RTP-MIDI specification for transporting MIDI streams over computer networks is spreading as more and more manufacturers integrate RTP-MIDI into their products (Apple, CME, Kiss-Box, etc.). Mac OS X, Windows and Linux drivers are also available that make RTP-MIDI devices appear as standard MIDI devices within these operating systems. IEEE-P1639 is now a dead project, and the other proprietary MIDI/IP protocols are slowly disappearing, since most of them require expensive licensing (while RTP-MIDI is completely open) or bring no real advantage (apart from speed) over the original MIDI protocol.
The RTP-MIDI protocol was published by the IETF in December 2006 as RFC 4695.[18] RTP-MIDI relies on the well-known RTP (Real-time Transport Protocol) layer (most often running over UDP, but compatible with TCP as well), widely used for real-time audio and video streaming over networks. The RTP layer is easy to implement and requires very little processing power, while providing very useful information to the receiver (network latency, dropped-packet detection, packet reordering, etc.). RTP-MIDI defines a specific payload type that allows the receiver to identify MIDI streams.
RTP-MIDI does not alter the MIDI messages in any way (all messages defined in the MIDI standard are transported transparently over the network), but it adds features such as timestamping and sysex fragmentation. RTP-MIDI also adds a powerful 'journalling' mechanism that allows the receiver to detect and correct dropped MIDI messages. The first part of the RTP-MIDI specification is mandatory for implementors and describes how MIDI messages are encapsulated within the RTP telegram; it also describes how the journalling system works. Journalling itself is not mandatory (it is of little use on a LAN, but it is very important for WAN applications).
The second part of RTP-MIDI specification describes the session control mechanisms that allow multiple stations to synchronize across the network to exchange RTP-MIDI telegrams. This part is informational only, and it is not required.
RTP-MIDI has been included in Apple's Mac OS X since version 10.4 and iOS since version 4.2 as standard MIDI ports: the RTP-MIDI ports appear in Macintosh applications like any other USB or FireWire port, so any MIDI application running on Mac OS X can use RTP-MIDI transparently. However, Apple's developers considered the session control protocol described in the IETF specification too complex and created their own session control protocol. Since the session protocol uses a UDP port different from the main RTP-MIDI stream port, the two protocols do not interfere, and the RTP-MIDI implementation in Mac OS X fully complies with the IETF specification.
Apple's implementation has been used as a reference by other MIDI manufacturers. The Dutch company Kiss-Box has released a Windows XP RTP-MIDI driver[19] for its own products only, another Windows RTP-MIDI driver[20] compatible with Windows XP through Windows 7 (32-bit and 64-bit) has also been released, and a Linux implementation is under development by the Grame association.[21] It therefore seems probable that Apple's implementation will become the de facto standard (and could even become the MMA reference implementation).
Some older instruments, such as electronic organs built in the 1970s and 1980s, are becoming impossible to repair due to a lack of spare parts and of technicians trained on such equipment. The best candidates for upgrade are "console"-sized instruments with at least two 61-note keyboards and at least a 25-note (preferably 32-note concave) pedalboard; smaller "spinet"-sized organs are generally not considered worth converting. In some cases, such instruments can be modified into MIDI instruments, and terms coined from MIDI + modification, such as "midification" or "to midify", are often used.
An old electronic organ can have almost all of its discrete-component electronics replaced by modern circuitry so that the instrument outputs MIDI signals. The instrument then becomes a specialised MIDI keyboard,[22] whose MIDI output must be fed to a MIDI engine of some sort.
New electronic keyboards have MIDI functions as standard and can be connected to a computer with a PC-to-MIDI circuit or simply via USB. Other forms of MIDI controllers include wind controllers, drums, guitars, accordions and many others.
Old synthesizers are not often modified to transmit MIDI, but people sometimes modify them to receive it. The modification involves adding a circuit board that converts digital MIDI signals into analog control voltages, as well as adding a MIDI jack. These circuit boards, often called MIDI-to-CV/Gate converters, also allow analog synthesizers to be played while receiving filter and envelope parameter edits from a modern MIDI device.[23] The circuit boards are usually designed specifically for one model of synthesizer, and it takes some expertise to install them. This allows pre-MIDI analog synthesizers to be controlled by digital sequencers, whereas they formerly required the user to actually play them.
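The core computation in such a converter can be sketched in a few lines of Python; the 1 volt-per-octave scaling and the 0 V reference note below are assumptions, as real converters vary in both.

```python
# Sketch of the core computation in a MIDI-to-CV converter, assuming the common
# 1 volt-per-octave convention with 0 V at MIDI note 0 (real converters vary in
# both the scaling standard and the reference note).

def note_to_cv(note: int) -> float:
    return note / 12.0      # 1 V per octave -> 1/12 V per semitone

print(note_to_cv(60))       # middle C -> 5.0 V under this convention
```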
MIDI 1.0 is also used as a control protocol in applications other than music, including show control and audio console automation.
Although traditional MIDI connections work well for most purposes, a number of newer message protocols and hardware transports have been proposed over the years to try to take the idea to the next level. Some of the more notable efforts include:
- The Open Sound Control (OSC) protocol was developed at CNMAT. OSC has been implemented in the well-known software synthesizer Reaktor and in other innovative projects including SuperCollider, Pure Data, Isadora, Max/MSP, Csound, vvvv, ChucK, Quartz Composer and LuaAV, as well as in many general-purpose programming languages such as C (liblo), Python (pyliblo), Haskell (hosc), Scheme (sosc), Java (JavaOSC, oscP5 for Processing) and Pure (pure-liblo). The Lemur Input Device, a customizable touch panel with MIDI controller-type functions, also uses OSC. AudioCubes, a collection of smart light-emitting objects, are another example of a professional musical instrument working with MIDI as well as OSC. OSC differs from MIDI 1.0 over traditional 5-pin DIN in that it can run at broadband speeds when sent over Ethernet connections, although this difference shrinks when MIDI itself is run over Ethernet. Few mainstream musical applications and no standalone instruments support the protocol so far, making whole-studio interoperability problematic. OSC is not owned by any private company, nor is it maintained by any standards organization. Since September 2007 there has been a proposal for a common namespace within OSC[24] for communication between controllers, synthesizers and hosts; this, too, would not be maintained by any standards organization.
- Yamaha introduced the mLAN protocol in 1999. It is based on the IEEE 1394 transport (also known as FireWire) and carries multiple MIDI 1.0 message channels and multiple audio channels. mLAN is open for licensing but is a proprietary protocol, i.e. not maintained by a standards organization, and is covered by patents owned by Yamaha. Yamaha ceased developing new mLAN products in 2007.
- Development of a fully backward-compatible version of MIDI for new products is under discussion in the MMA. First announced as "HD-MIDI" in 2005[25] and tentatively called the "HD Protocol" or "High-Definition Protocol" since 2008, this new standard would support modern high-speed transports, allow device discovery and enumeration, provide greater range and/or resolution in data values, increase the number of Channels and Controllers, and support entirely new kinds of events, such as Direct Pitch in the Note message and a Note Update message, while decreasing the complexity of messages.[26][27] Various transports have been proposed for the HD Protocol physical layer, including calls for Ethernet-based protocols such as IEEE AVB and ACN to be used as the sole or primary transport in show control environments. As of January 2012, a draft of the HD Protocol and a UDP-based transport is being reviewed by the MMA's High-Definition Protocol Working Group (HDWG), which includes representatives from all sizes and types of companies; the final specification is expected to be completed later in 2012.[26][27]
There is a wide range of MIDI software available, including auto-accompaniment applications, notation programs, music teaching software, music production tools, games, and DJ/remix environments.
- ^ A Brief Introduction to Midi
- ^ History of MIDI
- ^ Mixdown Monthly, #26, June 26, 1996.
- ^ How Making Music with MIDI Works, from HowStuffWorks.com
- ^ What Is Midi? From www.homemusician.net
- ^ MIDI Multi-Track Recording Software, from HowStuffWorks.com
- ^ What is MIDI? from www.wisegeek.com
- ^ The Complete MIDI 1.0 Detailed Specification
- ^ Chadabe, Joel (May 1, 2000). "Part IV: The Seeds of the Future". Electronic Musician (Penton Media) XVI (5). http://emusician.com/tutorials/electronic_century4/.
- ^ Billboard 95 (5): 41. February 5, 1983. ISSN 0006-2510.
- ^ Craner, Paul (October 1991). "New Tool for an Ancient Art: The Computer and Music". Computers and the Humanities 25 (5): 308–309. JSTOR 30204425.
- ^ "The beginner's guide to: MIDI controllers". Computer Music Specials. http://www.musicradar.com/tuition/tech/the-beginners-guide-to-midi-controllers-179018. Retrieved 11 July 2011.
- ^ "Types of MIDI Controllers – Part 1". Function. http://djmidicontrollers.com/useful-tips/types-of-midi-controllers-part-1. Retrieved 11 July 2011.
- ^ "Digital audio workstation - Intro". http://homerecording.guidento.com/daw.htm. Retrieved 11 July 2011.
- ^ "the-digital-orchestra". http://sks.sirs.com.ezproxy.socccd.edu/cgi-bin/hst-article-display?id=SCA0984-0-4434&artno=0000127008&type=ART&shfilter=U&key=&title=The%20Digital%20Orchestra&res=Y&ren=Y&gov=Y&lnk=Y&ic=N.
- ^ Universal Serial Bus Device Class Definition for MIDI Devices
- ^ The Truth About Latency: Part 2
- ^ IETF RTP-MIDI specification
- ^ Windows XP RTP-MIDI driver download
- ^ Windows RTP-MIDI driver download
- ^ Grame's website
- ^ "Converting an old organ to MIDI". Word Press. http://www.cibomahto.com/2010/01/converting-an-old-organ-to-midi/. Retrieved 10 July 2011.
- ^ "Studio Set-Up Guides". http://www.vintagesynth.com/resources/setups.php. Retrieved 10 July 2011.
- ^ common namespace within OSC
- ^ Finally: MIDI 2.0, O'Reilly Digital Media Blog.
- ^ a b MMA Hosts HD-MIDI Discussion, MIDI Manufacturers Association.
- ^ a b Winter NAMM 2012 - General Meeting for MIDI developers by MMA