Now let's talk about protocols that transmit not time, but messages. These protocols synchronize devices through discrete events rather than a continuous stream of time. In the early 1970s, at the dawn of computer technology, companies experimented with ever newer data-transfer standards, pushing speed, reliability, and range. The show industry, meanwhile, needed a more advanced and functional synchronization protocol: LTC was no longer enough. The time had come for a new synchronization interface, one that would bring a real breakthrough to the show industry.

As I have said in previous articles, new developments in the world of synchronization were often driven by the music industry. In the late 70s, synthesizers were especially popular: electronic musical instruments that shaped sound with voltage-controlled circuits. In early designs, each key had its own tone generator. Each synthesizer model was known for the particular effects or sounds it could produce.

In those days, a musician's workplace consisted of many different synthesizers, and managing them all was very difficult. Needs grew. In the early 80s, rapidly developing electronics offered a solution: a step toward digital program control. Synthesizer manufacturers managed to agree on the development and support of a single standard control interface, which appeared in 1983.

The idea was to separate the sound-generating module from the control module and interconnect them via a digital channel. The protocol developed for this purpose essentially transmitted the state of the keys; later its functionality was extended with more capabilities. Thanks to this system, a musician could control several synthesizers from a single keyboard at once, and a melody could be recorded and played back. It is difficult to overstate the impact this new protocol had on the music and show industries as a whole.

Ladies and gentlemen, let me introduce you to MIDI: the Musical Instrument Digital Interface.

Despite its age, MIDI is still widely used in various parts of the show industry.

The MIDI language consists only of control commands and their parameters. Commands in the MIDI language are called messages. Messages are divided into two main types: the first controls sound generation, i.e. it says which note to play and how loudly; the second performs service functions, i.e. it controls tone-generator and synchronization settings.

Messages of the first type are called Channel Messages.

Messages of the second type are called System Messages.

Channel messages are divided into Channel Voice Messages and Channel Mode Messages.

System messages are divided into System Common Messages, System Real Time Messages, and System Exclusive Messages.

The image above shows which group and type each MIDI message belongs to.
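To make the classification concrete, here is a minimal sketch (not tied to any particular MIDI library; the function name is my own) that sorts a raw MIDI status byte into the families listed above, following the standard status-byte ranges:

```python
def classify_status(status: int) -> str:
    """Classify a raw MIDI status byte into its message family."""
    if status < 0x80:
        return "data byte"  # not a status byte at all
    if status < 0xF0:
        # 0x80-0xEF: Channel Messages (Voice or Mode, depending on the data bytes)
        return "Channel Message"
    if status == 0xF0:
        return "System Exclusive Message"  # start of a SysEx packet
    if status <= 0xF7:
        return "System Common Message"     # e.g. Song Position Pointer (0xF2)
    return "System Real Time Message"      # 0xF8-0xFF, e.g. Timing Clock (0xF8)

print(classify_status(0x90))  # Note On -> Channel Message
print(classify_status(0xF8))  # Timing Clock -> System Real Time Message
```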

Further on, when we work directly with the MIDI protocol, we will encounter a parameter called the MIDI Channel, a logical channel for transmitting messages. There are 16 channels in total. This parameter applies only to messages of the "Channel" group: every MIDI Note, MIDI CC, and MIDI PC message carries the number of the channel it belongs to. Channels make it possible to carry notes for different instruments in a single stream. This lets us send two identical MIDI notes simultaneously but have them played by different musical instruments.

MIDI note

A MIDI note message consists of three bytes that encode the channel number, the note number, and the note's status: active (Note On) or inactive (Note Off). Each note has its own number, from 0 to 127, and each number corresponds to a specific note in a specific octave. For synchronization, these notes can carry the events that devices must synchronize to. Unlike a musical instrument, though, the receiving equipment does not generate sound but performs a certain action instead.
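The three-byte layout can be sketched as follows. This is a byte-level illustration, not a real MIDI library; the function names are my own, and I assume the common convention that note 60 ("middle C") is written as C4:

```python
def note_message(note: int, velocity: int, channel: int, on: bool = True) -> bytes:
    """Build a 3-byte Note On/Off message. channel is 1-16, note/velocity 0-127."""
    # High nibble of the status byte = message type, low nibble = channel (0-15).
    status = (0x90 if on else 0x80) | (channel - 1)
    return bytes([status, note & 0x7F, velocity & 0x7F])

def describe(msg: bytes) -> tuple:
    """Unpack a Note message back into (kind, channel, note, velocity)."""
    status, note, velocity = msg
    kind = "Note On" if status & 0xF0 == 0x90 else "Note Off"
    return kind, (status & 0x0F) + 1, note, velocity

# Note 60 on channel 1 at full velocity:
msg = note_message(60, 127, channel=1)
print(describe(msg))  # ('Note On', 1, 60, 127)
```

In practice, many devices also treat a Note On with velocity 0 as a Note Off; receiving code should handle both forms.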

For example, you can configure both the lighting control desk and the multimedia server so that, on receiving note "C4" on channel 1, they start a light cue and the video content. When this note arrives over MIDI, the multimedia server and the lighting control desk will launch their programs simultaneously. The drawback of this method is that with a large number of different MIDI note commands, it is easy to get confused by the notation. In addition, when the show changes and cuelists move on the console, you have to re-check every MIDI command mapping so that it still triggers what you need.
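A hypothetical sketch of that setup: each receiving device keeps a mapping from an incoming (channel, note) pair to an action. The cue names and the mapping itself are invented for illustration:

```python
# Invented mapping: (channel, note) -> action to trigger on this device.
CUE_MAP = {
    (1, 60): "start light cue 1 / play video content",  # note C4 on channel 1
    (1, 62): "blackout",                                # note D4 on channel 1
}

def on_midi_note(channel: int, note: int, velocity: int):
    """Return the action for an incoming Note On, or None if nothing is mapped."""
    # Ignore zero-velocity notes (commonly equivalent to Note Off):
    # we only trigger cues on real key presses.
    if velocity == 0:
        return None
    return CUE_MAP.get((channel, note))

print(on_midi_note(1, 60, 127))  # start light cue 1 / play video content
```

This also illustrates the maintenance problem mentioned above: the mapping lives separately on every device, so each one must be updated whenever the show changes.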


MIDI CC (Control Change) and MIDI PC (Program Change) messages are very similar to MIDI notes. A CC message is encoded in three bytes containing the channel, the controller number, and its value; a PC message needs only two bytes, the channel and the program number. These messages are used to switch the musical program (the set of instrument sounds used for playback), as well as other synthesizer settings.

Most often, MIDI CC messages are used to synchronize continuous parameters, for example the level of a fader or knob on a MIDI controller. With these messages it is possible to transmit the state of up to 128 control parameters, each with a value range from 0 to 127.

This is the basic theoretical information that is needed to understand what a MIDI message is and how it can be used. But this is only the beginning. In the next article, we will talk about the MIDI protocol which was not inherited from the music industry, but which was specially created for synchronization in the show industry, namely MIDI Show Control.