Contributed by weerd from the sounds-like-midi dept.
Some time ago, there was a huge MIDI-related commit from Alexandre Ratchov (ratchov@). He has summarized his work in a new installment of OpenBSD Journal's developer blog.
MIDI is for electronic musical instruments what Ethernet is for computers. It is a slow (3125 bytes/s) unidirectional point-to-point serial link between keyboards, synthesizers, hardware multitrackers and so on. MIDI is meant to let one piece of equipment control another, possibly making all of them cooperate on the same (typically music-related) project. For instance, a MIDI keyboard can send notes to a synthesizer to play in real time, or a hardware multitracker can send clock ticks to a drum machine to keep it in sync. The protocol is real-time, which simply means that messages have to be executed as soon as they are received; there are no timestamps involved.
Please read on for the rest of Alexandre's story:
MIDI was designed around 1983, when electronic components were slow and expensive, so it is simple and not over-engineered. It is still widespread, but it hasn't changed much since it was designed, probably because -- unlike computers -- musicians are no faster now than they were in 1983; so there is no need for faster links or more complicated interfaces.
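To get a feel for how simple the wire format is, here is a minimal decoder for note-on/note-off messages, the bread and butter of the protocol (this is plain MIDI 1.0 framing, not code from the commit): a message is a status byte with the high bit set, followed by data bytes in the 0-127 range.

    #include <stdio.h>

    /*
     * Decode note-on/note-off messages from a raw MIDI byte stream on
     * stdin. A status byte has its high bit set; data bytes don't.
     * Other message types are ignored to keep the sketch short.
     */
    int
    main(void)
    {
        int c, status = 0, data[2], n = 0;

        while ((c = getchar()) != EOF) {
            if (c & 0x80) {        /* status byte: high bit set */
                status = c;
                n = 0;
                continue;
            }
            data[n++] = c;         /* data byte, 0-127 */
            if (n == 2) {
                n = 0;             /* keep status: running status */
                if ((status & 0xf0) == 0x90)
                    printf("note on:  ch %d note %d vel %d\n",
                        status & 0x0f, data[0], data[1]);
                else if ((status & 0xf0) == 0x80)
                    printf("note off: ch %d note %d vel %d\n",
                        status & 0x0f, data[0], data[1]);
            }
        }
        return 0;
    }

Fed the bytes 0x90 0x3c 0x64, it prints a note-on for middle C; at 3125 bytes/s, such a three-byte message occupies the wire for about a millisecond.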
OpenBSD has support for MIDI input and output ports (dumb serial ports), which means that a sequencer application can control a hardware synthesizer module (output port), or a MIDI keyboard (input port) can control a software synthesizer running on OpenBSD.
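Talking to such a kernel port needs nothing beyond open(2) and write(2). A short sketch, assuming a machine whose MIDI hardware shows up as the usual /dev/rmidi0 node (see midi(4)):

    #include <err.h>
    #include <fcntl.h>
    #include <unistd.h>

    /*
     * Play one note through the first raw kernel MIDI port. The
     * /dev/rmidi0 node only exists when MIDI hardware is attached;
     * see midi(4).
     */
    int
    main(void)
    {
        unsigned char on[3] = { 0x90, 60, 100 };  /* note on: middle C */
        unsigned char off[3] = { 0x80, 60, 0 };   /* matching note off */
        int fd;

        if ((fd = open("/dev/rmidi0", O_WRONLY)) == -1)
            err(1, "open /dev/rmidi0");
        if (write(fd, on, sizeof(on)) == -1)
            err(1, "write");
        sleep(1);                                 /* let the note ring */
        if (write(fd, off, sizeof(off)) == -1)
            err(1, "write");
        close(fd);
        return 0;
    }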
But having only that was limiting, because there was no way on OpenBSD for one program (e.g. a sequencer) to control another one (e.g. a soft synthesizer). I mean, it's ridiculous: MIDI-interconnected equipment could cooperate, while software running inside the same box couldn't. The aim of the recent MIDI developments was to allow any program to send MIDI data to other programs as though they were real MIDI hardware.
Technically, the problem is the same as writing an audio server and making audio applications use it instead of the kernel device drivers. Same problem, same solution. Moreover, the aucat(1) internals are not specific to audio: most of the code is a generic framework for non-blocking I/O. So the necessary changes were to slightly polish the existing code, write the MIDI-specific bits and create a midicat(1) link to aucat(1).
The result is that once midicat(1) is set up, a MIDI sequencer can be used to record and edit music sequences while rendering the result in real time with a softsynth -- well, assuming the programs are ported to the new API (see the audio/midish port or the diff for audio/fluidsynth on ports@).
One might think that this bloats aucat(1); actually, it ends up being advantageous. The MIDI bits put new constraints on the aucat(1) internals and exposed hidden bugs, which in turn improved its correctness and robustness.
Another important point is that we now have a user-land API for writing MIDI code, and thus for working on all kinds of interesting projects around audio and MIDI.
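As a rough sketch of what that looks like -- using the interface documented today as mio_open(3), with "midithru/0" standing in for whatever port name the local setup actually provides -- sending a note from a program is no harder than writing to the raw device:

    #include <err.h>
    #include <sndio.h>
    #include <unistd.h>

    /*
     * Same note as before, but sent through the user-land API instead
     * of a kernel device node. "midithru/0" is just an example port
     * name; what ports exist depends on how midicat(1) is set up.
     */
    int
    main(void)
    {
        struct mio_hdl *hdl;
        unsigned char on[3] = { 0x90, 60, 100 };  /* note on: middle C */
        unsigned char off[3] = { 0x80, 60, 0 };   /* matching note off */

        if ((hdl = mio_open("midithru/0", MIO_OUT, 0)) == NULL)
            errx(1, "mio_open failed");
        if (mio_write(hdl, on, sizeof(on)) != sizeof(on))
            errx(1, "mio_write failed");
        sleep(1);                                 /* let the note ring */
        if (mio_write(hdl, off, sizeof(off)) != sizeof(off))
            errx(1, "mio_write failed");
        mio_close(hdl);
        return 0;
    }

Build with cc note.c -lsndio.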
Thanks, Alexandre, for your work on midicat(1) and this piece of background information.