Wednesday, February 3, 2010

USB MIDI and You - 7 Bit and 14 Bit

USB MIDI ventures out of the traditional audiophile's knowledge base and into computer science territory. I feel like this topic isn't covered enough in the digital DJ world. Many digital DJs use USB MIDI, but it seems many of us do not understand it. Hopefully this post will help.

What is USB?

Universal Serial Bus - a method of transmitting and receiving data between a computer and an external device. It provides 5 volts DC and up to 0.5 amps to power connected devices. USB 1.1 and USB 2.0 are the most common versions on the market. USB 2.0 is up to 40x faster than USB 1.1 (480 Mbit/s vs. 12 Mbit/s) and handles more data, but the two look physically identical.

What is MIDI?

Musical Instrument Digital Interface - a protocol used to transmit and receive messages between a controller (keyboard, synthesizer, drum pad, DJ controller, etc.) and a processor that turns those digital MIDI messages into audio. A traditional keyboard's keys just tell the central processor to output that note's sound (which is stored in the instrument's memory). Traditional MIDI uses its own 5-pin DIN connector, but nowadays more devices use USB because it is fast, provides power, and is standard across most platforms.

What's the difference between Traditional 5-Pin MIDI and USB MIDI?

USB MIDI also provides a power source, so no additional power supply is needed unless the controller requires more than 5V/0.5A. USB 1.1 full speed (12 Mbit/s) is roughly 380x faster than a traditional MIDI connection (31,250 bit/s). 5-pin devices are typically connected in serial (i.e. daisy-chained), so they share one data stream unless each device is placed on a different MIDI channel. USB connections are typically connected in parallel, so each device gets its own stream, even while connected to a hub, since the hub separates the data for you.

What's a MIDI Channel, is it needed for USB?

MIDI channels are mostly useful when daisy-chaining MIDI devices. Daisy-chaining means connecting the output of device 1 into the input of device 2, and the output of device 2 into the input of the next device. Without MIDI channels, both devices could be sending the same MIDI message on the same stream... which is like two people talking at the same time - you don't hear either person 100% of the way they're meant to be heard. MIDI channels attach a channel number to each message so the processor knows which message came from which device. Since USB MIDI usually gives each device its own USB connection (rather than daisy-chaining), there is no need for channels to tell devices apart. Most computer software recognizes separate USB devices, but if a program does not distinguish between them, MIDI channels can still be used to separate the two. That is a rare case, since most commercial software relies on the operating system to recognize separate USB devices. It never hurts to use separate MIDI channels, though.

Is Latency a problem? Does USB MIDI have latency problems?

Even the smallest latency is less than ideal - zero latency is what you want, so that your inputs stay in sync with the audio and visual output of the controller and computer. Any latency means your timing will be slightly off when mixing, though I wouldn't call it significant. The largest latency problem is audio latency, created by slow signal processing and digital-to-analog conversion. ASIO low-latency audio drivers are the solution for that: audio latency can be reduced to 2-5 milliseconds of delay, which is nearly impossible to perceive. USB latency is far smaller than that. Transmitting a single MIDI message (i.e. pressing one button) over a USB 1.1 connection takes on the order of a few microseconds... and there are 1000 microseconds in one millisecond. So no, don't worry about USB latency with MIDI.
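Those numbers can be sanity-checked with a quick back-of-envelope calculation. The sketch below compares the raw wire time of one 3-byte MIDI message over a traditional 5-pin connection versus USB 1.1 full speed. It deliberately ignores USB packet framing and polling overhead, so real-world figures will be somewhat higher, but the orders of magnitude hold.

```python
# Back-of-envelope transmission time for one 3-byte MIDI message.
# Ignores USB framing/polling overhead; orders of magnitude still hold.

MIDI_BAUD = 31_250             # traditional 5-pin DIN MIDI, bits per second
USB11_FULL_SPEED = 12_000_000  # USB 1.1 full speed, bits per second

# 5-pin MIDI adds a start and stop bit per byte (10 bits on the wire per byte).
din_seconds = (3 * 10) / MIDI_BAUD
usb_seconds = (3 * 8) / USB11_FULL_SPEED

print(f"5-pin DIN: {din_seconds * 1e3:.2f} ms")  # ~0.96 ms
print(f"USB 1.1:   {usb_seconds * 1e6:.1f} us")  # ~2 us
```

Either way, both are dwarfed by the 2-5 ms of audio latency from the sound driver, which is why the MIDI link itself is never the thing you feel.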

What is 14-bit MIDI?

One of the standard resolutions (the number of data bits in a message) for MIDI messages, particularly in USB MIDI DJ controllers, is 7 bits. More recently, MIDI controllers have been handling higher-resolution 14-bit MIDI messages - twice as many data bits per value. Each bit is a 0 or a 1, but when you combine bits... such as 7 of them... those 7 bits allow 128 combinations of 0's and 1's, so 7-bit MIDI can transmit 128 different values. 128 values!? That's plenty for each control, isn't it? Well... it is for simple controls that don't need much resolution: on/off buttons, basic volume knobs, and so on.

However, some controls benefit from higher-resolution data: frequency selectors for advanced filtering, precise beatmatching with the pitch fader, and scratching or manually "playing" the record by hand. 14-bit MIDI handles these much better than 7-bit, since 14 bits can hold 16,384 different values per message. So when you move your pitch fader, the computer can now recognize 16,384 different positions instead of 128. None of this matters in the analog world (turntables, for example), where the hardware responds to the exact position of the pitch.
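Under the hood, a 14-bit value is carried as two 7-bit data bytes: a most significant byte (MSB) and a least significant byte (LSB). Pitch bend does this inside a single message; for continuous controllers, the MIDI 1.0 spec pairs CC numbers 0-31 (MSB) with 32-63 (LSB). A minimal sketch of the bit arithmetic:

```python
# Combining/splitting the two 7-bit data bytes of a 14-bit MIDI value.
# CC n carries the MSB and CC n+32 the LSB (per the MIDI 1.0 spec);
# pitch bend packs both bytes into one message.

def combine_14bit(msb: int, lsb: int) -> int:
    """Combine two 7-bit data bytes into one 14-bit value (0..16383)."""
    assert 0 <= msb < 128 and 0 <= lsb < 128
    return (msb << 7) | lsb

def split_14bit(value: int) -> tuple[int, int]:
    """Split a 14-bit value back into its (msb, lsb) data bytes."""
    assert 0 <= value < 16384
    return (value >> 7) & 0x7F, value & 0x7F

print(combine_14bit(127, 127))  # 16383, the maximum 14-bit value
print(split_14bit(8192))        # (64, 0), the pitch-bend center position
```

Notice that if a controller only sends the MSB, you get exactly the old 128-step behavior - 14-bit MIDI is backwards compatible by construction.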

Why do I care about Bit-Size?

Since a turntable's pitch fader and cartridge are analog, the amount you move the fader or the record translates directly to what you hear, without skipping any audio. Analog controls sense the exact value, but when a digital interface such as MIDI sits between an analog control (like a pitch fader) and the computer, some data is lost. The bit size of the digital message determines its resolution. More bits = higher resolution. Higher resolution = less data loss = closer to a full analog representation.

What happens with a digital MIDI controller, anyway?

Even though you may be turning an analog knob with a full 360° of rotation, the computer only recognizes bits and pieces of that rotation. So if you turn your knob one full turn... the computer doesn't receive a continuous rotation the way an analog turntable does. Instead, it receives steps of information. Rather than starting at 0 and continuously, progressively working its way up to 360 (registering even the slightest movement, such as 0.048 degrees), the MIDI interface causes the computer to receive something like this: 0, 3, 6, 9, 12, ..., 354, 357, 360 - in samples. Twice the resolution would look more like: 0, 1.5, 3, 4.5, 6, 7.5, 9, 10.5, 12, ... and half the original resolution more like: 0, 6, 12, ...
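The stepping above is just quantization: snapping a continuous angle to the nearest position the controller can actually report. A minimal sketch (the `quantize` helper and the 360° full-scale mapping are illustrative assumptions, not any particular controller's firmware):

```python
# Quantize a continuous knob angle (degrees) to the nearest position
# a 7-bit or 14-bit controller can actually report.

def quantize(angle_degrees: float, bits: int, full_scale: float = 360.0) -> float:
    steps = 2 ** bits              # 128 positions for 7-bit, 16384 for 14-bit
    step_size = full_scale / steps
    return round(angle_degrees / step_size) * step_size

print(quantize(100.37, 7))   # snaps to a ~2.8-degree grid
print(quantize(100.37, 14))  # snaps to a ~0.022-degree grid
```

Every movement smaller than one step is simply invisible to the computer, which is the whole 7-bit-vs-14-bit story in one function.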

Could you give me an example?

The pitch fader on a traditional turntable allows you to perfectly beatmatch two turntables. 7-bit MIDI allows 128 values on your pitch fader. So if you are using a +/-8% pitch fader on a 100 BPM track, your pitch fader ranges across 16 BPM (92 BPM to 108 BPM). Your 7-bit MIDI pitch fader can only represent 128 different BPM values in that range, so each step is 1/8 of a BPM - the computer can end up 1/8 BPM away from where you actually want it. Within a minute or two, the crowd will hear mismatched beats... and after 8 minutes you'd be a whole beat off, just because 7 bits won't let you be more precise. There is a noticeable difference between 128 BPM and 128.125 BPM.

14-bit would allow 16,384 values, so in the same situation each step is about 1/1000 of a BPM. Can you tell the difference between 100 BPM and 100.001 BPM? Probably not, and it would take roughly 17 hours for you to drift a whole beat due to that inaccuracy. I seriously doubt you'll be playing the same tracks for 17 hours straight... :)

If you increase the pitch fader's range (to 10, 50, or 100%) or the track's BPM is higher, the problem gets worse. If the range is set lower (5, 4, 3%) or the track's BPM is lower, the problem is less noticeable but still exists. The higher the resolution (more bits in the MIDI data), the more accurately the computer responds to your actions.
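The arithmetic from the example generalizes to any fader range, track tempo, and bit depth. A small sketch (the `bpm_per_step` helper is illustrative, not part of any real API):

```python
# Step size of a MIDI pitch fader in BPM, for a +/-range% fader.

def bpm_per_step(bpm: float, pitch_range_pct: float, bits: int) -> float:
    """BPM covered by one fader step for a +/-pitch_range_pct fader."""
    total_bpm_range = 2 * (pitch_range_pct / 100.0) * bpm  # e.g. 16 BPM at +/-8%
    return total_bpm_range / (2 ** bits)

print(bpm_per_step(100, 8, 7))        # 0.125 BPM per step (1/8 BPM)
print(bpm_per_step(100, 8, 14))       # ~0.001 BPM per step
print(1 / bpm_per_step(100, 8, 7))    # minutes until one full beat of drift: 8
print(1 / bpm_per_step(100, 8, 14) / 60)  # hours until one beat of drift: ~17
```

Plug in a +/-50% range or a 140 BPM track and you can watch the 7-bit step size balloon, which is exactly why wider pitch ranges make the problem worse.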

So it affects pitch - I'll just use auto-sync. What else?

Jog wheels are a big problem too. You've got 360 degrees of rotation on the jog wheel, and that's split into 128 values... you can already see the problem, can't you? The computer only adjusts itself (i.e. moves the play marker in the song, which drives the audio output) once every 2.81 degrees of jog wheel rotation. So when you try to manually play the song by hand, without any software sensitivity adjustment, what happens? Playback jumps between those coarse steps and skips the audio in between! The same goes for scratching. So how is this handled? The software's sensitivity is reduced, so that you have to use larger movements to move the play marker less. You might have to move your hand two or three times further to get the same sound you would from a very high resolution MIDI interface. Further distance = slower scratching and playback. This is why you don't notice the skipping as badly - the software compensates... but you still lose one way or the other.

With 14-bit jog wheels, the computer receives a message once every 0.022 degrees of movement, so it knows where your hand is down to 2.2% of a degree. The MIDI connection is no longer the bottleneck for transmitting the data, and sensitivity does not need to be reduced on the software side. Latency is still in the range of microseconds even when sending this much data.
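To put those step sizes in audible terms, here is a simplified model that maps the wheel's angular resolution to playback time, assuming the jog wheel is mapped 1:1 to a 33 1/3 RPM platter (one revolution = 1.8 seconds of audio); real controllers scale this mapping, but the ratio between 7-bit and 14-bit stays the same.

```python
# How much audio one jog-wheel step represents, assuming the wheel is
# mapped 1:1 to a 33 1/3 RPM platter (one revolution = 1.8 s of audio).

SECONDS_PER_REV = 1.8  # 60 s / 33.33 RPM

def ms_of_audio_per_step(bits: int) -> float:
    degrees_per_step = 360 / (2 ** bits)
    return degrees_per_step / 360 * SECONDS_PER_REV * 1000

print(f"7-bit:  {ms_of_audio_per_step(7):.2f} ms per step")   # ~14 ms jumps
print(f"14-bit: {ms_of_audio_per_step(14):.3f} ms per step")  # ~0.11 ms jumps
```

A 14 ms jump is an audible glitch when scratching; a 0.11 ms jump is well below anything you can hear, which is why 14-bit wheels feel continuous.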

Why is MIDI Digital instead of Analog?

Analog data requires more bandwidth and takes up more space. It also requires periodic calibration of the controls, and it suffers from noise, which creates an unsteady response. Digital data requires less bandwidth and space because it only needs to report when a value changes, whereas an analog signal must carry the value continuously. No control calibration is needed with digital, and noise is rarely an issue - only in extreme cases, such as a power generator creating enough electromagnetic interference to flip a 0 bit into a 1, or vice versa.

Digital data may have some loss, but thanks to high-resolution sampling the loss is negligible. The point is that high-resolution controllers matter. Go too large, though (say 256 bits per value), and you could start paying for it in processing time.
