
Combing for sound

Your band has played some gigs and now you are going to record a demo. You set up the equipment, scatter microphones around, hit record and off you go. You have a great time and it feels really good. You play back the track and... it sounds horrible! The bass has all but disappeared, and what are those other weird sounds? Surely you weren't playing that badly. Probably not, but maybe you need to learn a bit more about microphones and the problems known as 'microphone bleed' and 'comb filtering'. With any luck though, some time soon Alice Clifford from the Centre for Digital Music at Queen Mary, University of London will have made the problem a thing of the past.

Musicians and mikes

A microphone is really easy to use. It converts any sound in a specific area around it into an electrical signal, but anything from elsewhere isn't picked up. That means to record an acoustic instrument (say, a saxophone) you point the microphone directly at it. If it's an electric instrument (like an electric guitar) you point the microphone at the amplifier. As long as the sound source falls within the sensitive area of the microphone, the sound of the instrument can be reproduced: played back through speakers or recorded to hard disk.

There isn't that much that can go wrong with this, but there can be problems. Other instruments could be picked up in the microphone's signal (that's 'microphone bleed'), or there might be some reverberation or random noise on the microphone signal. You solve problems like this with soundproofing and by being careful about where you place the microphones. The idea is to make sure the interfering sounds end up in those areas the microphone can't 'hear'.

Often you want to record more than one instrument at the same time, or several microphones need to be placed close together to record one instrument, like a drum kit. This leads to a specific problem when more than one microphone picks up the same sound. When the microphone signals are mixed together, copies of that sound are added together. The trouble is that the sound may have taken different amounts of time to reach the different microphones, so the copies may not be perfectly synchronised.
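To get a feel for the timing offsets involved, here is a rough back-of-envelope sketch (our numbers, not from the article). Sound travels at about 343 metres per second in air, so every extra metre between instrument and microphone adds roughly 3 milliseconds of delay:

SPEED_OF_SOUND = 343.0  # metres per second, in air at room temperature

def arrival_delay_ms(near_mic_m, far_mic_m):
    # Extra travel time to the farther microphone, in milliseconds.
    return (far_mic_m - near_mic_m) / SPEED_OF_SOUND * 1000.0

# Hypothetical setup: one mic 0.5 m from the instrument, another 1.5 m away.
print(f"Offset between mics: {arrival_delay_ms(0.5, 1.5):.2f} ms")
# Offset between mics: 2.92 ms

A few milliseconds sounds like nothing, but as we'll see, it is plenty to carve audible holes in the sound.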

This is where 'comb filtering' comes in. Why that name? It has nothing to do with bad hair days, even though that's what it makes your music sound like! It is because the frequency response involved looks like a comb. In signal processing, a 'filter' is a device or process that modifies features of a signal. In this case the process of combining the out-of-time sounds can be thought of as a filter. The sounds interfere with each other in a way that cuts out some groups of frequencies whilst boosting others. Draw a picture of the combined signal's frequency response, with its spikes and gaps, and it looks like the teeth of a comb.
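Here is a worked example of that comb shape (our sketch, assuming a 5 millisecond offset between two microphones). Mixing a signal with a copy of itself delayed by d seconds gives each frequency f a gain of |1 + e^(-2πifd)|: some frequencies are doubled while others cancel completely:

import numpy as np

delay_s = 0.005  # assumed 5 ms offset between the two microphone signals
freqs = np.array([50, 100, 200, 300, 500])  # a few test frequencies in Hz

# Gain applied to each frequency when the delayed copy is mixed back in.
gain = np.abs(1 + np.exp(-2j * np.pi * freqs * delay_s))
for f, g in zip(freqs, gain):
    print(f"{f:4d} Hz -> gain {g:.2f}")
#   50 Hz -> gain 1.41  (partly boosted)
#  100 Hz -> gain 0.00  (cancelled completely: the first gap in the comb)
#  200 Hz -> gain 2.00  (doubled: a tooth of the comb)
#  300 Hz -> gain 0.00  (the next gap)
#  500 Hz -> gain 0.00

With a 5 ms offset the gaps land at 100 Hz, 300 Hz, 500 Hz and so on, starting squarely in bass territory, which is exactly how a bass guitar can lose its bass.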

As well as other weird-sounding effects, comb filtering can lead to large groups of frequencies being cut out that should have been kept. For example, you may find your bass guitar recording suddenly has no bass frequencies in it. The effect can be great for giving expression to a guitar solo, but it isn't what you want on a classical harp recording.


You can reduce the problem by paying attention to the positions of microphones in relation to each other and to the instruments. Ideally all microphones would be placed exactly the same distance from each instrument, but that is impractical. A standard rule used by sound engineers is the '3 to 1' rule: place any second microphone or instrument at least 3 times as far away as the distance from the first instrument to the first microphone. The idea is to make the bleed picked up by the second microphone weak enough that it doesn't audibly interfere when the signals are mixed. Basically, the aim is to reduce microphone bleed simply by keeping instruments and microphones well apart.
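Why 3 times? A rough sketch of the arithmetic (ours, assuming simple inverse-square spreading and ignoring room reflections): tripling the distance from a sound source drops its level at the microphone by about 9.5 decibels, usually enough for the bleed to be buried under the wanted signal:

import math

def level_drop_db(distance_ratio):
    # Level reduction in dB when a source is `distance_ratio` times farther
    # away, assuming free-field inverse-square propagation.
    return 20 * math.log10(distance_ratio)

print(f"3x the distance: about {level_drop_db(3):.1f} dB quieter")
# 3x the distance: about 9.5 dB quieter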

So next time you are recording your band, you should pay close attention to where you place the microphones. Sound engineers painstakingly position microphones for good reasons! If the audio that comes out sounds different from what you expect, try moving the microphones, as it might improve things. The only trouble is that perfect positioning is not always possible. It would be much better if you could just do some clever audio processing after the sounds have been picked up by the microphones.

This is where Alice Clifford's work comes in. She has set herself the problem of how to analyse audio signals and apply processing that removes the comb filtering effect. One way this can be done is by 'nudging' audio regions in a digital recorder so that they line up visually and so play back at the same time. Often the comb filtering may not be obvious, but as soon as the audio is nudged into line, the kick drum has more of a thump or a guitar has more presence. Of course, nudging regions by eye is not very accurate. Alice is aiming to come up with a way that solves the problem completely. If she is successful, it will mean that in future sound engineers will not have to move the microphones at all.
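To give a flavour of what automatic nudging might look like, here is a minimal sketch of one standard alignment technique, cross-correlation. To be clear, this is our illustration and not Alice's method: it estimates how many samples late the second microphone's copy of the sound arrives, then shifts it back into line before mixing:

import numpy as np

def align(near, far):
    # Cross-correlate the two signals; the peak tells us the lag (in
    # samples) at which they match best.
    corr = np.correlate(far, near, mode="full")
    lag = int(np.argmax(corr)) - (len(near) - 1)  # samples `far` arrives late
    # Shift `far` earlier by `lag` samples, padding the end with silence.
    return far if lag <= 0 else np.concatenate([far[lag:], np.zeros(lag)])

# Toy test: the same sine tone, arriving 128 samples later at the far mic.
t = np.arange(4096) / 44100.0
near = np.sin(2 * np.pi * 440 * t)
far = np.concatenate([np.zeros(128), near[:-128]])
mixed = near + align(near, far)  # aligned mix, no comb filtering

Real recordings are messier than this toy example, which is part of why it is still a research problem rather than a solved one.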

So at the moment you have to be extra careful about where you place the microphones when recording your band. With any luck though, Alice will solve the problem, and her signal processing software will mean that messing about with the position of microphones before recording will be a thing of the past.