I’ve heard that microphone preamplifiers (mic preamps) sound better when they’re working harder. Is that true? – Anonymous
“Sound better” and “working harder” are sort of nebulous concepts, but we know that tube guitar amps generally do “sound better” when they’re “working harder,” so let’s start there. Color, warmth, and character are all usually references to a device’s signature sound.
But what we’re really talking about is harmonic distortion, meaning there are frequencies coming out of the device that weren’t present in the input signal. These are generally integer multiples of the input frequency: put in 1 kHz, get out 1 and 2 and 3 kHz, etc. This is why tubes, tapes, and vinyl have a “sound.”
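To see those integer multiples appear, here's a minimal sketch: pass a pure 1 kHz sine through a made-up polynomial nonlinearity (standing in for a tube or transformer stage, not any particular circuit) and look at the spectrum. The squared term creates even harmonics, the cubed term odd ones.

```python
import numpy as np

fs = 48_000          # sample rate (Hz)
n = fs               # one second of audio, so FFT bins land on 1 Hz spacing
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 1000 * t)        # pure 1 kHz input

# Hypothetical soft nonlinearity: x**2 adds even harmonics, x**3 odd ones
y = x + 0.2 * x**2 + 0.1 * x**3

spectrum = np.abs(np.fft.rfft(y)) / n   # magnitude per bin
for f in (1000, 2000, 3000):
    level_db = 20 * np.log10(spectrum[f] + 1e-12)
    print(f"{f} Hz: {level_db:.1f} dB")
```

Put in 1 kHz, and energy shows up at 2 kHz and 3 kHz that simply wasn't in the input. A perfectly linear device would show nothing but the 1 kHz bin.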
Having a sound means that a device, by definition, is not transparent. With guitar amps, this is by design. Turning them up makes the tubes distort more and… well, it’s rock ‘n’ roll, folks.
Older mic preamp designs utilized transformers, so they were nonlinear right off the bat, and the total harmonic distortion (THD) would increase with signal level, particularly at low frequencies, where transformer cores start to saturate. But modern preamps are built with discrete transistors and op-amps, and they’re very linear right up until the point of clipping. So a competently designed modern preamp won’t show higher THD as the gain is turned up, with the exception of boutique models that are designed to do exactly this for dramatic effect, such as the Silk function on Neve preamps.
There’s one technical surprise, though. Preamps do have one spec that changes as gain is altered: Noise Figure (NF), a measure, in dB, of how closely the circuit approaches a theoretically perfect noiseless amplifier. All electronic devices, even the humble resistor, generate some amount of noise simply because they’re made out of atoms, and atoms have electrons, and electrons move around even when we don’t force them to. Pesky little buggers.
This random movement creates a white-spectrum hissing called thermal noise, or Johnson noise, and it’s determined largely by the source resistance. (Turn up an amp all the way with no input signal, and that’s what you’re hearing.)
We’ll spare the math, but let’s say a mic has a thermal noise voltage of about -126 dBu. If we then add 60 dB of gain at the preamp, the noise is raised to -66 dBu, even if the preamp itself is perfectly noiseless.
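For the curious, the math we spared is the Johnson-noise formula, v = √(4kTRB), where k is Boltzmann's constant, T is temperature, R is the source resistance, and B is the bandwidth. Here's a sketch that converts it to dBu and adds the gain; the 200 Ω source, 20 kHz bandwidth, and room temperature are illustrative assumptions, and the exact dBu figure shifts with the mic's actual impedance.

```python
import math

k = 1.380649e-23     # Boltzmann constant (J/K)

def johnson_noise_dbu(r_ohms, bw_hz=20_000, temp_k=290):
    """Thermal (Johnson) noise of a resistance, expressed in dBu."""
    v_rms = math.sqrt(4 * k * temp_k * r_ohms * bw_hz)
    return 20 * math.log10(v_rms / 0.7746)   # 0 dBu = 0.7746 V rms

# Example: a 200-ohm mic source over the audio band
source_noise = johnson_noise_dbu(200)
print(f"source noise: {source_noise:.1f} dBu")   # about -130 dBu

# Gain in dB simply adds: the noise floor rides up with it
print(f"after 60 dB of gain: {source_noise + 60:.1f} dBu")
```

Note that gain is just addition in dB, which is why a perfectly noiseless preamp still delivers an audible hiss floor at 60 dB of gain.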
That said, no preamp is truly noiseless, but some approach the theoretical limit to within a couple of dB at the highest gain settings. That’s because when you turn up the gain, you’re removing resistance from the circuit’s gain-control network, and the Johnson noise that resistance generates decreases as a result. This seems counterintuitive, because turning up the gain clearly produces more hiss, but that hiss is mostly the amplified thermal noise of the mic itself, not noise added by the preamp.
As gain increases, the preamp itself contributes less and less noise of its own, so in a way, the preamp is actually performing better at higher gain levels. Don’t get crazy – we’re talking about a couple of dB here – but it’s a real effect.
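The effect can be sketched with a couple of lines of arithmetic. Uncorrelated noise sources sum as powers, not voltages, and the noise figure is simply how far the total rises above the source's own thermal noise. The input-referred preamp noise values below are hypothetical, chosen only to illustrate NF falling as gain rises.

```python
import math

def power_sum_dbu(*levels_dbu):
    """Sum uncorrelated noise sources: powers add, so convert out of dB first."""
    total = sum(10 ** (lvl / 10) for lvl in levels_dbu)
    return 10 * math.log10(total)

source_noise = -130.0   # thermal noise of the mic source, dBu (example figure)

# Hypothetical input-referred preamp noise at low and high gain settings
for gain_db, preamp_noise in [(20, -120.0), (60, -135.0)]:
    total = power_sum_dbu(source_noise, preamp_noise)
    nf = total - source_noise   # noise figure: excess over a noiseless amp
    print(f"{gain_db} dB gain -> NF = {nf:.1f} dB")
```

With these made-up numbers, the high-gain setting lands within about 1 dB of the theoretical limit, while the low-gain setting sits several dB above it, which is the "performing better at higher gain" effect in miniature.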
So to answer the question: Yes, but probably not in the way you thought.
Got a question? Send it to Jonah via email at [email protected].