

Spotlight On Signal Processing: Part 2 In The Evolution Of Large-Scale System Optimization

A journey through the development of audio filtering, starting with the staples of the analog world and continuing through the filters in modern DSP that mine their predecessors but can also go beyond them.

Editor’s note: Go here to read part 1 of this series.

“In the beginning there was graphic EQ.”

The first standard tool for system equalization was the graphic equalizer. Early versions offered 10 bands at octave intervals, but the 1/3-octave version had taken over the market completely by the late 1970s.

The 31 bands were standardized to a series of 1/3-octave ISO center frequencies spanning 20 Hz to 20 kHz. The shape of the filters, however, was not standardized: one model might use 1/3-octave-wide filters, another octave-wide filters.
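
For reference, those centers follow a simple series; here's a quick sketch using the base-2 series referenced to 1 kHz (the familiar ISO nominal labels round these values off):

```python
import numpy as np

# One-third-octave band centers: a base-2 series referenced to 1 kHz.
# ISO nominal labels round these (19.7 -> 20, 24.8 -> 25, 31.25 -> 31.5, ...).
n = np.arange(-17, 14)             # 31 bands, roughly 20 Hz to 20 kHz
centers = 1000 * 2.0 ** (n / 3.0)
print(np.round(centers, 1))
```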

One of the primary attractions of the graphic equalizer was that its front panel settings seemed to represent the response it was creating (hence the name “graphic”).

This was mostly true when the settings were flat, but once the sliders were moved the resemblance faded, because the parallel filters also affect the range of their neighbors. This filter interaction determined how closely the "graphic" shape of the front panel actually resembled the curve being created.

The reality is that the picture on the graphic EQ front panel was never accurate, and some (especially the models with wide filters) were wildly inaccurate. This confused people who attempted to use these tools for what I call "ear to eye training." Engineers learned to distrust the "graphic" settings. They knew the EQ wasn't doing what the front panel showed, but they didn't know what it was doing. The inaccuracy of the graphic EQ fostered a lot of false conclusions.

Graphic equalizers with narrow filters create ripple in the response, which increases as the cuts deepen: the center frequencies cut deeper than the midpoints between them. Wider filters reduce the ripple but increase the overlap, which further decouples the response from the front panel. Narrow-filter graphics correlate more closely to the panel but ripple more. I know of only one graphic EQ that old engineers still have romantic feelings for: the Klark Teknik DN360 (wide filters, low ripple, low panel accuracy). Bottom line: front panel accuracy doesn't matter much when you are tuning by ear, but ripple does.
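
Here's a rough sketch of that trade-off, modeling each band as a cascaded RBJ-cookbook peaking biquad. This is a simplification (real graphic EQs vary in circuit topology) and the settings are purely illustrative:

```python
import numpy as np
from scipy.signal import freqz

FS = 48000  # sample rate assumed for these sketches

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

def combined_db(bands, freqs, fs=FS):
    """Magnitude (dB) of the cascaded band filters at the given frequencies."""
    h = np.ones(len(freqs), dtype=complex)
    for b, a in bands:
        _, hi = freqz(b, a, worN=np.asarray(freqs, float), fs=fs)
        h *= hi
    return 20 * np.log10(np.abs(h))

centers = [315.0, 400.0, 500.0, 630.0, 800.0]          # five adjacent bands, all cut -12 dB
probe = [315, 355, 400, 447, 500, 561, 630, 710, 800]  # band centers plus midpoints

for q, label in [(4.32, "narrow (1/3-octave)"), (1.41, "wide (one-octave)")]:
    resp = combined_db([peaking(f0, -12.0, q) for f0 in centers], probe)
    print(label, np.round(resp, 1))
# Narrow: the cut is deeper at the centers than at the midpoints (ripple).
# Wide: smoother, but the overlap sums to far more cut than the sliders show.
```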

The Graphic Details

A substantial culture arose of what I would call the “graphic EQ code of conduct,” a set of visual rules that governed the fader placement. The foremost of these was the “move the neighbors” rule, which mandated that a deep cut at 500 Hz meant you had to move the 315, 400, 630 and 800 Hz faders down as well to make it look like a gradual curve. Never mind that this causes 500 Hz to cut much deeper and wider than you intended.
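
The same cascaded-biquad sketch (again a stand-in for a real graphic EQ, with illustrative settings) shows the rule's side effect directly:

```python
import numpy as np
from scipy.signal import freqz

FS = 48000

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

# "Code of conduct" settings: -12 dB at 500 Hz with the neighbors eased down.
panel = {315.0: -3.0, 400.0: -6.0, 500.0: -12.0, 630.0: -6.0, 800.0: -3.0}

h = 1.0 + 0.0j
for f0, gain in panel.items():
    _, hi = freqz(*peaking(f0, gain, 4.32), worN=np.array([500.0]), fs=FS)
    h *= hi[0]
print(round(20 * np.log10(abs(h)), 1))  # lands several dB below the -12 slider
```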

The ubiquitous (at the time) Klark Teknik DN360 graphic EQ, joined by other popular models from BSS (FCS-960) and dbx (231).

Another of the folk legends was the belief that cutting more than 6 dB would create a “phase problem” of some mysterious unquantifiable variety. This was taken seriously: Everyone knows you don’t push that fader past 6 dB! I can’t say this phasor vortex never happened to somebody, but I can say I never saw such a thing occur on my analyzer (which reads phase). The phase problems that we did see were primarily the side effects of amplitude problems associated with ripple and having the wrong center frequency and bandwidth to do the job.
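
The phase shift of such a cut is, in fact, fully quantifiable. A sketch, using the same RBJ peaking biquad as a stand-in:

```python
import numpy as np
from scipy.signal import freqz

FS = 48000

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

freqs = np.array([250.0, 400.0, 500.0, 630.0, 1000.0])
for gain in (-6.0, -12.0, -18.0):
    _, h = freqz(*peaking(500.0, gain, 4.32), worN=freqs, fs=FS)
    print(gain, np.round(np.degrees(np.angle(h)), 1))
# Phase is zero at the filter's center, rises modestly off-center, and
# scales smoothly with depth; no vortex opens up past the -6 dB mark.
```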

This gets to the heart of the graphic EQ's principal shortcoming: fixed center frequencies and bandwidths. Simply put, the graphic could never succeed as an optimization tool because the problems it is trying to solve do not have a single fixed bandwidth and are not obliged to fall on the ISO-approved center frequencies. Our challenge is more complex than that.

At one of the first concerts that John Meyer and I measured with SIM, we came up against the graphic EQ rule mentality. We measured a 10 dB peak at 100 Hz and knocked it out with a single cut on the graphic EQ. We high-fived each other at the perfection of the lucky match of center frequency and bandwidth as we watched the amplitude and phase flatten out.

A short time later, the system engineer saw the single deep cut on the graphic and freaked out. He reworked the settings into a nice, gentle-looking curve on the front panel, which no longer solved the problem.

He explained to us that he needed to do this because we were messing up the phase response. It never occurred to him that we could actually see the phase response right there on the analyzer. On that evening John and I knew that we could never beat the graphic EQ police and needed to make a better tool.

Parametric EQ

The inadequacies of the graphic equalizer became totally apparent once we began to see high-resolution frequency response data. Our analyzer could now show us a problem centered at 450 Hz, but we were stuck with the graphic EQ’s fixed filters at 400 Hz and 500 Hz in that area. The inability of the graphic EQ to create a complementary response to what we measured was impossible to ignore.

The parametric equalizer was immediately seen as the superior tool, since it had independently adjustable center frequency, bandwidth and level. Anything that we could see on the analyzer that was worth equalizing could be precisely complemented with the parametric filters.
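
A sketch of the difference, using RBJ peaking biquads as stand-ins and a hypothetical +10 dB, Q=6 peak measured at 450 Hz:

```python
import numpy as np
from scipy.signal import freqz

FS = 48000

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

def combined_db(bands, freqs, fs=FS):
    """Magnitude (dB) of the cascaded filters at the given frequencies."""
    h = np.ones(len(freqs), dtype=complex)
    for b, a in bands:
        _, hi = freqz(b, a, worN=np.asarray(freqs, float), fs=fs)
        h *= hi
    return 20 * np.log10(np.abs(h))

problem = peaking(450.0, +10.0, 6.0)        # hypothetical measured peak at 450 Hz
parametric = [peaking(450.0, -10.0, 6.0)]   # exact complement: same f0 and Q, opposite gain
graphic = [peaking(400.0, -5.0, 4.32),      # the best the fixed ISO centers allow
           peaking(500.0, -5.0, 4.32)]

freqs = [350.0, 400.0, 450.0, 500.0, 570.0]
for label, fix in (("parametric", parametric), ("graphic", graphic)):
    print(label, np.round(combined_db([problem] + fix, freqs), 1))
# Parametric residual: ~0 dB across the band.
# Graphic residual: a leftover bump at 450 Hz plus new dips near 400 and 500 Hz.
```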

The high-resolution transfer function analyzer ended the graphic EQ's run as an optimization tool (although it took a long time to die). Now we could see the phase response (one mystery solved) and the actual combined amplitude response of the filters (second mystery solved). The analyzer proved that the graphic could never satisfy our needs. The graphic EQ is now used only for gross tonal shaping by ear, i.e., as an artistic tool or for combat EQ (stage monitors), not as a system optimization tool.

There were several reasons why parametric equalizers had attained such minimal acceptance before that time. The first was that people had trouble visualizing what the filters were doing. Filters could be set anywhere, including right on top of each other. You had to look at all the settings and then conjure in your mind what they all meant. This made many engineers understandably insecure. Most modern parametrics accurately display their response on a front panel or in software, even incorporating filter interaction. The parametric response is no longer a mystery.

The second issue was that most commercially available parametric EQs used a filter topology poorly suited to system optimization. The filters were asymmetric, with a different type of response for peaks and dips. The dip side of these devices used a notch filter topology, which does not properly complement the peaks it's trying to treat in the sound system. Notch-type parametrics, with wide peaks and narrow dips, actually mimic the problematic comb filter effects rather than compensating for them.
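
A sketch of the mismatch, pairing a hypothetical broad peak with a narrow notch (scipy's iirnotch) instead of a symmetric cut:

```python
import numpy as np
from scipy.signal import freqz, iirnotch

FS = 48000

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

peak = peaking(1000.0, +6.0, 2.0)        # hypothetical broad +6 dB peak
notch = iirnotch(1000.0, 30.0, fs=FS)    # narrow notch "cut" aimed at it

freqs = np.array([700.0, 850.0, 950.0, 1005.0, 1060.0, 1180.0, 1400.0])
h = np.ones(len(freqs), dtype=complex)
for b, a in (peak, notch):
    _, hi = freqz(b, a, worN=freqs, fs=FS)
    h *= hi
print(np.round(20 * np.log10(np.abs(h)), 1))
# A canyon right at 1 kHz with the shoulders of the peak left standing on both
# sides; the asymmetric dip mimics comb filtering instead of undoing the peak.
```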

SIM measurements that highlight some of the problems with graphic EQ.

The high-resolution measurements taken with SIM showed us the advantages of complementary phase equalization: a parametric EQ built from symmetric second-order filters with minimum phase shift. This became the guiding principle behind the 1984 development of Meyer Sound's CP-10 equalizer. We could now put the "equal" in equalization by producing an equal and opposite amplitude and phase response to the peaks and dips found in the field.
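
The "equal and opposite" idea is easy to verify in a sketch: a symmetric minimum-phase peaking filter set to -G is the exact reciprocal of the same filter set to +G, so the cascade flattens both amplitude and phase. (The RBJ biquad below is a generic stand-in, not the CP-10's actual circuit.)

```python
import numpy as np
from scipy.signal import freqz

FS = 48000

def peaking(f0, gain_db, q, fs=FS):
    """RBJ audio-EQ-cookbook peaking biquad; returns (b, a) coefficients."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A])
    return b / a[0], a / a[0]

boost = peaking(2000.0, +9.0, 4.0)   # stand-in for a peak measured in the system
cut = peaking(2000.0, -9.0, 4.0)     # symmetric filter, equal and opposite gain

freqs = np.linspace(200.0, 8000.0, 40)
h = np.ones(len(freqs), dtype=complex)
for b, a in (boost, cut):
    _, hi = freqz(b, a, worN=freqs, fs=FS)
    h *= hi
print(np.max(np.abs(20 * np.log10(np.abs(h)))))  # ~0 dB residual amplitude
print(np.max(np.abs(np.degrees(np.angle(h)))))   # ~0 degrees residual phase
```

In other words, the symmetric cut undoes the peak in both amplitude and phase, which is exactly what the asymmetric notch topologies above could not do.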

As previously stated, there was a lot of resistance to parametric EQ in those days because of the lack of a graphic user interface. The SIM analyzer gave us something better than a graphic interface: we could view the actual measured amplitude and phase of the EQ without repatching or taking it offline.

Transfer function measurement allows us to probe between any two points in the signal path of our sound system. We can monitor the EQ output against the EQ input and see precisely what response the device is creating. From the outset, the SIM system was set up to view the EQ's electrical response as well as the response of the speaker system in the room.
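
In code form, a dual-channel transfer function is simply the cross-spectrum of the two probe points divided by the input auto-spectrum. A minimal sketch, with noise through a hypothetical EQ filter standing in for the real signal chain:

```python
import numpy as np
from scipy.signal import csd, lfilter, welch

FS = 48000
rng = np.random.default_rng(0)
x = rng.standard_normal(4 * FS)  # stand-in for the signal at the EQ input

# Stand-in "device under test": one -9 dB, 1/3-octave-wide cut at 500 Hz
# (RBJ peaking biquad).
A = 10 ** (-9.0 / 40)
w0 = 2 * np.pi * 500.0 / FS
alpha = np.sin(w0) / (2 * 4.32)
b = [1 + alpha * A, -2 * np.cos(w0), 1 - alpha * A]
a = [1 + alpha / A, -2 * np.cos(w0), 1 - alpha / A]
y = lfilter(b, a, x)             # stand-in for the signal at the EQ output

f, pxy = csd(x, y, fs=FS, nperseg=4096)   # cross-spectrum, input vs. output
_, pxx = welch(x, fs=FS, nperseg=4096)    # input auto-spectrum
H = pxy / pxx                             # transfer function estimate (H1)
k = np.argmin(np.abs(f - 500.0))
print(round(20 * np.log10(abs(H[k])), 1), round(np.degrees(np.angle(H[k])), 1))
# Recovers the filter's ~-9 dB magnitude (and its phase) at 500 Hz without
# ever taking the device offline.
```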

The Digital Age

The digital era dawned with the introduction of digital delay lines. These replaced the previous generation of analog delays (yes, there were such things, but their dynamic and temporal range was very poor).

The first-generation digital delay was a noise floor choke point, so it was used sparingly, only when absolutely needed. The digital delay within a modern DSP differs from its first-generation version only in its higher dynamic range and resolution (and better A/D conversion).
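
A small sketch of the resolution point: a plain digital delay is quantized to whole samples, so a higher sample rate gives finer steps (fractional-sample interpolation, which many modern DSPs add, is omitted here):

```python
import numpy as np

def delay(signal, ms, fs):
    """Whole-sample digital delay: shift by the nearest integer sample count."""
    n = int(round(ms / 1000.0 * fs))
    return np.concatenate([np.zeros(n), signal])[: len(signal)]

for fs in (44100, 96000):
    print(fs, "finest step =", round(1000.0 / fs, 4), "ms")
# 44100 finest step = 0.0227 ms; 96000 finest step = 0.0104 ms
```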

The systems of that era had digital delays, analog equalizers, and analog level distribution, all in separate devices, each of which had only a few inputs and outputs.

Once digital equalizers started to hit the market, we quickly reached a tipping point in favor of merging all of these functions under one roof. Go to a rental house tomorrow and ask for a component digital delay or analog EQ. There will be hundreds to choose from once you blow the dust off.

There is great advantage to minimizing the number of A/D conversions, the wiring, patch bays, ground references, power supplies, etc. All of these functions are now done with multichannel input and output devices.

We have evolved to two families of DSP: open topology and fixed topology. The open topology systems (e.g., BSS Soundweb, QSC Q-SYS, Peavey MediaMatrix) are inputs, outputs and a mountain of malleable memory. They are an open interior waiting for us to arrange the furniture. Users can pull “devices” off the virtual shelf and “wire” them up to customize them as needed.

Meyer Sound CP-10 parametric EQs in the racks at Carnegie Hall circa 1988.

Fixed topology devices (e.g., Lake Controller, Meyer Sound Galileo) come with the parameters and signal routing pre-arranged, incorporating all the features relevant to system optimization (and more).

The filters in the modern DSP mine the filters of our analog world but can also go beyond them to make exotics. Very few of our optimization needs can't be solved by the analog filter shapes (parametrics, band filters and all-pass filters), so these are still the workhorses. Digital exotics such as FIR filters require adult supervision.
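
One example of why they bite, sketched with scipy (the tap count and band edges here are arbitrary): a linear-phase FIR long enough to work at low frequencies carries real latency.

```python
from scipy.signal import firwin

FS = 48000
# Hypothetical linear-phase band-pass FIR; tap count and band edges are arbitrary.
taps = firwin(2049, [80.0, 12000.0], pass_zero=False, fs=FS)
latency_ms = (len(taps) - 1) / 2 / FS * 1000  # group delay of a linear-phase FIR
print(round(latency_ms, 2))  # ~21.33 ms added to every signal passing through
# That delay (plus pre-ringing) is the price of the exotic magnitude shapes;
# hence the adult supervision.
```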

But there’s good news: a digital version of the graphic EQ can still be found as an option in most of these devices. Works great with vinyl.
