Tech Topic: Modern Analyzer Development

The SIM 1 platform gave us all the information we needed to set delays, levels and EQ, and to aim loudspeakers. The data could be stored in memory and post-processed.

In short, it could do almost everything a modern analyzer software program can do now. But because it was built around the “some assembly required” HP3582A, it required 63 steps, not to mention taking up 24 rack units, weighing 600 pounds and costing $80,000.

We did a lot of shows, learned a lot of lessons, and were ready to move forward to the next generation.

Our initial strategy was to get one of the companies that manufactured analyzers to undertake the project. They were the big players in the measurement market and we were a relatively small loudspeaker manufacturer. We laid out the plan but there were no takers for a market so small.

Hewlett-Packard did come up with a proposal to network 21 HP3582As together with a computer to display a composite response. This would have cost around $400,000, weighed almost a ton and required “only” 100 rack spaces. Very practical!

They already had customers that were using this type of system: U.S. Navy submarines. It was clear that we had to either get Pentagon funding or do it ourselves.

Bringing It Together

SIM II, introduced in 1991, was the first analyzer to put all the linear spans together into a single high-resolution display. This was accomplished by taking 8 separate time records, keeping only the top octave of each (its highest-resolution portion) and throwing out the rest. The linear pieces were spliced together as a log series to create a single trace of 24 points/octave data. It was not technically 1/24th-octave data, since each octave contained 24 linearly spaced data points, which is why it’s termed “quasi-log.”

Meyer Sound used the term “Constant Q transform” to describe this groundbreaking approach, and all of the modern analysis systems in large-scale use in sound reinforcement have since implemented an adaptation of it. Each octave (going down) is derived from a time record twice as long as the one above it, while maintaining the same number of data points. The result is constant, high resolution across the full range.
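
To make the mechanics concrete, here is a minimal sketch of that multi-time-window idea in Python with NumPy. The sample rate, window choice, band count and the quasi_log_spectrum name are my own illustrative assumptions, not the actual SIM or Smaart implementation; the point is simply that each lower band doubles the time record, keeps only the top octave of its FFT, and the pieces are spliced into a single quasi-log trace.

```python
import numpy as np

def quasi_log_spectrum(x, fs, n_bands=8, n_fft_top=96):
    """Multi-time-window FFT sketch: each lower octave band uses a time
    record twice as long as the one above and keeps only the top octave
    of its FFT, so every band contributes the same number of linearly
    spaced points (24 per octave with n_fft_top=96)."""
    bands = []
    n_fft = n_fft_top
    for _ in range(n_bands):
        seg = x[-n_fft:] * np.hanning(n_fft)        # time record ending "now"
        spec = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)
        lo, hi = n_fft // 4 + 1, n_fft // 2         # bins of the top octave only
        bands.append((freqs[lo:hi + 1], spec[lo:hi + 1]))
        n_fft *= 2                                  # next band: record twice as long
    bands.reverse()                                 # splice lowest frequencies first
    f = np.concatenate([b[0] for b in bands])
    mag_db = 20 * np.log10(np.concatenate([b[1] for b in bands]) + 1e-12)
    return f, mag_db

# Example: a 1 kHz tone at 48 kHz; one second of signal covers the longest record.
fs = 48000
t = np.arange(fs) / fs
f, mag_db = quasi_log_spectrum(np.sin(2 * np.pi * 1000 * t), fs)
print(len(f))   # 8 bands x 24 points/octave = 192 quasi-log points
```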

The introduction of SIM II represented the first time that people had seen A) full-range, high-resolution data, B) full-range log phase response, and C) coherence (signal-to-noise over frequency). This development created a permanent place for the FFT analyzer at front of house, because the data was now displayed in a way that the average engineer could understand. This did not mean that the work of optimization was suddenly easy, but it was, at last, understandable and teachable.
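
To show what those three displays correspond to in practice, here is a hedged sketch of a dual-channel transfer function estimate built on SciPy’s Welch-averaged spectra. The signal names, filter and averaging parameters are illustrative assumptions rather than SIM’s or Smaart’s internals; the point is that magnitude, phase and coherence all come from the same pair of reference and measurement signals, and coherence falls toward zero wherever noise dominates.

```python
import numpy as np
from scipy import signal

def transfer_and_coherence(reference, measured, fs, nperseg=4096):
    """Dual-channel FFT measurement sketch: compare the console (reference)
    signal to the measurement-mic signal using Welch-averaged auto- and
    cross-spectra, yielding magnitude, phase and coherence vs. frequency."""
    f, Pxx = signal.welch(reference, fs, nperseg=nperseg)           # reference autospectrum
    _, Pyy = signal.welch(measured, fs, nperseg=nperseg)            # measurement autospectrum
    _, Pxy = signal.csd(reference, measured, fs, nperseg=nperseg)   # cross-spectrum
    H = Pxy / Pxx                                  # complex transfer function (H1 estimate)
    mag_db = 20 * np.log10(np.abs(H))              # A) full-range magnitude response
    phase_deg = np.degrees(np.angle(H))            # B) phase response (wrapped to +/-180 deg)
    coherence = np.abs(Pxy) ** 2 / (Pxx * Pyy)     # C) 0..1 indicator of signal vs. noise
    return f, mag_db, phase_deg, coherence

# Example: the "measured" channel is the reference passed through a simple
# band-pass system with added noise, standing in for a loudspeaker in a room.
fs = 48000
rng = np.random.default_rng(0)
ref = rng.standard_normal(10 * fs)                 # ten seconds of noise as the source
b, a = signal.butter(2, [80 / (fs / 2), 12000 / (fs / 2)], btype="bandpass")
meas = signal.lfilter(b, a, ref) + 0.1 * rng.standard_normal(ref.size)
f, mag_db, phase_deg, coh = transfer_and_coherence(ref, meas, fs)
```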

SIM 3 (2003), the next generation, was very much in the mold of its predecessor: an incremental expansion of capability and ease of operation, joined by faster processing and a smaller size and price point, helping the optimization process become more mainstream.

FFT Goes Mainstream

The SMAART platform was introduced by SIA in 1995 under the leadership of Sam Berkow. The key breakthrough with SMAART was that it could run the FFTs on a standard PC without the need for a dedicated DSP engine. SMAART was a program, whereas SIM 3 was a 4-rack-space piece of hardware with internal A/D, DSP, delay line, etc. (and software).

SMAART required each user to provide all the peripherals, which was a challenge at the time, while SIM was complete but also very expensive (especially for a tool whose purpose many people did not yet understand). With SMAART there was “some assembly required,” but it implemented the quasi-log FFT, and its price point was so low that sales expanded quickly. The FFT had moved into the mainstream, though the majority of early SMAART users were basically using the tool as a high-resolution RTA.

We now had high resolution at FOH, but still, the mentality was equalization rather than optimization. SMAART (now stylized as Smaart) has evolved greatly under the guidance of Jamie Anderson and the team at Rational Acoustics, moving FFT optimization squarely into the mainstream. They have taken advantage of the ever-increasing processing power of the personal computer to expand Smaart’s capabilities with each of its eight generations.

I can still remember arriving at tunings 30 years ago and having people laugh at what a stupid idea it was to bring all this laboratory science into their world. I’d be lectured about how I lived in “theory world” and they lived in the “real world.” I don’t hear that anymore. The FFT is now the new normal. It just took a long time to get there.