To Infinity & Beyond: The Intersection Of Modern Production Technology

May 31, 2013, by Ken DeLoria

Live Sound International

As the world continually progresses, new technologies and methodologies inevitably present themselves.

Most development efforts are intended to help technology users improve the quality and efficiency of their work, while also turning a profit for their creators.

This cycle occurs in every industry from heavy construction to microscopic surgery. We, as the providers and operators of entertainment and communication equipment, are constantly in the midst of it all.

And we must adapt to these changes if we wish to stay relevant.

Original ideas come from many sources. Often, it’s a manufacturer that envisions a better way to accomplish a given task, and that vision drives the creation of a new product. Smart phones are a recent, topical example.

Other times, end users pick up the torch and run with it, wishing to make their workflow more efficient and themselves more marketable. Think smart phone apps, frequently developed by individuals or small groups, which foster productivity and convenience while sometimes also earning income for their creators.

The Urge To Merge
Humans are builders, developers, creators, designers, and informed users. It’s in our DNA and it’s there to stay.

Case in point: a new way of looking at the big picture is emerging in professional production. Formerly disparate practices (think sound versus lighting versus video) are beginning to intersect.

Soundcraft’s recently debuted Si Performer small-format audio console includes on-board DMX lighting control, and is an example of an ongoing paradigm shift in product design and live production capability.

Imagine how this might be taken even further. What if an audio console had a second small fader and a few knobs that controlled lighting, located next to each audio fader? A single engineer could bring up a lead guitar solo in the mix, while simultaneously controlling the brightness and color of the lights that are focused on the guitar player. Ditto for keys, drums, vocalists, and so on.

It would be hard to deny that this could be a valuable intersection of technologies with potentially wide appeal, especially for budget-challenged events.

With lighting controls adjacent to audio controls, the operator wouldn’t need to divert attention to a separate surface. “Tap” buttons might also be added to each channel to allow rhythmic flashing of lights. MIDI, DMX, or another protocol yet to be developed could make rapid scene changes that alter outboard audio effects, onboard plug-ins, and lighting scenes, all at the same time.
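To make that idea a bit more concrete, here is a minimal sketch in Python of what a combined audio-plus-lighting scene might look like inside such a console. Everything in it is hypothetical – the channel names, DMX addresses, and send functions are stand-ins, not any real console’s protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """One recallable 'look' covering both audio and lighting."""
    name: str
    fader_levels_db: dict = field(default_factory=dict)  # channel name -> level in dB
    fx_settings: dict = field(default_factory=dict)       # effect name -> parameter dict
    dmx_levels: dict = field(default_factory=dict)        # DMX channel (1-512) -> value (0-255)

def recall(scene, send_audio, send_dmx):
    """Push every stored audio and lighting value out in a single operation."""
    for channel, level in scene.fader_levels_db.items():
        send_audio(channel, level)
    for fx, params in scene.fx_settings.items():
        send_audio(fx, params)
    for dmx_channel, value in scene.dmx_levels.items():
        send_dmx(dmx_channel, value)

# Example: a "guitar solo" scene lifts the lead guitar in the mix and
# simultaneously brightens the wash focused on the guitarist.
solo = Scene(
    name="guitar_solo",
    fader_levels_db={"gtr_lead": 3.0, "keys": -2.0},
    fx_settings={"gtr_delay": {"mix": 0.25}},
    dmx_levels={12: 255, 13: 180},  # hypothetical intensity / color channels
)
recall(solo,
       send_audio=lambda target, value: print("audio:", target, value),
       send_dmx=lambda channel, value: print("dmx:", channel, value))
```

The point of the sketch is simply that one recall action, triggered by one operator, can address both domains at once.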

Or, perhaps the noise gates on the drum channels could be used to trigger various synchronized lighting effects within the same console, changeable as desired for different segments of a performance, either on the fly or by means of scene presets.
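Again purely as a sketch (hypothetical names, simplified metering), a gate-to-light link could be as simple as watching a drum channel’s level and sending a DMX value whenever the gate opens or closes:

```python
# A noise gate whose open/close transitions also drive a lighting flash.
# All names are hypothetical; a real console would evaluate this per
# audio block inside its DSP engine rather than in Python.

class GateToLight:
    def __init__(self, threshold_db, flash_dmx_channel, send_dmx):
        self.threshold_db = threshold_db
        self.flash_dmx_channel = flash_dmx_channel
        self.send_dmx = send_dmx
        self.open = False

    def process(self, level_db):
        """Call once per metering interval with the channel's current level."""
        if not self.open and level_db > self.threshold_db:
            self.open = True
            self.send_dmx(self.flash_dmx_channel, 255)  # gate opened: flash on
        elif self.open and level_db < self.threshold_db - 6:  # 6 dB of hysteresis
            self.open = False
            self.send_dmx(self.flash_dmx_channel, 0)    # gate closed: flash off

kick_gate = GateToLight(threshold_db=-20.0, flash_dmx_channel=21,
                        send_dmx=lambda ch, val: print("dmx:", ch, val))
for level in (-40.0, -12.0, -35.0):  # simulated kick-drum levels
    kick_gate.process(level)
```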

A future implementation might also include a button on each channel that would instantly repurpose the audio controls for lighting or video operation, thereby reducing the number of physical controls, the size of the console, and the cost of manufacturing.

Master Of The Universe
As far-fetched as it may seem right now, such “master controllers” might very well be the consoles of the future.

We’ll leave it to the product managers to conduct the market research to determine if such ideas are valid – or just vapid speculation.

But no one can deny that the value of employing a single system operator – instead of two or three – might make all the difference between economic success and failure for a wide range of production technology users, such as small and mid-sized churches, clubs, and various live events.

Moreover, the operator would naturally come to embrace the holistic properties of the event or presentation, rather than only concentrating on one narrow aspect such as sound or lights.

Of course, this scenario is not likely to fit large events whose sizable crews are already overworked. But such events are few and far between in contrast to the thousands and thousands of modest events that take place every day, all around the world.

A Preset Is A Preset?
Numerous presentations and shows are tightly scripted. A scripted event is in sharp contrast to, well, let’s call it an “open” performance, one in which the engineer must make decisions on the fly that might vary significantly from performance to performance. Jazz artists, jam bands, some rock bands, and most artists who improvise as part of their live performance fall into this latter category.

That said, the vast majority of theatrical shows, AV presentations, worship gatherings, and choreographed pop concerts are highly predictable. Most of the heavy lifting goes into setting up scene presets, usually by a gifted sound designer, and usually well in advance of the performance run.

So if scene presets are already being used to control audio, why not have the same scene presets control lighting as well? Sure, the lighting cues might not fall directly on the same clock cycle as the audio cues, but that could easily be controlled with a simple timer routine.

“Cue #10” might fade the lights to one-half and then five seconds later change audio to a series of new settings. But it would still be Cue #10 – instead of having no relationship to the audio cue list.
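That timer routine needn’t be anything exotic. Here is a minimal sketch – Python standard library only, with the actual lighting and audio actions reduced to placeholders – of a single cue that carries its own delayed actions:

```python
import threading

def fire_cue(actions):
    """Each action is a (delay_in_seconds, function) pair; all timers start together."""
    for delay, action in actions:
        threading.Timer(delay, action).start()

# Cue #10: fade the lights immediately, then change the audio five seconds later.
cue_10 = [
    (0.0, lambda: print("lights: fade to one-half")),
    (5.0, lambda: print("audio: recall new channel settings")),
]
fire_cue(cue_10)
```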

Dream Come True
The concept could go further by including motion control for scenery and property changes. Theme parks have been using synchronized control for their attractions for many years; how cool would it be to have all the control programmable and operable from one console, not several that are merely sharing time code? The value and simplicity of keeping a single piece of gear in stock for emergency replacement, rather than many, would make it a maintenance supervisor’s dream come true.

Events that require complex motion control, such as major concerts and more elaborate AV presentations, could benefit from folding truss movement, scenery changes, fog machines, and other motion control applications into the same master system that handles sound, video and lighting.

In the future, it’s not unlikely that we’ll see “master consoles” (with optional remote tablets, of course) that handle all of the show requirements, particularly when many of the routine tasks are pre-programmed and the operator is not expected to run complex audio, lighting and video on the fly, but rather to step through scenes that were programmed during pre-production.

The point is that significant advantages could be realized when all production aspects are managed by a master controller, instead of the individual, unrelated controllers that are the norm today.

Where’s The Copper?
And then there are data transport systems. We now have bi-directional digital delivery systems that can carry audio, video, Ethernet, RS-232 and RS-422 control signals, production communications, and in some cases, almost any other signals that need to get from point A to point B.

Data transport systems not only preserve signal integrity, but some allow for I/O break-outs anywhere they might be needed, and without requiring additional cable infrastructure.

When you think about it, a comprehensive data transport system is just a step or two away from becoming a console, or more to the point: a futuristic master controller.

The I/O is already there and microphone preamps are already present, so the next step is to add one or more control surfaces, DSP, control software, and perhaps DMX, AVB, MIDI, NTSC, HD, and whatever else might be needed to accomplish a full suite of tasks. Voilà! The new multimedia Master Control Console has just been born.

Perhaps the hardware comes from one source and the application software from another. Just as smart phones can run apps created by thousands of unrelated developers, perhaps apps will be superimposed on data delivery systems so that a client can shape them into whatever he/she might want. Now that would indeed be a true intersection of skill sets!

Same Or Different?
Will adherence to the traditional approach of dividing the workload among different production departments make this speculative outlook no more than a pipe dream? Or will the economic advantage of reducing headcount override the long-standing tradition of dividing tasks and responsibilities?

It will be interesting to see which manufacturers (and users) embrace a new approach to intersecting technologies, and which remain tied to the present mentality. As with most new ideas, some will be immediately excited while others will dismiss it outright.

But growth and advancement are built into our genetic coding, making it highly likely that some form of the concepts we view as far-fetched today will become standard operating procedure tomorrow.

I recently spoke with representatives from several audio console manufacturers. Some indicated that they do not see media and control intersection as being important at this time, while others said they’re watching this very closely.

But there’s no mistaking that our beloved entertainment and communication industry is in a state of transition; the question is the direction it’s headed.

As production technology inevitably marches forward, stay attuned to ways to maximize your personal value in the new paradigm of intersection.

Engineer #1: “The only constant is change.”
Engineer #2: “Yes, well it used to be that way.”
Engineer #1: “Oh. What happened?”
Engineer #2: “It changed.”

Ken DeLoria is senior technical editor for ProSoundWeb and Live Sound International and has had a diverse career in pro audio over more than 30 years, including being the founder and owner of Apogee Sound.


