Musical Time: Using a Conductor's Beats to Vary the Speed of Video and Other Media for a Live Show

In the 1980s I worked for the legendary Associates and Ferren, where we did things like build time-code-controllable film projectors for Pink Floyd, Roger Waters, and others. A click track (metronome) would be generated along with the film and sent to the drummer’s monitor, and the drummer would then synchronize himself to the fixed, immutable media. As long as the band stayed at that tempo and didn’t stop or jump ahead, this allowed the entire band to synchronize to the film.

This always seemed like a limitation to me, and one of the many reasons I wrote the first edition of my book (now in its 5th, self-published edition) back in the mid 1990s was that a trend seemed to be underway: using show control technology to give control over show elements to performers (rather than the other way around). Well, I was wrong; that trend died out, and rigid time-based approaches rule the world of show control today, even in some surprising places. But that’s a story for another blog entry; today I want to talk about using musical beats and measures to control linear media like video or audio.

Way back in 2006, a couple of years before I started this blog, my then-colleague Dr. David Smith, one of the developers of the Sinfonia virtual orchestra technology, and I worked together using his Sinfonia system and the show control system Medialon Manager to allow a musician to conduct lighting and video.

Sinfonia maintained a “tempo map” and could actually generate a prediction of when the next beat would occur, based on the tempo as currently conducted. We sent that prediction into Manager using MIDI (networks were not quite as ubiquitous back then; today this would likely be done using OSC or something similar over a network). Based on the beat prediction from Sinfonia, I then manipulated a timeline in Manager, and the timeline controlled (with varispeed) the Dataton Watchout video and also triggered light cues.
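To make the varispeed idea concrete, here is a minimal sketch (in Python, purely illustrative — this is not the Medialon implementation, and all names and parameters are my assumptions) of how a predicted next-beat time could be turned into a playback-rate multiplier for a media timeline authored at a fixed nominal tempo:

```python
# Hypothetical sketch: convert a conductor's predicted next-beat time
# into a varispeed playback rate for media authored at a fixed tempo.
# NOMINAL_BPM is an assumption about how the media was authored.

NOMINAL_BPM = 120.0                      # tempo the video/audio was authored at
NOMINAL_BEAT_SEC = 60.0 / NOMINAL_BPM    # media time between beats at rate 1.0

def playback_rate(now_sec: float, predicted_next_beat_sec: float) -> float:
    """Return the rate that makes the media's next beat land on the
    conductor's predicted next beat (clock times in seconds)."""
    wall_interval = predicted_next_beat_sec - now_sec
    if wall_interval <= 0:
        return 1.0                       # prediction already passed; hold nominal
    # Faster conducting shrinks wall_interval, pushing the rate above 1.0;
    # slower conducting stretches it, pulling the rate below 1.0.
    return NOMINAL_BEAT_SEC / wall_interval
```

For example, with media authored at 120 BPM (0.5 s per beat), a beat predicted 0.25 s away would call for playback at 2.0x, while a beat predicted 1.0 s away would call for 0.5x. A real system would also smooth these rate changes so the video doesn't lurch on every beat message.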

Here’s a video from a demo I did in May 2006, starting at the demo part at 8:28 (jump manually to that time if the embed doesn’t work right; rewind to 00:00 to see the whole presentation including technical explanation):

We presented the research at an American Society for Engineering Education conference and at LDI 2006, and got a write-up in Live Design which you can read here (archived capture here) and also in Lighting and Sound America (archive here). And that was about the end of it. Today, nearly 15 years later, it would be a LOT easier to implement this approach (the 2006 version of Manager didn’t even have floating-point variables!), but beyond experimental performance I haven’t seen this approach being used widely anywhere. The technical problems are easy to solve, so it seems that producers and performers just don’t want it (although positional tracking is widely used now on time-based shows—write-up here). But if you see any examples of people using musical tempo to vary the speed of linear media in real time on a show, please, let me know!
