Wired Article On NIN Tour

Too bad I missed the Nine Inch Nails show when it was in town (I was a big NIN fan back in the '90s, but I've kind of lost interest in recent years), because the tour looks pretty interesting tech-wise, and apparently features something I've long advocated (see the third edition of my book): performer-driven show technology.


From the Wired article:

... Reznor and other band members are able to trigger and control various video loops and effects directly from the stage. The musicians can also interact directly with those visuals onscreen during the show, thanks to a sophisticated array of sensors and cameras.

For the most part, those visuals come from Reznor and Rob Sheridan, Reznor's creative partner and the art director for NIN. But the two had considerable help from a few outside parties in putting together the production.

Roy Bennet, a veteran lighting designer who worked with Reznor on the Downward Spiral and Fragile tours, designed and put together the LiTS set according to Trent's initial specs.

It was also Bennet who suggested bringing in the other key part to the show, a company called Moment Factory.

Responsible for the technology driving most of the interactive tech elements, Moment Factory is a boutique Canadian outfit that's worked on a number of Cirque du Soleil shows and has produced other industrial visual installations.

For the interactive portions of the show, all the onscreen video is rendered by Moment Factory's custom rig, a trio of Linux-based devices collectively known as "the brain."

"They build what they call games," Reznor explains. "Each [interactive] song might have two or three settings ... or games. It's basically particle-based animation."

Those particles can interact with any of the various inputs Reznor and the band have selected.
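Reznor's "particle-based animation" description maps onto a very standard technique: a swarm of particles whose motion is steered by a live input, such as a tracked performer position. A toy sketch of the idea in Python (this is my illustration, not Moment Factory's actual system; all names and parameters here are invented):

```python
import random

class Particle:
    """One point in the swarm, with position and velocity."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0

    def attract_to(self, tx, ty, strength=0.05):
        # Accelerate toward an input point (e.g. a performer tracked by camera).
        self.vx += (tx - self.x) * strength
        self.vy += (ty - self.y) * strength

    def step(self, damping=0.9):
        # Damping keeps the swarm from oscillating forever around the target.
        self.vx *= damping
        self.vy *= damping
        self.x += self.vx
        self.y += self.vy

def simulate(n_particles=100, frames=60, target=(0.5, 0.5), seed=1):
    """Run the swarm for a number of frames with a fixed input position."""
    rng = random.Random(seed)
    particles = [Particle(rng.random(), rng.random()) for _ in range(n_particles)]
    for _ in range(frames):
        for p in particles:
            p.attract_to(*target)
            p.step()
    return particles

particles = simulate()
# Centroid of the swarm drifts toward wherever the input currently is.
cx = sum(p.x for p in particles) / len(particles)
cy = sum(p.y for p in particles) / len(particles)
```

In a real show, `target` would be fed each frame by the sensor/camera array, and each "game" would be a different rule for how the particles respond to it.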


For the most part, the article seems to get the tech details right, although they keep talking about "transparent" screens, which doesn't make any sense (see-through, maybe?), and they include this sentence:

As with any production of this magnitude, there are also the inevitable glitches and hiccups. According to Reznor and Sheridan, many of those can be traced back to an archaic Windows machine known as the Hippotizer, as well as an antiquated lighting console that it interacts with called the Grand Ma.

Of course, the GrandMA is a current and widely used product from MA Lighting, and the Hippotizer is also state-of-the-art stuff from Green Hippo.  The industry-standard control protocol they likely used (DMX) is most certainly archaic and antiquated, but there is absolutely no reason that these machines, both of which already use Ethernet, should not be running ACN.  In any case, both of these systems run countless shows without incident, so this all sounds like someone got something wrong in the telling, likely a result of pushing these products into areas for which they were not designed.

And the statement that "For the next leg of the tour, Sheridan is working to permanently move the entire lighting and visual system over to a Mac rig running ArKaos VJ software" doesn't really seem likely to me.  ArKaos (marketed by Rose Brand in the States as the core of their Panorama system) is pretty cool stuff (we have it at my school), but it wouldn't be, as the author might be envisioning, a matter of running the whole show off of one Mac laptop.  And there is a lot of Windows bashing in the article, but of course, Macs crash too.
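For readers wondering what "running ACN" would actually mean here: the streaming flavor, sACN (ANSI E1.31), wraps an ordinary 512-slot DMX universe in a UDP packet so it can ride the Ethernet these consoles already have. A rough sketch of building one E1.31 data packet by hand (the CID and source name are made up; a real rig would use a tested library rather than this):

```python
import struct
import uuid

def e131_packet(universe, dmx_data, sequence=0, priority=100,
                source_name="blog-sketch", cid=None):
    """Build one sACN (ANSI E1.31) data packet carrying a DMX universe."""
    if cid is None:
        cid = uuid.uuid4().bytes            # 16-byte sender ID
    dmx = bytes(dmx_data)[:512].ljust(512, b"\x00")

    dmp_len = 10 + 1 + 512                  # DMP layer: header + start code + slots
    frame_len = 77 + dmp_len                # framing layer to end of packet
    root_len = 22 + frame_len               # from root flags/length field to end

    root = struct.pack("!HH12sHI16s",
                       0x0010, 0x0000,            # preamble / postamble sizes
                       b"ASC-E1.17\x00\x00\x00",  # ACN packet identifier
                       0x7000 | root_len,         # flags + length
                       0x00000004,                # root vector: E1.31 data
                       cid)
    framing = struct.pack("!HI64sBHBBH",
                          0x7000 | frame_len,
                          0x00000002,             # framing vector: DMP wrapper
                          source_name.encode()[:63].ljust(64, b"\x00"),
                          priority,
                          0,                      # sync address (0 = none)
                          sequence,
                          0,                      # options flags
                          universe)
    dmp = struct.pack("!HBBHHH",
                      0x7000 | dmp_len,
                      0x02,                       # DMP vector: set property
                      0xA1,                       # address & data type
                      0x0000, 0x0001,             # first address, increment
                      513) + b"\x00" + dmx        # DMX start code + 512 slots

    return root + framing + dmp

pkt = e131_packet(universe=1, dmx_data=[255] * 512)
```

On the wire, packets like this go to UDP port 5568, typically multicast to a 239.255.x.y address derived from the universe number; the point is simply that there's nothing exotic about it, which is why the "archaic" framing in the article rings false.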

In any case, the article is worth a quick read, and if anyone sees the show or has any details, please post a comment!