The wow factor
If you’ve ever been to a big stadium concert, or even a big-budget stage production of a music or theater show, then you’ve already experienced the sheer power of live visual effects.
On a personal level, I remember seeing Muse perform a stadium gig as part of their Resistance Tour, way back in 2010.
I only knew the band through their music, never having subjected myself to poor-quality cellphone video of any of their concerts.
The live experience was, to put it mildly, eye-opening, in more ways than one. The stage design alone was worthy of its own lengthy analysis, and there was never a moment when the visuals distracted from the performance.
Every screen, every prop, every spotlight only contributed to the wow factor of seeing an enormous show play out on a massive scale.
But rock shows no longer run the game when it comes to big, bold, and jaw-dropping concert visuals.
Over the last 20 to 30 years, DJs and EDM artists have steadily grown in popularity, and along the way their live performances have become more elaborate, taking advantage of emerging technologies that let artists consistently up their game.
Here at Current Artisan, we were curious to find out how top-tier DJs, bands, and solo artists create such memorable and mind-blowing visuals for their live shows. After some intensive searching, we traced the best industry visuals back to a single brand.
In this article, we’ll take a look at the machine behind some of the most impressive DJ visuals software currently in existence: the disguise media platform.
Whether you’re a DJ yourself or just a fan hoping to gain a better understanding of how large-scale concerts come together, this will be a fascinating journey through high-level effects and the hardware and software that makes them possible.
Always in disguise
While doing research for this article, there was certainly a rabbit-hole effect at play. The tech behind concert visuals has become so advanced that understanding it takes a bit of background info.
So when disguise describes itself as a media server, an outsider like me still doesn’t have a good grasp of what the platform actually does.
Luckily, I had the chance to speak with a disguise expert: Naoki Ogawa, a certified disguise trainer from Japan, currently working in Los Angeles.
Ogawa walked me through the importance and capabilities of a media server like disguise.
Basically, a media server is a physical hardware server used to coordinate many different elements of on-stage visuals and effects, as well as to manage recorded content in post-production.
disguise, in particular, makes use of powerhouse processing to create and arrange a huge number of impressive visual effects.
What’s that? You’d like some proof that disguise is at the absolute forefront of concert visuals?
The following is a list of just some of the artists who have used disguise to create incredible concert experiences: Katy Perry, Muse, Beyonce, Billie Eilish, Post Malone, Foo Fighters, Chance the Rapper, The Killers, and The Rolling Stones.
So why has disguise become the go-to platform for the world’s biggest performers? Well, to summarize, it’s a kind of toolkit for video production and stage design professionals. It serves many different functions, many of which are on the cutting edge of technology, giving us a glimpse into the future of live performance.
Ogawa guided me through just a few of the key features of disguise and how they can be used to create next-gen visuals.
The dry run
Whether we’re talking about live concerts or big Hollywood movies, visual effects can only be planned to a certain degree. For movie studios, principal photography needs to be completed first, then the visual effects team can get to work on making those plans a reality.
For concert experience designers, planning is crucial, but it inevitably has its limits. In the past, it was impossible to accurately visualize the end product until finally setting up the rig in an actual venue.
The problem? Not every roadblock can be anticipated before the show itself. This can cause some major last-minute problems.
To circumvent some of the issues, disguise utilizes a feature that simulates all of the show’s visuals long, long before the house lights go down.
“The disguise media server provides us a simulation view in a virtual 3D environment, which helps the designers see how everything will look on stage. Of course, we usually still need to do some tweaking on-site, but the simulation is super helpful and saves the production a lot of money.”
Instead of having to stage a full-scale rehearsal in a concert space, a production can run the show right in the disguise server to look for problems or, even more importantly, areas for improvement.
Even the most popular DJs worldwide never really get to experience one of their own shows from the perspective of an audience member. disguise, however, can let performers do just that. Suddenly, the artists themselves can become directly involved in the visuals and effects for their own show.
With past concert visuals systems, this kind of pre-planning and collaboration just wasn’t possible.
Imagine that you’re planning a live concert, one that involves many different elements: lighting, onscreen visuals, audio, camera operation, and the choreography of the performers themselves.
You could run each of these elements separately, but then there would be many more ways for things to go wrong.
With disguise, all of these systems can be controlled and/or coordinated via a central hub. The disguise platform integrates every component of a live show.
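To make the idea of a central hub concrete, here is a minimal sketch of centralized show control. This is purely illustrative and not disguise’s actual (proprietary) API: one controller fans a single cue out to every registered subsystem, so lighting, video, and audio can never drift apart.

```python
# Hypothetical sketch of centralized show control -- not disguise's real API.
# A single cue fans out to every subsystem, keeping all departments in sync.

class Subsystem:
    def __init__(self, name):
        self.name = name
        self.last_cue = None

    def fire(self, cue):
        # A real system would trigger hardware here; we just record the cue.
        self.last_cue = cue

class ShowController:
    """Central hub: each department registers once, then receives every cue."""
    def __init__(self):
        self.subsystems = []

    def register(self, subsystem):
        self.subsystems.append(subsystem)

    def fire_cue(self, cue):
        for s in self.subsystems:
            s.fire(cue)

controller = ShowController()
lighting, video, audio = Subsystem("lighting"), Subsystem("video"), Subsystem("audio")
for s in (lighting, video, audio):
    controller.register(s)

# One call, three departments -- nothing can fall out of step.
controller.fire_cue("chorus-drop")
print([s.last_cue for s in (lighting, video, audio)])
```

Running the cue separately through three independent systems is exactly the "many more ways for things to go wrong" scenario; routing everything through one hub removes that class of failure.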
A popular technique in contemporary concert design is to make it appear as though the visuals are interacting directly with the music and the performers.
This helps transport audience members to an otherworldly place, one where music really does have a physical impact on the world around it.
For a long time, achieving this effect was difficult, expensive, and time-consuming.
To get around all this trouble, disguise uses a system called Blacktrax, which coordinates visuals based on the performer’s exact position on stage at any given moment.
“The Blacktrax system is a tracking system. If you set up the system in a specific area and attach sensors to the performer or object, you can get their position data in real-time. When we use that position data, we can make effects that follow performers on stage without the need for pre-programming.”
Just take a look at the video below, which Ogawa helped produce.
Using older systems, the performer would have to follow all choreography to a T, never allowed to make a mistake, lest the visuals get out of sync with their movements.
Blacktrax follows the performer’s movements exactly, communicating every movement to the disguise server instantaneously.
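As a rough illustration of how tracking-driven effects work in principle (this is a hypothetical sketch, not the real Blacktrax protocol or data format), imagine a sensor streaming stage positions while an effect smooths that data and follows along:

```python
# Hypothetical sketch of tracking-driven effects -- illustrative only, not the
# actual Blacktrax system. A sensor reports raw (x, y) stage positions; an
# exponential moving average smooths jitter before an effect follows them.

def follow(positions, alpha=0.5):
    """Return smoothed (x, y) targets for an effect tracking sensor data."""
    targets = []
    sx, sy = positions[0]
    for x, y in positions:
        # Blend each new reading with the running estimate to damp jitter.
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        targets.append((round(sx, 2), round(sy, 2)))
    return targets

# A performer walks stage-right; the effect trails smoothly behind,
# with no choreography pre-programmed anywhere.
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (4.0, 0.0)]
print(follow(path))
```

The key point the sketch captures is that nothing here depends on the performer hitting a pre-planned mark: the effect reacts to wherever they actually are.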
It helps bridge the gap between music and the real world, helping musicians and especially DJs seem less like performers and more like gods, capable of coordinating unbelievable visuals.
AR adds another dimension
An extension of this interactive visuals technology is disguise’s real-time rendering capabilities. For those who don’t work in video, rendering is the key final step of video post-production. It solidifies effects, editing choices, and the overall look of the video footage.
Despite our many advancements in processing power and digital storage, video rendering remains a slow and painful process.
Ogawa explained just how long rendering can take:
“Generally, one second of video is a series of 24 to 30 images. That means five minutes of video requires 9000 images to be processed. To render that many images, it normally takes two or three hours. If the contents are 3D, it takes even longer.”
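The arithmetic in the quote is easy to verify. At 30 frames per second, five minutes of footage works out to 9,000 individual frames; the per-frame render pace below is an assumption for illustration only, since real scenes vary enormously.

```python
# Back-of-the-envelope check on the frame counts quoted above.
FPS = 30                 # frames per second (24-30 is typical for video)
DURATION_S = 5 * 60      # five minutes, in seconds

frames = FPS * DURATION_S
print(frames)            # 9000, matching Ogawa's figure

# Assume ~1 second of offline render time per frame (purely illustrative):
SECONDS_PER_FRAME = 1.0
print(frames * SECONDS_PER_FRAME / 3600)  # total render time in hours
```

At that assumed pace the wait is about two and a half hours, which is why eliminating the render step changes the workflow so dramatically.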
disguise has done away with rendering wait times completely. Instead, the system makes use of real-time rendering. The obvious advantage is being able to make changes more quickly, but this capability also allows for the creation of more complex visuals.
One standout example is the use of AR effects, which Ogawa explained.
“Real-time rendering allows for more possibilities. With Blacktrax, we can get more complicated results, such as AR. When we use the camera tracking system, real-time rendering generates video content by calculating the camera angle and distance. As a result, we can create visual effects that exist on a different level.”
That’s right: AR isn’t just for Pokemon GO anymore. Now performers can feature interactive 3D models in their shows, and in some cases even interact with them.
DJs can dance alongside cartoony figures in bright colors. Audience members could be given on-theme virtual masks and headwear. The possibilities that spring forth from this advanced tech are nearly limitless.
Where do we go from here?
All of this technology can be a bit staggering. DJs and other contemporary performers can do just about anything they want to create an unforgettable concert-going experience.
It can be difficult to imagine anything more complex, more impressive, or more useful in a live performance space.
But Ogawa, disguise, and concert designers themselves are always on the lookout for the next big thing.
As music continues to evolve and advance, so will the tech we use to enhance musical experiences. It’s the best kind of arms race, one where the results are always positive, helping audiences around the world connect more viscerally with the music they’ve loved for so long.
For any aspiring DJs out there, you might not be able to afford the disguise server just yet, but the principles it highlights are universal.
Don’t let your visuals distract from the music. Everything, from lighting to camerawork to onscreen footage and visuals, should complement your music, your performance.
When sight and sound are seamlessly blended, musical nirvana is suddenly within reach.