In the midst of prevailing “doom and gloom” amongst our peers, we saw signs of hope: there was more effects work in movies than ever before, and the entertainment industry’s longstanding technology model had failed enough times that jettisoning it was an obvious part of the solution. Instead of building a giant “render farm” to perform the data processing required to create photo-realistic images for film, our idea was to build Atomic Fiction with the cloud as a core component of its business model. Doing so would allow us to save on up-front capital expenditure and give us the ability to scale up in risk-free lockstep with our clients’ creative appetites. At that time, an overwhelming majority of our most experienced colleagues had the same advice for us: “That’s a stupid and irresponsible approach. It’ll never work. Don’t do it.” That’s exactly how we knew it was an awesome idea!
In the ensuing years, we faced our fair share of challenges in the pursuit of a cloud-based visual effects workflow. We overcame each hurdle thanks to a combination of ingenuity and bull-headed naïveté, and we celebrated some amazing successes as a result! We were the first, and only, visual effects company to complete work on A-list projects like Transformers, Flight, Star Trek, Cosmos, and Game of Thrones entirely using cloud-based tools. Despite (or, perhaps, because of) those wins, our biggest challenge was still to come.
Robert Zemeckis, director of Back to the Future, Who Framed Roger Rabbit?, and Forrest Gump, had witnessed the benefits of our cloud-based workflow when he employed Atomic Fiction to craft the iconic plane crash sequence in Flight, starring Denzel Washington. In 2013, on the heels of that project’s success, he approached us and said “I want you guys to do The Walk for me. We’ve got an incredibly tight budget, but you’re going to need to recreate the Twin Towers and they have to look absolutely real. And you need to build 1974 New York from scratch. Oh, and it’s all going to be featured in about 30 minutes’ worth of the movie.” Our hearts jumped into our throats. Our palms started to sweat. And then, thanks to our faith in the future of the cloud, we said “Sure, Bob!”
Let’s take a look at the challenges that we knew our team would face on The Walk:
30 minutes’ worth of photo-real environments to create, start to finish, in 8 months.
Each second of screen time would, according to our calculations, require about 5,000 processor hours to realize.
Given the nature of the deadlines, our teams needed to be able to spike to 15,000 simultaneous cores on demand in order to stay on schedule.
Using traditional infrastructure, it would have cost $7 million to build and run the required render farm. Even amortized over its useful lifespan and billed monthly to the project, that would’ve set us back $2 million. We had a fraction of that to spend.
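Those requirements can be sanity-checked with some back-of-envelope arithmetic. This is just a sketch using the figures quoted above, not an account of how we actually budgeted the show:

```python
# Back-of-envelope check of the render-compute requirements quoted above.
screen_seconds = 30 * 60           # 30 minutes of final footage
core_hours_per_second = 5_000      # estimated processor hours per second of screen time
total_core_hours = screen_seconds * core_hours_per_second
print(f"{total_core_hours:,} core hours")  # 9,000,000 core hours

# How long the whole job would take running flat-out at the peak burst capacity:
peak_cores = 15_000
wall_clock_days = total_core_hours / peak_cores / 24
print(f"~{wall_clock_days:.0f} days at {peak_cores:,} cores")  # ~25 days
```

That 9-million-core-hour estimate is strikingly close to the 9.1 million core hours Conductor ultimately logged on the project.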
On top of the film’s requirements, we had two very simple business goals for our data processing infrastructure:
Because we don’t realize our profits until the end of the project, we want to spend as little as possible to get up and running.
We don’t want to carry any continuing expenses beyond the end of the project, just in case we have less (or no) rendering to do once the film is done.
Those requirements sound kind of insane in the traditional way of thinking because…well…they are! That scale of infrastructure is normally reserved for companies with 4-figure employee counts and, at that point, we were just cresting 100! We knew that, in order to achieve the necessary combination of scale and economics, we’d have to tap heavily into the cloud. Since no existing cloud rendering solution could address our needs at that scale, we decided to develop our own software, leveraging Google Cloud Platform, and we called it Conductor.
To understand why the technical demands on The Walk were so intense, it’s important to illustrate what rendering is. At the beginning of a shot’s life, an artist works with a very lightweight “wire frame” representation of a scene, which looks something like this:
Its relative simplicity makes the scene quick to interact with, so that an artist (or many of them) can work with programs like Maya and Katana to create “recipes” for how cameras move, how surfaces are textured, and how lighting falls on objects. Once that “recipe” is defined, it needs to be “put into the oven and baked.” This process of calculating how light is received by, and reflects off of, millions of individual surfaces is called rendering. Here’s the final result:
A single iteration of that one frame, which amounted to 1/24th of a second of The Walk, had 83GB of input dependencies and took 7 hours to render on a 16-core instance. That’s impressively intense on its own, right? Now imagine 1,000 instances all working on other, similarly intensive, parts of the film at the same moment in time!
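To put that single frame in context, here’s a rough sketch of the per-frame math. It assumes just one render pass per frame; in practice shots go through multiple iterations, which is what pushes the real cost toward the ~5,000 core-hours-per-second estimate mentioned earlier:

```python
# Core-hour cost of the single frame described above.
cores = 16
render_hours = 7
core_hours_per_frame = cores * render_hours   # one iteration of one frame
frames_per_second = 24                        # film runs at 24 frames per second
core_hours_per_screen_second = core_hours_per_frame * frames_per_second
print(core_hours_per_frame)          # 112
print(core_hours_per_screen_second)  # 2688 (a single pass only)
```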
In order to handle that kind of scale, our artists turned to Conductor: a general-purpose cloud orchestration platform. Conductor manages the entire cloud rendering workflow from end to end: it handles uploads and downloads of dependent data, queuing, and security, and it scales cloud resources while maintaining optimal performance. Thanks to its efficiency, availability of resources, and per-minute billing, Google Cloud Platform was the clear choice to serve as Conductor’s back end.
By the time Atomic Fiction had finished The Walk, battle-hardening Conductor on Google Cloud Platform in the process, its team of talented artists had created effects sequences that Deadline Hollywood called the “best I have seen all year” and USA Today crowned “a visual spectacle.”
The stats associated with The Walk’s production are immensely impressive, and constitute the largest use of cloud computing in the history of filmmaking. Here’s the final Conductor score card:
In total, 9.1 million core hours - over a millennium’s worth of computing - were used to render The Walk.
Movie projects are heavily back-loaded, and The Walk was no exception: 2 million core hours were consumed in the final month alone.
Peak scale of over 15,000 simultaneous cores was achieved while rendering I/O-intensive tasks.
Artist productivity increased by 20%.
Savings vs. traditional infrastructure were in excess of 50%.
Because Atomic Fiction had Conductor at its disposal, Zemeckis was able to realize 4 more minutes of the movie’s climactic scene than the budget would otherwise have allowed.
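The “over a millennium” figure in that score card checks out; here’s a quick sketch using the totals above:

```python
# Express 9.1 million core hours as years of single-core computing.
total_core_hours = 9_100_000
hours_per_year = 24 * 365
years_of_compute = total_core_hours / hours_per_year
print(f"~{years_of_compute:.0f} years of computing")  # ~1039 years, i.e. over a millennium
```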
When looking at The Walk’s Conductor processor hours graphed out monthly, it becomes strikingly obvious how varied processing needs are over the course of a project, and how Conductor allowed Atomic Fiction to scale to huge heights without leaving any legacy weight to carry to the next project.
The true proof of the pudding came when Robert Zemeckis, having just screened the final cut of the film, called Atomic Fiction co-founder and visual effects supervisor Kevin Baillie. Said Zemeckis, “I think this is the most spectacular effects movie I’ve ever made. I can’t imagine making a movie in any other way.” Coming from a director who has pushed the art of visual effects perhaps further than any other filmmaker alive, that’s the highest honor imaginable.