Retour à Hairy Hill rendered on Conductor®

How a UV Map Aversion and ‘Frozen’ Documentary Inspired the Papery Characters of Daniel Gies’ Latest Animated Short

‘Retour à Hairy Hill’ may look like a stop motion production, but the animated short is entirely computer generated. Directed by Daniel Gies, co-founder of E.D. FILMS, it features characters made of digital folded paper, both a creative and technical feat achieved through a mix of animation styles. About two-thirds of the film was rendered in the cloud and the remainder via real-time game engine. While technology was essential to bringing the project to life, Gies wanted it to remain invisible, a principle that has guided the entirety of his artistic career.

“I was really drawn to capture something that’s more handmade and create a feeling of moving through a watercolor painting,” Gies explained. “I love the feeling of paper and how working with pen and ink produces imperfect results. It’s easier to fix digital art, but that can also lead to a rigidness in the design and aesthetic.”

The film is scheduled to debut in 2023; a teaser for ‘Retour à Hairy Hill’ can be found here.

Envisioned as a tribute to family folklore, the fantastical fable is based on the true story of a young woman in the northern wilderness. More than ten years in the making, the film was created by a handful of artists, with one or two working on it at any given time. Gies and his team, which included keyframe animators from Agora Studio, used a variety of creative tools on the short, including Autodesk Maya, SideFX’s Houdini, and Epic Games’ Unreal Engine, and rendered in the cloud using Redshift through Conductor. The character textures began as hand-drawn ink outlines, and various scene elements, like trees, were hand painted, while others were digitally painted, an approach that brought a physical essence to the production.

Oddly enough, the short’s aesthetic is the result of Gies’ distaste for UV mapping in 3D. He shared, “If you’re trying to draw on a 3D object, it’s not that easy. One day I built a paper puppet with an idea of what a UV-mapped face would look like. I started drawing it freehand, then I cut it out and played around with it. I was able to visualize what a paper maquette could do without having to measure and build it first. I could almost deconstruct it in my head. That turned into, ‘Wow, I could do this in the computer and actually animate it!’ I love stop-motion, but knew that I’d never be able to animate it properly.”

After determining the creative path, animation and texture proved the biggest challenges for Gies and team. While digital tools excel at cloth simulations, simulating realistic paper is far more difficult due to the material’s inherent rigidity. To achieve the desired look, they used Vellum in Houdini and kept meshes low resolution, so the output behaved more like paper. They defined the paper look after experimenting with different paper varieties and examining how light passed through them. During their research, they noted how ink would show through paper when backlit and wanted to re-create that effect when rendering. To ensure the paper had the proper thickness, the team tried various techniques, including displacement maps. They ended up making a high-resolution paper character that Gies sculpted and wrapped onto the low-resolution cloth-simulated mesh.
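Binding a detailed sculpt to a coarse simulated mesh, as described above, is commonly done with a wrap deformer: each high-resolution point is stored as barycentric coordinates on its closest low-resolution triangle plus an offset along that triangle’s normal, then re-evaluated every frame after the simulation moves the coarse mesh. A minimal NumPy sketch of the idea (all function names are illustrative; this is not E.D. FILMS’ actual pipeline code):

```python
import numpy as np

def bind_wrap(hi_pts, lo_pts, lo_tris):
    """For each high-res point, find the closest low-res triangle and
    store (triangle index, barycentric weights, signed normal offset)."""
    binds = []
    for p in hi_pts:
        best = None
        for ti, (a, b, c) in enumerate(lo_tris):
            A, B, C = lo_pts[a], lo_pts[b], lo_pts[c]
            n = np.cross(B - A, C - A)
            area2 = np.linalg.norm(n)   # twice the triangle area
            n = n / area2
            # Project p onto the triangle's plane, then compute barycentrics.
            off = np.dot(p - A, n)
            q = p - off * n
            w_a = np.dot(np.cross(B - q, C - q), n) / area2
            w_b = np.dot(np.cross(C - q, A - q), n) / area2
            w_c = 1.0 - w_a - w_b
            if best is None or abs(off) < abs(best[2]):
                best = (ti, (w_a, w_b, w_c), off)
        binds.append(best)
    return binds

def apply_wrap(binds, lo_pts, lo_tris):
    """Re-evaluate each bound point on the deformed low-res mesh, so the
    high-res sculpt follows the cloth simulation."""
    out = []
    for ti, (wa, wb, wc), off in binds:
        a, b, c = lo_tris[ti]
        A, B, C = lo_pts[a], lo_pts[b], lo_pts[c]
        n = np.cross(B - A, C - A)
        n = n / np.linalg.norm(n)
        out.append(wa * A + wb * B + wc * C + off * n)
    return np.array(out)
```

In a DCC this corresponds to off-the-shelf nodes (Maya’s wrap deformer, Houdini’s point deform), which is presumably why, once the link to the animated mesh was made, the setup “just worked.”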

“I couldn’t find any tutorials or anything on how to do this, then I watched a documentary about the cloth simulation in ‘Frozen,’ and there it was! You basically link your character to the original animated mesh to make it work,” Gies shared.

Gies and team rendered nearly 70 percent of the short in Redshift on Conductor, including the interiors with a hand-painted feel, and the remaining 30 percent in Unreal, covering the sequences where the main character leaves the house. After downloading the Conductor plugin, they were able to set up Redshift renders directly in Maya and submit them to the cloud. Before running the render in full, Conductor double-checked scenes to make sure all the necessary files were present and linked.
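The pre-flight step mentioned above, verifying that every file a scene references actually exists before the job ships to the farm, can be approximated with a simple dependency check like this (a hypothetical helper for illustration, not Conductor’s actual implementation):

```python
import os

def preflight(dependencies):
    """Given a mapping of scene nodes to the file paths they reference
    (textures, caches, scene references), return the entries whose files
    are missing on disk, so a job isn't submitted with broken links."""
    return {node: path for node, path in dependencies.items()
            if not os.path.isfile(path)}
```

A submission tool would run a check like this and refuse to upload, or warn the artist, when the returned mapping is non-empty.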

“With Conductor, it feels like the technology is speaking to artists who don’t really have the head space to tackle the complexity of render wrangling, and it’s a super straightforward interface,” Gies concluded. “It’s very easy to go online, check your account, and see the state of your shot. You can watch the frames being rendered, and see if a shot failed. As you download your footage, it’s sent directly to your Maya folder, just as if you rendered it on your own computer. The user experience is impressive.”
