Union VFX
It was the biggest single shot Union VFX had ever created, and it posed a big challenge: how would the studio find the extra storage and processing power required to create this ambitious, intricately detailed rocket take-off scene whilst running other projects concurrently?
Union was commissioned to create a complex scene for the “epic season finale” which closed out the first series of For All Mankind, a dystopian drama for Sony Pictures Television that premiered on Apple TV+. The show was created by Ronald D Moore, a screenwriter and television producer famed for his work on Star Trek and Battlestar Galactica, and is set in an alternative version of the 1960s USA in which Russia won the space race and beat America to land the first man on the moon.
The studio needed to scale its hardware quickly but carefully, and worked with pixitmedia to build a high-performance pipeline capable of bursting to the AWS Cloud, supported by ngenea.
“We were under time pressure and working with quite a few variables,” says Lucy Cooper, Managing Director of Union VFX.
“When it came to building the pipeline, we had to move fast! The sheer volume of data necessitated the solution we chose.”
“We worked closely with pixitmedia to refine our studio workflow. ngenea gave us a safe and secure way of getting the data into the cloud, and back again, in an intelligent way that would keep transfer costs as low as possible.”
The final episode of For All Mankind’s first season concludes with the launch of a Sea Dragon rocket: an immense spacecraft designed to be launched at sea. This behemoth was dreamed up by Robert Truax for Aerojet in 1962, standing 150 metres (490ft) tall with a diameter of 23 metres (75ft). Though never constructed, Sea Dragon is the biggest rocket ever conceived by humanity.
The team started work in August 2019 on the 4K single shot of the rocket launch, consisting of 2,544 frames. The rocket was modelled and textured using SideFX Houdini, but because it launched from under the ocean, artists also had to create simulations for the surrounding water, foam, fire and smoke generated as the craft blasted skyward.
The whole shot was built in Nuke and featured many other CG elements, including helicopters, buoys, pyro thrusters and the engine’s plume. Because of the length of the shot, each of these had to be generated with multiple separate simulations and renders, requiring complex interactions between the elements over time. Artists then used digital matte painting techniques to create the sky and the aircraft carrier USS Enterprise, shown in the deep background of the composite.
The renders and simulations involved in the shot were huge, and simply too data-intensive for Union’s on-premises systems alone. The rocket plume, for instance, was the most demanding part of the project and took eight days to simulate on powerful systems fitted with 96 CPUs and more than 768GB of RAM working flat out to generate 4 billion voxels.
To complete the project, Union turned to AWS Thinkbox Deadline for render management and pixitmedia’s ngenea. The AWS Cloud setup was configured to automatically instruct ngenea to replicate only the required dataset to pixstor, a storage solution built to handle demanding media and VFX workflows. The job was then rendered in the cloud using pixstor as primary storage, with Deadline ensuring ngenea migrated only finished content rather than temporary data – saving time and storage space. When the project was completed, the data was archived to Amazon S3 Glacier to save on cost.
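The exact Deadline and ngenea integration is proprietary, but the “archive only the finished content” idea can be sketched in a few lines of Python using boto3, the AWS SDK. Everything below – the bucket name, paths and file-type rules – is a hypothetical illustration, not Union’s actual pipeline code.

```python
# Illustrative sketch only: archive finished render output to Glacier-class
# storage, leaving temporary caches behind. Bucket, paths and extensions are
# hypothetical; the production used ngenea/Deadline rather than this script.
from pathlib import Path
import boto3

s3 = boto3.client("s3")
BUCKET = "studio-archive"        # hypothetical archive bucket
FINISHED = {".exr", ".mov"}      # finished content worth keeping long term

def archive_shot(shot_dir: str, prefix: str) -> None:
    """Upload only finished frames to the GLACIER storage class."""
    for path in Path(shot_dir).rglob("*"):
        if not path.is_file() or path.suffix not in FINISHED:
            continue  # skip sim caches and other temporary data to cut transfer cost
        key = f"{prefix}/{path.relative_to(shot_dir)}"
        s3.upload_file(
            str(path), BUCKET, key,
            ExtraArgs={"StorageClass": "GLACIER"},  # archive tier to save cost
        )

# Hypothetical usage for a delivered shot:
archive_shot("/mnt/pixstor/projects/fam_ep10/renders", "for_all_mankind/ep10")
```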
Lucy adds: “We were quite invested with pixitmedia and were in the process of upgrading our studio to pixstor before we won the job, so we approached them to see if we could use ngenea to manage storage during the production of For All Mankind. Having virtual versions of files in multiple locations was both useful and cost-effective.”
The ngenea platform allows dynamic data management and is designed to enable globally distributed workflows. It was developed to let customers quickly and securely move data between the cloud and other forms of storage – automatically placing data on “right cost” resources. It’s particularly useful for a demanding VFX pipeline set up to store and share huge amounts of data.
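ngenea’s tiering policies are its own, but the “right cost” concept is similar in spirit to an S3 lifecycle rule that automatically moves ageing data to cheaper storage classes. The following is a minimal sketch of that analogy in Python with boto3; the bucket name, prefix and timings are invented for illustration and say nothing about how ngenea is implemented.

```python
# Illustrative analogy only: move colder data to cheaper tiers automatically.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="studio-archive",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-down-finished-shots",
            "Status": "Enabled",
            "Filter": {"Prefix": "for_all_mankind/"},  # hypothetical project prefix
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 90, "StorageClass": "GLACIER"},      # long-term archive
            ],
        }]
    },
)
```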
“As we were already a pixstor/pixitmedia customer and were aware of their roadmap for the ngenea product, it was definitely the best direction for us on this project,” says Marc Brewster, Head of Technology at Union VFX. “Its ability to intelligently combine disparate ‘lumps’ of storage into a single namespace means that, whether it’s cloud storage or other on-premises non-pixstor storage, we can maximise usage and reduce data duplication and transfer.”
“All studios have data battles now,” Lucy says. “No matter how much on-prem storage you have, it gets filled up. At the rate hardware is developing at the moment, it doesn’t make sense for us to buy hardware over and over again. Although we have hardware on-prem, the model of flexing and expanding capacity when it’s needed is very attractive. It makes us more agile and flexible.”
Building a pipeline can be a big challenge – particularly when working under time and cost pressure.
Lucy has this advice for anyone looking to ramp up their system and build a powerful new workflow: “Try to ring fence time to plan and work through the whole process of building a pipeline. Then look at the balance of what you can do in-house and what makes sense to do in the cloud.”
“For us, it makes most sense to do predictable work in the cloud so on-prem capacity can be used to cope with the more unpredictable, experimental stuff. This approach allows you to predict the price of cloud rendering reliably, which is essential.”
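One reason predictable work suits the cloud is that its cost reduces to simple arithmetic: frames, multiplied by render time, multiplied by an hourly rate. The sketch below shows that calculation in Python; the shot length comes from this article, but every rate and timing is an assumed placeholder, not a figure from the production.

```python
# Back-of-envelope cloud render cost estimate. All rates below are
# hypothetical placeholders, not figures from For All Mankind.
frames = 2544                    # length of the single shot
core_hours_per_frame = 4.0       # assumed average render cost per frame
instance_cores = 96              # assumed cloud instance size
instance_price_per_hour = 4.00   # assumed USD price for that instance

total_core_hours = frames * core_hours_per_frame
instance_hours = total_core_hours / instance_cores
estimated_cost = instance_hours * instance_price_per_hour

print(f"{total_core_hours:.0f} core-hours, "
      f"{instance_hours:.0f} instance-hours, "
      f"roughly ${estimated_cost:,.2f}")
```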
The collaboration between pixitmedia and Union demonstrates how studios can quickly upgrade their systems to deal with demanding projects. Barry Evans, pixitmedia CEO and co-founder, adds: “When studios land new commissions, they often have many other projects running at the same time – and find they need to ramp up their storage and rendering infrastructure very quickly.”
“The partnership between pixitmedia and Union gave the studio the ability to overcome a shortfall in storage and render power, and helped them successfully deliver this data-hungry, epic project on top of their existing workload.”