‘Everest’ visual effects powered by Verne Global’s green Icelandic data centre

ComputerworldUK speaks with Rui Gomes, CTO at RVX

Icelandic director Baltasar Kormákur’s survival thriller Everest was lauded for its dazzling visual effects upon release this September.

But recreating the world’s highest mountain for the big screen was no small feat.

“Basically we built Everest,” says Rui Gomes, CTO at Reykjavik-based visual effects firm RVX, a task that he explains was “extremely” data-intensive. “[For] Everest we used over 200 terabytes of data, and many, many hours of rendering,” he adds.

RVX was set up as a branch of UK visual effects studio Framestore in 2008, before rebranding two years ago. It has since helped produce films such as 2 Guns, The Deep and, most recently, Everest.

To support its growing data demands as it takes on new projects, RVX partnered with co-location provider Verne Global, which supplies the high performance computing (HPC) capacity behind its graphics rendering.

The Verne Global data centre, which runs on 100 percent renewable energy, opened in 2012 on a 44-acre campus at a former NATO base in nearby Keflavik.

“I needed a place to host our equipment and it was important for us that it was as 'green' as possible,” says Gomes, adding that Verne Global “offered us the best conditions that met our requirements”.

Previously, RVX had relied on an on-premises data centre that became difficult to manage as the business grew, requiring new cooling and additional hardware that could take weeks to provision.

“We would reach the point where we need to scale past what our server room here [could handle]. Then it would be a two-month project just to get it to the next level, and we would reach that level during Everest. And then it would be on to the next level again.” 

One of RVX's biggest technical challenges is ensuring that its visual effects artists are not constrained by the growing volumes of data.

“Things move quickly [with film production]: you say 'here is the footage', then all of a sudden you have 20 terabytes of data coming in the next day,” he says, adding that each film places ever larger demands on its IT infrastructure.

RVX now has around 500 TB of data stored at Verne Global’s facilities, using around 3,000 processor cores. “And it is growing: every time a new project comes it is typical we need to double,” says Gomes.
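Taken at face value, that doubling rule makes capacity planning a simple geometric projection. The Python sketch below is purely illustrative: the 500 TB starting point comes from the article, while the project count and output format are assumptions.

```python
# Illustrative only: projects storage growth under the "roughly double
# per project" rule of thumb Gomes describes. The 500 TB starting
# figure is from the article; everything else here is assumed.

def project_storage(start_tb: float, projects: int) -> list[float]:
    """Return the projected total (in TB) after each successive project."""
    totals, total = [], start_tb
    for _ in range(projects):
        total *= 2  # doubling assumption quoted in the article
        totals.append(total)
    return totals

if __name__ == "__main__":
    for i, tb in enumerate(project_storage(500, 3), start=1):
        print(f"After project {i}: {tb:,.0f} TB (~{tb / 1000:.1f} PB)")
```

Run as-is, this prints 1 PB after the next project and 4 PB two projects later, which makes clear why scaling an on-premise server room in two-month increments could not keep up.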

The move has also allowed RVX to run its infrastructure at lower cost.

“Verne worked with us to put up what they call PowerDirect. So basically you have direct cooling from outside and direct power. It is not backed up by UPS [uninterruptible power supply] or anything like that; I take care of that and have my own UPS at very, very good cost,” he says.

“The cost benefit is the best we could get. It is a tier one [co-location arrangement] because most of the time these machines don't contain any data, they do the calculation and they save it into storage, so we move all of our rendering there.”
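That split between disposable compute and protected storage is easy to picture in code. Below is a minimal Python sketch of the pattern Gomes describes, assuming a shared filesystem mount; the paths and the stand-in renderer call are hypothetical placeholders, not RVX's actual pipeline.

```python
# Minimal sketch of the stateless render-node pattern Gomes describes:
# the node itself holds nothing of value, so losing one in a power blip
# at a tier one facility only costs the in-flight frame. All paths and
# the renderer call are hypothetical placeholders, not RVX's setup.
import shutil
from pathlib import Path

SHARED_STORAGE = Path("/mnt/storage")  # durable, UPS-protected tier
SCRATCH = Path("/tmp/render")          # local and disposable

def render_frame(scene: Path, frame: int) -> Path:
    """Render one frame locally, then move the result to shared storage."""
    SCRATCH.mkdir(parents=True, exist_ok=True)
    local_out = SCRATCH / f"{scene.stem}.{frame:04d}.exr"

    # Stand-in for the real renderer invocation (a CLI call in practice).
    local_out.write_bytes(b"rendered pixels go here")

    # Only the finished frame is persisted; until this point the node
    # carries no state worth backing up.
    final = SHARED_STORAGE / "frames" / local_out.name
    final.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(local_out), str(final))
    return final
```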

Move to the cloud?

While such peaky workloads might seem suited to the public cloud, Gomes says RVX decided against shifting its data out to the likes of Amazon Web Services or Google. Latency would make certain editing processes too slow, for example, while contracts with large film studios such as Universal impose tight restrictions on where footage can be kept.

“The data centre is not dead,” says Gomes. “For contractual reasons we can't render in a Google or Amazon cloud because we need to keep all the footage within our controlled environment. I need to vouch at every step that I know who has access and where that footage can go.”
