Article - Issue 59, June 2014

Creating the Impossible

Stephen Harris

The 3D space thriller Gravity dominated this year’s Academy Awards, winning in seven categories. Although the film features live action footage of the actors, computers were used to create just about everything on screen apart from the actors’ faces. The company behind those effects, which collected the Academy Award for Best Visual Effects, is one of several in the UK that lead the way in providing visual effects for film.

Filmmakers have used special effects (SFX) since the early days of cinema, providing substitutes for actions and events that are impossible, or too expensive, to stage. Over the years, the technology has moved from building, and destroying, scale models to computer-generated visual effects (VFX) like those that contributed to the success of Gravity at this year’s Oscars.

The film industry in the UK has a long history of producing memorable special and visual effects, from Hammer horror films and Stanley Kubrick’s 2001: A Space Odyssey through to the Harry Potter series, which began filming in 2000. Spurred on by the Potter phenomenon, the UK’s VFX industry has established a dominant position through innovations in computer animation.

There are three companies in particular – the Moving Picture Company (MPC), Double Negative (DNeg) and Framestore, which was responsible for most of the work on Gravity – that have worked on many of the most effects-heavy films of the past decade. Their successes have included five of the last six productions to win VFX Academy or Bafta awards.

The VFX industry is estimated to generate over £200 million a year in the UK. This commercial and artistic success is cited as one reason for the recent increase in state support for the UK’s film industry, raising tax relief rates and lowering spend thresholds (the minimum amount of the film’s budget that has to be spent in the UK).

To achieve such success, the VFX firms have had to develop software tools and computer systems that enable them to produce ever more realistic images. Some of these processes have been refinements of how different software packages work together and how data transfer is managed. Other progress has come from bringing innovative new approaches to computer simulation.

THE ROLE OF VFX

Filmmakers want ever-greater detail and realism from computer-generated imagery (CGI) and are making greater use of it because it offers more flexibility in telling a story. CGI is created wholly within a computer, while VFX are effects rendered by a computer onto real or computer-generated images. CGI can enhance sets, tailor costumes, create special effects and conjure the illusion of casts of extras.

For the film Gravity, VFX were not just part of the post-production process but integral to the planning, storyboarding and shooting of the film, an aspect of ‘virtual production’ where the movie and the world it portrays are put together inside a computer.

One significant advance in SFX technology has been the development of industry-standard software packages. Software tools from companies such as Autodesk and The Foundry, a hugely successful VFX software developer also based in the UK, have to a degree democratised the VFX process. Even small firms can now use the kind of tools once available only to big studios.

For their part, big studios have also begun to release more open-source software that has helped the whole industry to improve its skills and allowed smaller UK firms to adapt these programs into more capable tools. One way to improve industry packages has been to build software that works across a range of applications by creating ‘plug-ins’, so that creative artists can carry out tasks more quickly and can combine different simulation tools for one sequence.

Part of the increasing realism of CGI is down to the ability of computers and software to handle greater numbers of what the industry calls ‘assets’ – computer models of characters, objects or entire film sets – in ever-finer detail. CGI artists build these assets as assemblies of polygons. Film assets that once comprised a few thousand polygons now consist of millions – see Game On (Ingenia 43).

LIGHTING GRAVITY

VFX artists use a method known as ray tracing to simulate the path of photons around a scene and to ‘illuminate’ their animations. Ray tracing generates an image by tracing the path of light through pixels in an image plane and simulates what happens when the light interacts with virtual objects.

Ray tracing generates visual realism but requires significant computation time, which makes it best suited to applications where the image can be rendered (translated from mathematical approximations of 3D objects into a finished visual representation) slowly, ahead of time. The technique can simulate a wide variety of optical effects, such as reflection and refraction, scattering and chromatic aberration.
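
To make the idea concrete, here is a minimal Python sketch of the core ray-tracing loop: fire a ray through a pixel, find where it hits a sphere, and shade that point by its angle to the light. It is an illustration of the principle only – production renderers layer reflection, refraction, scattering and many other effects on top of this basic step.

```python
# Minimal ray tracing sketch: trace one ray into a scene containing a
# single sphere and shade the hit point with simple diffuse lighting.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def intersect_sphere(origin, direction, centre, radius):
    """Return distance along the ray to the sphere, or None if missed."""
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant (a == 1 for a unit direction)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, centre, radius, light_dir):
    t = intersect_sphere(origin, direction, centre, radius)
    if t is None:
        return 0.0  # ray escaped the scene: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, centre)))
    # Lambertian term: brightness falls off with angle to the light
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# One ray through the centre of the image plane towards a sphere
brightness = shade((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0, normalize((1, 1, 1)))
print(f"pixel brightness: {brightness:.3f}")
```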

The film Gravity was unusual in that almost every frame was animated in the computer before shooting began. The filmmakers then planned how to light the shooting of the real actors according to the details of the CGI, rather than the other way around.

The filmmakers needed to change the light on the actors’ faces to give a realistic impression that they were spinning through space. An actor actually spun on Earth, for example, would show blood rushing to the face, which does not happen in zero gravity.

Framestore helped develop a 10 m³ box lined with nearly 200 panels – each carrying more than 4,000 LEDs – to recreate the light from the computer’s ‘virtual set’. This allowed the filmmakers to place the actors inside the box and to alter the light around them. A camera on a robotic arm then moved according to the shots as described by the computer animation.

This also allowed the filmmakers to alter the lighting scheme in real time without having to physically change the lighting array. They could then use this new lighting scheme to make corresponding adjustments to the animation in post-production.

COMPUTER POWER

The Framestore modellers who worked on Gravity built 2,200 computer models, including a representation of the International Space Station, from 100 million polygons. The animators moved each model a small amount at a time within a virtual set. Unlike traditional stop-motion animation, where models must be positioned for every frame of the film, the computer can reduce the artists’ workload by filling in most of the movement. The animator still makes all the creative decisions: the direction of the movement, the force, the position the model should end up in and how fast it will move. The computer takes out some of the laborious, meticulous work by filling in the frames between those points.
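
That ‘filling in’ is interpolation between the animator’s key poses. The sketch below shows the idea in Python with a hypothetical helper, inbetween, and simple linear interpolation; real animation packages use splines and easing curves to shape the motion.

```python
def inbetween(keyframes, frame):
    """Linearly interpolate a value for `frame` from animator-set keys.

    keyframes: sorted list of (frame_number, value) pairs.
    """
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    raise ValueError("frame outside keyframe range")

# The animator decides the start and end positions; the computer fills
# in the 23 frames between them.
keys = [(0, 0.0), (24, 10.0)]        # e.g. x-position over one second at 24 fps
positions = [inbetween(keys, f) for f in range(25)]
print(positions[12])                 # 5.0 -- halfway through the move
```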

Today, computer simulation can model almost exactly how each muscle or strand of hair moves and deforms in the real world, or how the millions of particles that make up water or fire flow and interact. By building and adapting the software that carries out these simulations, and by developing tools to translate the resulting data into computer-generated images, VFX companies can create multiple layers of additional detail that can be added on to their basic animated models. To create fire, for example, the artists define the 3D volume of the flame and the number of particles that comprise it. They can then manipulate parameters such as heat, combustion, convection, drag and gravity. A simulation run then models how the particles move frame-by-frame – their velocity, direction and density – so the artists can see the overall motion of the fluid represented by a mesh across the particles.
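
As a rough illustration of that frame-by-frame particle update – a simplified sketch, not any studio’s actual solver – each particle below carries a velocity and a temperature, and the artist-facing controls such as gravity, drag and buoyancy appear as tunable arguments:

```python
from dataclasses import dataclass

@dataclass
class Particle:
    pos: list      # [x, y, z]
    vel: list
    temp: float    # drives buoyancy for fire and smoke

def step(particles, dt, gravity=-9.8, drag=0.4, buoyancy=0.05):
    for p in particles:
        # Hot particles rise; gravity pulls down; drag resists motion
        accel_y = gravity + buoyancy * p.temp
        p.vel[1] += accel_y * dt
        p.vel = [v * (1.0 - drag * dt) for v in p.vel]
        p.pos = [x + v * dt for x, v in zip(p.pos, p.vel)]
        p.temp *= 0.98  # particles cool as the flame convects away

# One simulated frame at 24 fps
flame = [Particle([0.0, 0.0, 0.0], [0.0, 1.0, 0.0], 300.0)]
step(flame, 1 / 24)
# Zero-gravity variant, as described later in this article:
# step(flame, 1 / 24, gravity=0.0, drag=0.0)
```

Dialling gravity and drag to zero, as in the final comment, is exactly the kind of change the Gravity artists made when simulating fire in space, discussed below.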

Surfaces can be particularly difficult to get right because of the change in density of particles. Water may go from liquid to foam to air, but it usually has a relatively consistent surface, whereas a plume of smoke has different densities along its edges.

The animation process called previsualization (previs), shown in the top two frames here, is a way that filmmakers plan scenes as a step beyond illustrated storyboards. It is unusual for an entire film to be ‘prevised’, but with Gravity an animated version of the movie was created before shooting, containing everything but the actors © Framestore

A common technique used by many fluid simulation programs is fluid-implicit particle (FLIP) solving. This is a subset of computational fluid dynamics (a way of calculating the movement of fluids using numerical algorithms) that has been used to model things like biological cells and atmospheric pollution. It combines the modelling of individual particles, which works for waves and other fine close-up effects, with the modelling of the entire fluid as a volume within a grid, which is better for large expanses.
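
A heavily simplified one-dimensional sketch of that hybrid structure is given below – illustrative only, with hypothetical names; real FLIP solvers work in 3D and add a pressure projection to keep the fluid incompressible. Particle velocities are averaged onto a grid, forces are applied on the grid, and the change in grid velocity is transferred back to the particles; transferring the change rather than the value is what distinguishes FLIP from purely particle or purely grid methods.

```python
import numpy as np

def flip_step(pos, vel, dt, ncells=16, length=1.0, gravity=-9.8):
    dx = length / ncells
    cell = np.clip((pos / dx).astype(int), 0, ncells - 1)

    # Particle -> grid: average particle velocities into each cell
    grid_v = np.zeros(ncells)
    counts = np.zeros(ncells)
    np.add.at(grid_v, cell, vel)
    np.add.at(counts, cell, 1.0)
    occupied = counts > 0
    grid_v[occupied] /= counts[occupied]

    # Grid update: apply forces (a full solver adds a pressure solve here)
    new_grid_v = grid_v + gravity * dt

    # Grid -> particle: FLIP transfers the *change*, preserving detail
    vel = vel + (new_grid_v - grid_v)[cell]
    pos = np.clip(pos + vel * dt, 0.0, length)
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, 1000)   # 1,000 particles in a unit column
vel = np.zeros(1000)
for _ in range(24):                 # one second at 24 fps
    pos, vel = flip_step(pos, vel, 1 / 24)
```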

Once VFX artists are happy with the movement, they apply tools to give a realistic appearance in terms of colour, shading and texture. This turns the data about each particle into an instruction for the computer to render it in a certain way, based on information about the fluid’s real-world characteristics, such as intensity or reflectivity.

This process, called ‘look development’, can be particularly challenging when creating physically rare or impossible effects. For example, when the Framestore team wanted to produce fire in zero gravity, it found very few reference points to draw from. The artists were limited to watching a YouTube video of a match being lit in space and then working with special effects artists to shoot footage of flames hitting a metallic ceiling.

When it came to creating the simulation, the graphic artists turned off the parameters for buoyancy, gravity and drag, but this resulted in a sequence that had very low detail and was almost dull. The team then had to alter and enhance the simulation to find a balance between realism and what would look exciting to an audience.

The ability to produce some of the most impressive shots of this kind – integrating simulation software with design tools to create effects that are both physically impossible and visually believable – has enhanced the reputation of British firms in this business. Framestore, for example, produced a 60-foot never-ending wave for The Voyage of the Dawn Treader, while Double Negative created a wall of flame for Harry Potter and the Half-Blood Prince.

DESTROYING ASSETS

VFX artists don’t just use simulations to create assets, they can also destroy them. The challenge for firms is to select and develop the right tools to do this. On Gravity, for example, the Framestore team tested several solutions before it found an approach that was fast and flexible enough to deform and then shatter its extremely detailed model of the International Space Station. The artists finally settled on a rigid body dynamics (RBD) system using open-source software. This turned the model into a group of rigid, interconnected components, held together by a single mesh so that it would appear to bend and deform before it broke apart.

Instead of employing this RBD-based technique, MPC has taken another approach to making its creations explode or shatter onscreen. The company has developed a new tool named Kali, after the Hindu goddess of destruction. This program is based on the finite element method (FEM), a technique used throughout engineering to analyse the structures of objects and buildings and calculate how they deform under different forces.

The principle of FEM is to divide an object into many smaller shapes that can be separately modelled in order to approximate how the whole structure behaves. In order to apply this idea in the most efficient way, Kali creates a secondary representation of the CGI object from tetrahedral shapes and simulates how that would explode by modelling the constituent pieces – rather than breaking up the original model into lots of new sections.

This enables animators to make changes to the original model without having to redo the entire destruction simulation. Instead, Kali alters the tetrahedral version of the object to match any design changes in the original model and changes the destruction simulation accordingly. The software then applies the simulation instructions to the original model and the object appears to blow apart on the screen – see Kali.
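
The key trick – simulating a coarse tetrahedral stand-in and letting the detailed render mesh ride along with it – can be sketched with barycentric coordinates. The function names below are hypothetical, not MPC’s Kali code: each render vertex keeps fixed weights within its containing tetrahedron, so when the simulation moves the tetrahedron’s corners, the vertex follows.

```python
import numpy as np

def barycentric(p, tet):
    """Barycentric coordinates of point p in tetrahedron tet (4x3 array)."""
    T = np.column_stack([tet[i] - tet[3] for i in range(3)])
    w = np.linalg.solve(T, p - tet[3])
    return np.append(w, 1.0 - w.sum())

def deform(p, rest_tet, sim_tet):
    """Carry a render-mesh point along with its simulated tetrahedron."""
    w = barycentric(p, rest_tet)   # weights computed once, at binding time
    return w @ sim_tet             # same weights, moved tetrahedron corners

rest = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
moved = rest + [0.0, 0.0, 2.0]     # the simulation translated this tetrahedron
print(deform(np.array([0.25, 0.25, 0.25]), rest, moved))
# -> [0.25 0.25 2.25]: the detailed vertex follows the coarse simulation
```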

VFX artists can add detail to simulations without having to redesign the rendered model. This has proved useful when blowing apart large and complex models that need to go through multiple design changes. In the 2011 film Sucker Punch, MPC used Kali to destroy a Japanese-style pagoda with approximately 10.7 million external and 11 million internal faces, with each pillar of the building further divided into up to one million tetrahedra. MPC later used Kali for the explosive spacecraft collision in Prometheus, Ridley Scott’s Alien ‘prequel’.

DATA PROCESSING

The quality of CGI has improved dramatically over the past decade, in no small part thanks to the creativity of artists and the skills of software developers. Arguably, the biggest challenge for VFX teams has been developing creative engineering solutions that enable them to handle the sheer volume of work they are now expected to process. VFX firms need to know how complex the required modelling and rendering processes will be so that they can produce the most realistic effects within the time and computing power available.

Some sequences can take days to simulate and can use gigabytes of data storage space. FLIP-solvers have become popular partly because they require the computer to evaluate simulations fewer times per frame than purely particle-based techniques. VFX producers also need to manage storing data with enough elements to accurately describe the simulation without taking up too much disk space or processing time.

Most simulations have to be solved frame by frame because the behaviour of a particle or other object in one frame depends on its previous actions and those of the others around it. In an attempt to reduce this data processing load, VFX firms are turning to specialised graphics processing units (GPUs). GPUs can simultaneously solve thousands of relatively simple equations, while central processing units can handle a small number of much more complex problems. As with almost every aspect of VFX work, there has to be a balance between using time to increase the accuracy of the model and giving artists more chances to run simulations to improve the appearance of the final sequence.
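
The contrast can be seen even on a CPU with vectorised code. In the sketch below, one frame of a million-particle update is expressed as two array-wide operations – exactly the kind of identical, independent arithmetic a GPU executes across thousands of cores. NumPy is used for illustration; GPU array libraries such as CuPy accept essentially the same expressions.

```python
import numpy as np

n = 1_000_000
pos = np.zeros(n)
vel = np.random.default_rng(0).uniform(-1, 1, n)

dt, gravity = 1 / 24, -9.8

# One simulation frame: a million particles updated with two simple
# equations, each independent of the others -- ideal work for a GPU.
vel += gravity * dt
pos += vel * dt
```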

As VFX have become more popular with filmmakers, replacing aspects of live photography, special effects and scale models, both the number of sequences that need effects and the amount of work done on each sequence have shot up, from around 600 to 800 edits for a typical film to around 1,200 to 1,500. A film like Gravity is, in effect, almost an entirely animated production – some estimates put the figure at 80% – with the added dimension of recording and integrating live footage of the actors’ faces. Nearly every frame of the film is computer-generated to some extent.

To cope with this rapid increase, VFX firms have developed and refined their software to streamline the workflow, to manage data transfer between their different packages more efficiently and to further optimise their use of servers, which today can number thousands.

Other innovations allow teams of artists to work on different aspects of an asset at the same time, often in different locations around the world, rather than designing and animating each element in sequence. MPC, for example, has developed an asset management system that lets various departments work on in-progress assets simultaneously in a parallel workflow. Animators can start creating character movements once the initial rigs and geometry have been made available, and the resulting geometry caches can be forwarded to the lighting department for block lighting while the animators continue to finalise their animations.

KALI

Fractured tetrahedral meshes, showing how a single mesh can be created with different material properties and specific interior structure. In this case, an unbreakable rabbit is embedded in a breakable sphere © MPC

Kali is a piece of software that simulates how objects break apart in order to create animation sequences of destruction in films. It uses the finite element method (FEM), a numerical technique that can model and visualise structures. Engineers use FEM to explore what happens at the boundary of a shape when that shape is deformed by specified forces. With its origins in aerospace and civil engineering, FEM is now employed throughout engineering to analyse how objects and buildings respond to stress.

In the context of visual effects, FEM is generally used to model elastic and plastic deformation in solid objects on the screen. The physics of elastic deformation considers a piece of matter to be a continuous block of material that can deform at any and every one of the infinite points within it.

The laws that govern how this deformation occurs are based on a set of differential equations that can be difficult or impossible to solve exactly. FEM creates an approximation of a continuous solid by breaking it up into a finite number of simple connected pieces, each of which represents a volume of material. These simple pieces are chosen to be easy to work with mathematically.

In VFX, the elements are usually tetrahedra, the simplest linear representation of a solid. The software uses FEM to calculate stress and strain across the different elements, determining how much deformation there should be. The software can also use this stress and strain information to calculate where the material is likely to fracture.
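
As a worked miniature of that calculation – a single linear tetrahedral element, illustrative rather than Kali’s actual code – the sketch below computes the deformation gradient F of one tetrahedron from its rest and deformed vertex positions, then the Green strain tensor E = (FᵀF - I)/2 that measures how much the element has been stretched. Software of this kind evaluates such quantities per element to decide how much deformation there should be and where fracture is likely.

```python
import numpy as np

def edge_matrix(verts):
    """Columns are the three edge vectors from vertex 0 (verts: 4x3)."""
    return np.column_stack([verts[i] - verts[0] for i in (1, 2, 3)])

rest = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
deformed = rest.copy()
deformed[:, 0] *= 1.1          # stretch the element 10% along x

# Deformation gradient maps rest-state edges to deformed edges
F = edge_matrix(deformed) @ np.linalg.inv(edge_matrix(rest))
E = 0.5 * (F.T @ F - np.eye(3))
print(np.round(E, 3))          # E[0,0] = 0.105: strain along x, zero elsewhere
```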

To create slow motion shots of wood being splintered, burnt and destroyed in the film Sucker Punch, MPC developed its Kali software to handle bendable surfaces. The company’s technical directors then defined precise properties for each material so the surfaces would break in a realistic way: metal would shear and bend, while wood planks would bend, crack and fracture into thousands of splinters © MPC

CHALLENGES REMAIN

VFX’s successes have been hard-earned. It requires a different skillset for directors to choreograph scenes using computer graphics compared to the way they would conventional live action shots. With a film like Gravity, the resulting film is made, in essence, three times: once in pre-production, then during the shoot and, finally, in postproduction. The realism that results from doing it well, though, means that more films are calling upon VFX.

To maintain their position, UK firms must continue to find ways to produce more realistic animations in ever-increasing volumes. The industry still relies heavily on the artistic input of animators, often in painstaking manual fashion. A key challenge will be to further automate this process and improve software interfaces to make them more intuitive and interactive. This will allow artists to produce lifelike images more quickly and rely increasingly on the system to animate their work.

VFX companies also need to continue to push the capabilities of their simulation engines to match the behaviour of real-world systems with ever higher fidelity. They also need to increase simulation speed so that artists are not left waiting for days to see the final result.

Another key area with room for improvement is the modelling of light. Although well developed, this has yet to take full account of what happens when curved surfaces reflect or refract beams of light and the rays concentrate in specific patterns, or when rainbow effects need to be generated.

Perversely, despite the continuing improvements in hardware, software and system architecture, the increasing complexity of simulations means that the time it takes to render animations has actually gone up, rather than down. Any time saved by technological improvements has gone back into increasing the quality of the imagery. More use of GPUs may offer a breakthrough in this area, but some believe artists may have to take details to the level at which the human eye stops noticing improvements before extra computing power makes a real impact. Running the simulations that bring VFX to life can actually take far longer than watching the movie itself!

So the ultimate prize for VFX firms will be when computing resources have been sufficiently developed to enable directors to play back and review their work in real time.

Stephen Harris, senior reporter with The Engineer, wrote this article. He talked to Richard Graham, VFX Project Manager, and Martin Preston, Head of Research and Development, Framestore; Hannes Ricklefs, Head of Software (UK), and Ben Cole, Head of Software (Vancouver), Moving Picture Company; Jeff Clifford, R&D Manager, Double Negative; Nick Manning, Industry and Field Marketing Manager, Autodesk Media & Entertainment; Darren Cosker, lecturer in computer science at University of Bath and Royal Society Industry Fellow
