Creating the Impossible

The 3D space thriller Gravity dominated this year’s Academy Awards, winning in seven categories. Although the film features live-action footage of the actors, computers were used to create just about everything else on the screen apart from the actors’ faces. Framestore, the company behind those Oscar-winning effects, is one of several in the UK that lead the way in providing visual effects for film.

(See video links at foot of page)

Film makers have used special effects (SFX) since the early days of cinema. SFX have provided substitutes for actions and events that are impossible, or too expensive, to stage. Over the years, the technology of SFX has moved from building, and destroying, scale models to computer-generated visual effects (VFX) like those that contributed to the success of Gravity in this year’s Oscars.

The film industry in the UK has a long history of producing memorable special and visual effects, from Hammer horror films and Stanley Kubrick’s 2001: A Space Odyssey onwards.

THE ROLE OF VFX

Filmmakers want ever-greater detail and realism from the computer-generated imagery (CGI) used to create VFX, and they are making greater use of it because it offers greater flexibility in telling a story. CGI is imagery created wholly within a computer, while VFX are visual effects rendered by a computer onto real or computer-generated images. CGI can enhance sets, tailor costumes, create special effects and conjure the illusion of casts of extras.

LIGHTING GRAVITY

VFX artists use a method known as ray tracing to simulate the path of photons around a scene and to ‘illuminate’ their animations. Ray tracing generates an image by tracing the path of light through pixels in an image plane and simulates what happens when the light interacts with virtual objects.

Ray tracing can generate a high degree of visual realism but requires significant computation time. This makes it best suited to applications where the image can be rendered – translated from mathematical approximations of 3D objects into a finalised visual representation – slowly, ahead of time. The technique can simulate a wide variety of optical effects, such as reflection and refraction, scattering, and chromatic aberration.
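As a rough illustration of the principle, the sketch below traces one ray per pixel from a virtual camera, tests it against a single sphere and shades the hit point by how directly its surface faces a light source. The scene, values and text-based ‘image plane’ are illustrative assumptions, not any studio’s renderer.

```python
import math

WIDTH, HEIGHT = 60, 30           # twice as many columns as rows roughly
                                 # compensates for tall terminal characters
CENTRE = (0.0, 0.0, 3.0)         # one virtual object: a sphere
RADIUS = 1.0
LIGHT = (-0.577, 0.577, -0.577)  # unit vector pointing towards the light

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def hit_sphere(origin, d):
    """Distance along the ray to the sphere surface, or None on a miss."""
    oc = tuple(origin[i] - CENTRE[i] for i in range(3))
    a, b = dot(d, d), 2.0 * dot(oc, d)
    c = dot(oc, oc) - RADIUS * RADIUS
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

RAMP = " .:-=+*#%@"              # characters from dark to bright
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # One ray per pixel, fired through the image plane at z = 1.
        x = (i / WIDTH) * 2 - 1
        y = 1 - (j / HEIGHT) * 2
        t = hit_sphere((0.0, 0.0, 0.0), (x, y, 1.0))
        if t is None:
            row += " "           # ray escapes: background
        else:
            # Shade by how directly the surface normal faces the light.
            p = tuple(t * v for v in (x, y, 1.0))
            n = tuple((p[k] - CENTRE[k]) / RADIUS for k in range(3))
            row += RAMP[int(max(0.0, dot(n, LIGHT)) * (len(RAMP) - 1))]
    print(row)
```

Even this toy version hints at why the technique is costly: every pixel demands its own intersection and shading calculations, and a production scene multiplies that by millions of polygons and many bounces per ray.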

COMPUTER POWER

The Framestore modellers who worked on Gravity built 2,200 computer models, including a representation of the International Space Station, from 100 million polygons. The animator moved each model a small amount at a time within a virtual set. Unlike traditional stop-motion animation, where models must be positioned for every frame of the film, the computer can reduce the artists’ workload by filling in most of the movement. The animator is still making all the creative decisions. They have to decide on the direction of the movement, the force, the position they want the model to end up in and how fast it will move. The computer takes out some of the laborious, meticulous work by filling in the frames between those points.
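A minimal sketch of that in-betweening step is shown below, assuming simple linear interpolation in one dimension (production tools interpolate full 3D transforms along adjustable easing curves): the animator supplies key poses as (frame, value) pairs and the computer fills in every frame between them.

```python
def inbetween(keyframes, frame):
    """Linearly interpolate a value at `frame` from (frame, value) keys."""
    keys = sorted(keyframes)
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)   # how far between the two keys
            return v0 + t * (v1 - v0)
    raise ValueError("frame outside keyed range")

# The animator decides the end points; the computer fills the frames.
keys = [(0, 0.0), (24, 10.0), (48, 4.0)]   # e.g. (frame, x-position)
for frame in range(0, 49, 8):
    print(frame, round(inbetween(keys, frame), 2))
```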

Today, computer simulation can model almost exactly how each muscle or strand of hair moves and deforms in the real world, or how the millions of particles that make up water or fire flow and interact. By building and adapting the software that carries out these simulations, and by developing tools to translate the resulting data into computer-generated images, VFX companies can create multiple layers of additional detail that can be added on to their basic animated models. To create fire, for example, the artists define the 3D volume of the flame and the number of particles that comprise it. They can then manipulate parameters such as heat, combustion, convection, drag and gravity. A simulation run then models how the particles move frame-by-frame – their velocity, direction and density – so the artists can see the overall motion of the fluid represented by a mesh across the particles.
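The sketch below is a toy version of such a frame-by-frame particle update: each particle carries a velocity and a heat value, and parameters for buoyancy (standing in for convection), drag and gravity shape the overall motion. All names and values are illustrative assumptions rather than any studio’s solver.

```python
import random

GRAVITY = -9.8       # pulls particles down
BUOYANCY = 14.0      # hot gas rises: upward force scaled by heat
DRAG = 0.8           # fraction of velocity kept each frame
COOLING = 0.97       # heat lost per frame as combustion fades
DT = 1.0 / 24.0      # one film frame at 24 fps

def spawn():
    return {"y": 0.0,
            "vy": random.uniform(0.5, 1.5),
            "heat": random.uniform(0.8, 1.0)}

particles = [spawn() for _ in range(5)]

for frame in range(24):                      # one second of animation
    for p in particles:
        force = GRAVITY + BUOYANCY * p["heat"]   # net vertical force
        p["vy"] = DRAG * p["vy"] + force * DT    # damp, then accelerate
        p["y"] += p["vy"] * DT                   # move the particle
        p["heat"] *= COOLING                     # the flame cools over time
    print(frame, [round(p["y"], 2) for p in particles])
```

Hot particles rise while cooled ones begin to fall back, which is the basic behaviour the much richer production parameters are there to control.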

Surfaces can be particularly difficult to get right because of the change in density of particles. Water may go from liquid to foam to air, but it usually has a relatively consistent surface, whereas a plume of smoke has different densities along its edges.

The animation process called previsualization (previs), shown in the top two frames here, is a way that filmmakers plan scenes as a step beyond illustrated storyboards. It is unusual for an entire film to be ‘prevised’, but with Gravity an animated version of the movie was created before shooting, containing everything but the actors © Framestore

A common technique used by many fluid simulation programs is fluid-implicit particle (FLIP) solving. This is a subset of computational fluid dynamics (a way of calculating the movement of fluids using numerical algorithms) that has been used to model things like biological cells and atmospheric pollution. FLIP combines the modelling of individual particles, which works for waves and other fine close-up effects, with the modelling of the entire fluid as a volume within a grid, which is better suited to large expanses.
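A heavily simplified one-dimensional sketch of the FLIP idea follows: particle velocities are averaged onto a grid, forces are applied on the grid, and each particle then picks up the grid’s change in velocity rather than its absolute value, which is what preserves fine per-particle detail. A real solver would add a pressure projection to keep the fluid incompressible; this toy, built entirely on assumed names and values, omits it.

```python
CELLS, SIZE = 8, 1.0             # 8 grid cells, each 1 unit long
GRAVITY, DT = -9.8, 1.0 / 24.0

particles = [{"x": 0.5 + 0.9 * i, "v": 0.0} for i in range(8)]

def cell(p):
    """Index of the grid cell containing this particle."""
    return min(CELLS - 1, max(0, int(p["x"] / SIZE)))

for frame in range(3):
    # 1. Particle-to-grid: average particle velocities into each cell.
    vel, count = [0.0] * CELLS, [0] * CELLS
    for p in particles:
        vel[cell(p)] += p["v"]
        count[cell(p)] += 1
    grid = [vel[i] / count[i] if count[i] else 0.0 for i in range(CELLS)]

    # 2. Grid update: apply forces (just gravity here, no pressure solve).
    new_grid = [v + GRAVITY * DT for v in grid]

    # 3. Grid-to-particle (FLIP): add the grid's *change* in velocity.
    for p in particles:
        c = cell(p)
        p["v"] += new_grid[c] - grid[c]
        p["x"] += p["v"] * DT
    print(frame, [round(p["v"], 2) for p in particles])
```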

Once VFX artists are happy with the movement, they apply tools to give a realistic appearance in terms of colour, shading and texture. This turns the data about each particle into an instruction for the computer to render it in a certain way, based on information about the fluid’s real-world characteristics, such as intensity or reflectivity.
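As a small, hypothetical illustration of that step, the function below maps per-particle simulation data to render attributes, turning heat into an RGB colour and density into opacity. The specific mapping is an assumption for illustration only, not a production shader.

```python
def shade_particle(heat, density):
    """Map simulation data to (r, g, b, opacity) for the renderer."""
    r = min(1.0, 0.5 + heat)           # hotter particles glow redder...
    g = min(1.0, heat * 0.6)           # ...with some yellow at the core
    b = heat * 0.1
    opacity = min(1.0, density * 0.8)  # denser regions block more light
    return (round(r, 2), round(g, 2), round(b, 2), round(opacity, 2))

print(shade_particle(heat=0.9, density=0.5))   # bright, semi-transparent
print(shade_particle(heat=0.2, density=1.0))   # dim, dense smoke
```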

This process, called ‘look development’, can be particularly challenging when creating physically rare or impossible effects. For example, when the Framestore team wanted to produce fire in zero gravity, it found very few reference points to draw from. The artists were limited to watching a YouTube video of a match being lit in space and then working with special effects artists to shoot footage of flames hitting a metallic ceiling.

When it came to creating the simulation, the graphic artists turned off the parameters for buoyancy, gravity and drag, but this resulted in a sequence that had very low detail and was almost dull. The team then had to alter and enhance the simulation to find a balance between realism and what would look exciting to an audience.
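In terms of the earlier toy fire sketch, that first pass amounts to zeroing the relevant parameters, along the lines of the hypothetical fragment below; the dull result is what the team then had to re-tune by eye.

```python
# Hypothetical zero-gravity settings for the earlier toy fire update:
GRAVITY = 0.0    # nothing pulls the flame down in orbit
BUOYANCY = 0.0   # without gravity there is no convection: hot gas
                 # no longer rises, so the flame loses its familiar shape
DRAG = 1.0       # velocities are left undamped
```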

The ability to produce some of the most impressive shots of this kind – integrating simulation software with design tools to create effects that are both physically impossible and visually believable – has enhanced the reputation of British firms in this business. Framestore, for example, produced a 60-foot, never-ending wave for one such production.

DESTROYING ASSETS

VFX artists don’t just use simulations to create assets, they can also destroy them. The challenge for firms is to select and develop the right tools to do this. On Gravity, for example, the Framestore team tested several solutions before it found an approach that was fast and flexible enough to deform and then shatter its extremely detailed model of the International Space Station. The artists finally settled on a rigid body dynamics (RBD) system using open-source software. This turned the model into a group of rigid, interconnected components, held together by a single mesh so that it would appear to bend and deform before it broke apart.
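The sketch below illustrates the general rigid-body idea in one dimension: the model becomes rigid pieces joined by spring-like links, so it first bends as the links strain and then shatters as overstrained links snap. The pieces, link stiffness and break threshold are illustrative assumptions, not Framestore’s setup.

```python
STIFFNESS, BREAK_STRAIN, DT = 40.0, 0.5, 1.0 / 24.0

pieces = [{"x": float(i), "v": 0.0} for i in range(4)]   # 4 rigid chunks
links = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0)]          # (a, b, rest length)

pieces[3]["v"] = 6.0   # an impact imparts velocity to the end piece

for frame in range(12):
    # Springs pull linked pieces back towards their rest separation,
    # so the chain visibly bends and stretches before it fails.
    for a, b, rest in links:
        stretch = (pieces[b]["x"] - pieces[a]["x"]) - rest
        f = STIFFNESS * stretch
        pieces[a]["v"] += f * DT
        pieces[b]["v"] -= f * DT
    # Links whose strain exceeds the threshold snap: the model shatters.
    links = [(a, b, rest) for a, b, rest in links
             if abs((pieces[b]["x"] - pieces[a]["x"]) - rest) / rest < BREAK_STRAIN]
    for p in pieces:
        p["x"] += p["v"] * DT
    print(frame, "links:", len(links), [round(p["x"], 2) for p in pieces])
```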

Instead of employing this RBD-based technique, the company MPC has taken another approach to making its creations explode or shatter onscreen. It has developed a new tool named Kali, after the Hindu goddess of destruction. The program is based on the finite element method (FEM), a technique used throughout engineering to analyse the structures of objects and buildings and calculate how they deform under different forces.

The principle of FEM is to divide an object into many smaller shapes that can be separately modelled in order to approximate how the whole structure behaves. In order to apply this idea in the most efficient way, Kali creates a secondary representation of the CGI object from tetrahedral shapes and simulates how that would explode by modelling the constituent pieces – rather than breaking up the original model into lots of new sections.

This enables animators to make changes to the original model without having to redo the entire destruction simulation. Instead, Kali alters the tetrahedral version of the object to match any design changes in the original model and changes the destruction simulation accordingly. The software then applies the simulation instructions to the original model and the object appears to blow apart on the screen (see the KALI box below).

DATA PROCESSING

The quality of CGI has improved dramatically over the past decade, in no small part thanks to the creativity of artists and the skills of software developers. Arguably, the biggest challenge for VFX teams has been developing creative engineering solutions that enable them to handle the sheer volume of work they are now expected to process. VFX firms need to know how complex the required modelling and rendering processes will be so that they can produce the most realistic effects within the time and computing power available.

Some sequences can take days to simulate and can use gigabytes of data storage space. FLIP solvers have become popular partly because they require the computer to evaluate simulations fewer times per frame than purely particle-based techniques. VFX producers also need to store data with enough elements to describe the simulation accurately without taking up too much disk space or processing time.

Most simulations have to be solved frame by frame because the behaviour of a particle or other object in one frame depends on its previous actions and those of the others around it. In an attempt to reduce this data processing load, VFX firms are turning to specialised graphics processing units (GPUs). GPUs can simultaneously solve thousands of relatively simple equations, while central processing units can handle a small number of much more complex problems. As with almost every aspect of VFX work, there has to be a balance between using time to increase the accuracy of the model and giving artists more chances to run simulations to improve the appearance of the final sequence.
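The toy update below shows why this workload maps well to GPUs, using NumPy’s vectorised arithmetic as a stand-in for the data-parallel step a GPU would perform: frames must still run in order, but within each frame the same simple equation is applied to every particle independently. The particle count and the free-fall update are illustrative assumptions.

```python
import numpy as np

N, DT = 100_000, 1.0 / 24.0
GRAVITY = np.array([0.0, 0.0, -9.8])

pos = np.random.rand(N, 3)          # 100,000 particles at once
vel = np.zeros((N, 3))

for frame in range(24):             # frames are inherently sequential...
    vel += GRAVITY * DT             # ...but within a frame, one simple
    pos += vel * DT                 # equation is applied to all N
                                    # particles simultaneously
print(pos.mean(axis=0))             # average position after one second
```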

As VFX have become more popular with filmmakers, replacing aspects of live photography, special effects and scale models, both the number of sequences that need effects and the amount of work done on each sequence have shot up, from around 600 to 800 edits for a typical film to around 1,200 to 1,500.

KALI

Fractured tetrahedral meshes, showing how a single mesh can be created with different material properties and specific interior structure. In this case, an unbreakable rabbit is embedded in a breakable sphere © MPC

Kali is a piece of software that simulates how objects break apart in order to create animation sequences of destruction in films. It uses the finite element method (FEM), a numerical technique that can model and visualise structures. Engineers use FEM to explore what happens at the boundary of a shape when that shape is deformed by specified forces. With its origins in aerospace and civil engineering, FEM is now employed throughout engineering to analyse how objects and buildings respond to stress.

In the context of visual effects, FEM is generally used to model elastic and plastic deformation in solid objects on the screen. The physics of elastic deformation considers a piece of matter to be a continuous block of material that can deform at any and every one of the infinite points within it.

The laws that govern how this deformation occurs are based on a set of differential equations that can be difficult or impossible to solve exactly. FEM creates an approximation of a continuous solid by breaking it up into a finite number of simple connected pieces, each of which represents a volume of material. These simple pieces are chosen to be easy to work with mathematically.

In VFX, the elements are usually tetrahedra, the simplest linear representation of a solid. The software uses FEM to calculate stress and strain across the different elements, determining how much deformation there should be. The software can also use this stress and strain information to calculate where the material is likely to fracture.
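The sketch below works through that calculation for a single tetrahedral element, assuming small-strain linear elasticity: compare rest and deformed vertex positions to get the deformation gradient, form the strain, convert it to stress with elastic constants, and flag a fracture when the stress passes a threshold. The material values and fracture criterion are made-up illustrations, not Kali’s.

```python
import numpy as np

MU, LAM = 1.0e6, 1.0e6        # Lamé parameters (material stiffness)
FRACTURE_STRESS = 2.0e5       # assumed strength threshold

rest = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
deformed = rest.copy()
deformed[1, 0] = 1.15         # stretch the element 15% along x

# Edge matrices relative to vertex 0 give the deformation gradient F.
Dm = (rest[1:] - rest[0]).T
Ds = (deformed[1:] - deformed[0]).T
F = Ds @ np.linalg.inv(Dm)

# Small-strain tensor and linear-elastic (Hooke's law) stress.
eps = 0.5 * (F + F.T) - np.eye(3)
sigma = 2 * MU * eps + LAM * np.trace(eps) * np.eye(3)

# The largest principal stress decides whether the element fractures.
max_stress = np.abs(np.linalg.eigvalsh(sigma)).max()
print(f"max principal stress: {max_stress:.3g}",
      "-> fractures" if max_stress > FRACTURE_STRESS else "-> holds")
```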

To create slow motion shots of wood being splintered, burnt and destroyed in the film Sucker Punch, MPC developed the Kali software that could configure bendable surfaces. The company’s technical directors then defined precise properties for each material so the surfaces would break in a very realistic way. Metal would shear and bend, while wood planks would bend, crack and fracture into thousands of splinters © MPC

CHALLENGES REMAIN

VFX’s successes have been hard-earned. Choreographing scenes using computer graphics requires a different skillset from directing conventional live-action shots. With a film like Gravity, the resulting film is made, in essence, three times: once in pre-production, then during the shoot and, finally, in post-production. The realism that results from doing it well, though, means that more films are calling upon VFX.

To maintain their position, UK firms must continue to find ways to produce more realistic animations in ever-increasing volumes. The industry still relies heavily on the artistic input of animators, often in painstaking manual fashion. A key challenge will be to further automate this process and improve software interfaces to make them more intuitive and interactive. This will allow artists to produce lifelike images more quickly and rely increasingly on the system to animate their work.

VFX companies also need to continue to push the capabilities of their simulation engines to match the behaviour of real-world systems more precisely and with higher fidelity. They also need to increase simulation speed so that artists are not left waiting for days to see the final result.

Another key area with room for improvement is the modelling of light. Although well developed, this has yet to take full account of what happens when curved surfaces reflect or refract beams of light and when the rays concentrate in specific patterns or need to generate rainbow effects.

Perversely, despite the continuing improvements in hardware, software and system architecture, the increasing complexity of simulations means that the time it takes to render animations has actually gone up, rather than down. Any time saved by technological improvements has gone back into increasing the quality of the imagery. More use of GPUs may offer a breakthrough in this area, but some believe artists may have to take details to the level at which the human eye stops noticing improvements before extra computing power makes a real impact. Running the simulations that bring VFX to life can actually take far longer than watching the movie itself!

So the ultimate prize for VFX firms will be when computing resources have been sufficiently developed to enable directors to play back and review their work in real time.

Stephen Harris, senior reporter with The Engineer, wrote this article. He talked to Richard Graham, VFX Project Manager, and Martin Preston, Head of Research and Development, Framestore; Hannes Ricklefs, Head of Software (UK), and Ben Cole, Head of Software (Vancouver), Moving Picture Company; Jeff Clifford, R&D Manager, Double Negative; Nick Manning, Industry and Field Marketing Manager, Autodesk Media & Entertainment; Darren Cosker, lecturer in computer science at University of Bath and Royal Society Industry Fellow

View MPC’s use of Kali in making X-Men and behind the scenes work on Prometheus
