Radiosity and Raytracing

Radiosity allows us to create a more realistic atmosphere by modelling light as it would actually be seen. If you look at the picture above, the left image uses direct illumination, or standard lighting. The left picture's lighting is very hard-edged and doesn't look quite realistic, whereas the right picture uses radiosity, which gives the lighting a gradient at its edges. This is more like what happens in real life.
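To make the idea concrete, here is a minimal sketch of the classic radiosity equation, B = E + ρ·ΣF·B, solved by simple iteration. The function name `solve_radiosity` and the two-patch scene with its form factors are my own made-up illustration, not taken from any real renderer.

```python
# Sketch of the radiosity equation B_i = E_i + rho_i * sum_j(F_ij * B_j),
# solved by repeated substitution (Jacobi-style iteration).

def solve_radiosity(emission, reflectance, form_factors, iterations=50):
    """Iteratively compute the radiosity B of each patch."""
    b = list(emission)  # start with emitted light only
    for _ in range(iterations):
        b = [emission[i] + reflectance[i] * sum(form_factors[i][j] * b[j]
                                                for j in range(len(b)))
             for i in range(len(b))]
    return b

# Two facing patches: patch 0 is a light source, patch 1 only reflects.
emission     = [1.0, 0.0]
reflectance  = [0.5, 0.8]
form_factors = [[0.0, 0.2],   # F[i][j]: fraction of light leaving i that reaches j
                [0.2, 0.0]]

radiosity = solve_radiosity(emission, reflectance, form_factors)
```

After a few iterations patch 1 has picked up indirect light from patch 0, and patch 0 has gained a little bounced light back: exactly the soft, gradual lighting that direct illumination alone cannot produce.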

Photometric units are units for measuring light intensity; examples include lumens and candelas.
A lumen is a unit of luminous flux, which is the measure of the perceived power of light. The candela is the base unit of luminous intensity.
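The two units are related through solid angle: luminous flux (lumens) is luminous intensity (candelas) multiplied by the solid angle the light is emitted into. Here is a small sketch of that relationship; the function name `flux_lumens` is my own example, not a standard library call.

```python
import math

# Luminous flux = luminous intensity x solid angle.
# For a cone of half-angle theta, the solid angle is 2*pi*(1 - cos(theta)) sr.

def flux_lumens(intensity_cd, cone_half_angle_deg):
    """Flux in lumens emitted by a source of given intensity into a cone."""
    theta = math.radians(cone_half_angle_deg)
    solid_angle_sr = 2 * math.pi * (1 - math.cos(theta))  # steradians
    return intensity_cd * solid_angle_sr

# A 1 cd source radiating in every direction (half-angle 180 degrees)
# covers the full sphere of 4*pi steradians:
print(round(flux_lumens(1.0, 180), 2))  # -> 12.57
```

So a uniform 1-candela source emits about 12.57 lumens in total, which is why the candela is the "intensity" unit and the lumen the "total output" unit.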

Local illumination algorithms describe how individual surfaces reflect or transmit light. These algorithms can predict the intensity, colour and distribution of the light leaving the surface it was shone upon. They are called shaders in 3D Studio Max.

Global illumination algorithms take into account the ways in which light is transferred between surfaces in the model. In 3ds Max there are two global illumination algorithms: ray-tracing and radiosity.

A shader is an implementation of a local illumination algorithm, and an algorithm here is essentially a mathematical function.
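The simplest local illumination algorithm is Lambertian diffuse shading, where brightness depends only on the angle between the surface normal and the light direction. This sketch (my own illustration; `lambert` is not a real 3ds Max function) shows why it is "local": only this surface and this light are involved, never the other surfaces in the scene.

```python
import math

# Lambertian diffuse reflection: intensity = light * max(0, N . L)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def lambert(normal, to_light, light_intensity):
    """Light reflected by a surface with the given normal, lit from to_light."""
    n = normalize(normal)
    l = normalize(to_light)
    return light_intensity * max(0.0, dot(n, l))

# Light shining straight down onto a flat, upward-facing surface:
print(lambert((0, 1, 0), (0, 1, 0), 1.0))  # -> 1.0 (full intensity)
```

Tilt the surface away from the light and the returned intensity falls off smoothly to zero, which is the gradient you see on curved shaded objects.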


Rendering Hardware

What is an API? – API stands for Application Programming Interface. An API is the set of functions a piece of software exposes so that other programs can use it. When you click to create an object within a program, the program calls an API behind the scenes to create the object. The API is there to help behind the scenes and it is something we take for granted.

What is a GPU? – GPU stands for Graphics Processing Unit. GPUs are graphics rendering devices for personal computers, games consoles and so on. Modern GPUs are very good at what they do and come with different capabilities and prices. I think one of the leading graphics card manufacturers is nVidia, and their flagship graphics card is the nVidia GeForce 6800, made for gaming.

What is a shader? – A shader is a set of instructions within the software that performs part of the graphics rendering. A shader can be seen as part of the renderer.
Vertex shaders affect vertex properties like position, colour and texture coordinates.
Geometry shaders can add and remove vertices from a mesh.
Pixel shaders calculate the colour value of individual pixels when the polygons produced by the vertex & geometry shaders are rasterized.
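To show how the stages divide the work, here is a minimal software sketch in Python (my own illustration, not a real GPU API): the vertex shader runs once per vertex to transform geometry, and the pixel shader runs once per pixel to compute a colour.

```python
# Toy versions of two shader stages, modelled as plain functions.

def vertex_shader(position, offset):
    """Per-vertex: transform the vertex position (here, a simple translate)."""
    return tuple(p + o for p, o in zip(position, offset))

def pixel_shader(base_colour, brightness):
    """Per-pixel: compute the final colour value for one pixel."""
    return tuple(min(255, int(c * brightness)) for c in base_colour)

# Move a triangle, then shade one of the pixels inside it:
triangle = [(0, 0), (1, 0), (0, 1)]
moved = [vertex_shader(v, (2, 3)) for v in triangle]
colour = pixel_shader((200, 100, 50), 0.5)

print(moved)   # -> [(2, 3), (3, 3), (2, 4)]
print(colour)  # -> (100, 50, 25)
```

On real hardware these functions would be written in a shading language such as HLSL or GLSL and run in parallel across thousands of vertices and pixels.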

3D Development Software

3D Studio Max – 3D Studio Max is a 3D graphics application developed by Discreet, also known as Autodesk Media and Entertainment. It has strong modeling capabilities and a flexible plugin architecture. The program is mostly used by video game developers, TV commercial studios and architectural visualisation studios, and it is one of the most widely used 3D programs that you can buy off the shelf.


Cinema 4D – This is a commercial, cross-platform, high-end 3D graphics application produced by a company called MAXON Computer. It is capable of procedural and polygonal/subdivision modeling, animating, lighting, texturing and rendering. Cinema 4D is noted for being very easy to use and artist-friendly among high-end 3D applications, and is also known for having a very flexible interface. Cinema 4D was used in ‘Open Season’, ‘Monster House’ and ‘The Polar Express’.


AfterBurn – AfterBurn is a volumetric effects plugin for 3D Studio Max. 3ds Max doesn’t come with any volumetric effects apart from volumetric light, and AfterBurn is one of very few plugins that can create and render true volumetrics within it. AfterBurn can be used to create billowing smoke, explosions, thick or thin clouds, dust and flowing water. It has been used in many feature films, such as Armageddon, Flight of the Phoenix, K-19: The Widowmaker, Paycheck and Scooby-Doo 2.


Maya – Maya is a high-end 3D computer graphics and 3D modeling software package. It was developed by Alias Systems Corporation but is now owned by Autodesk. Maya is mostly used in the film and TV industry, as well as for computer and video games. The software is released in two versions, Maya Complete and Maya Unlimited. Maya Unlimited comes with tools that are not in the Maya Complete package:

  • Maya Classic Cloth – Cloth simulation to automatically simulate clothing and fabrics moving realistically over an animated character.
  • Maya Hair – A simulator for realistic-looking human hair implemented using curves and PaintEffects.
  • Maya Fur – Animal fur simulation similar to Maya Hair. It can be used to simulate other fur-like objects, such as grass.
  • Maya Live – A set of motion tracking tools for CG matching to clean plate footage.
  • Maya Fluid – A realistic fluid simulator, this is for smoke, fire and clouds etc.


Unreal Ed

CG Animation Timeline

1960 – LISP (List Processing Language) Programming Language was developed by John McCarthy for artificial intelligence applications.


1962 – The Sketchpad System was developed by Ivan Sutherland for Interactive Computer Graphics.


1964 – Boeing in Seattle created a 3D Animation of an aircraft carrier landing. Plotted drawings were done by William Fetter and W. Bernhart.

1966 – Hummingbird was created by Charles Csuri. This was the first example of computer generated animation.


1971 – Robert Abel & Associates opens.
Animated faces were created by Fred I. Parke at the University of Utah.

1972 – Ed Catmull developed an animation scripting language and created an animation of a smooth shaded hand. This was at the University of Utah.
Also at the University of Utah, Fred Parke created the first computer generated facial animation.
MAGI animated computer-rendered polygonal objects.
MAGI animated computer-rendered polygonal objects.


1974 – The National Research Council of Canada released ‘Hunger/La Faim’. This was a short film that was directed by Peter Foldes and made use of interactive keyframing techniques.


1979 – Computer group opens at George Lucas’ Industrial Light and Magic.


1981 – A few companies opened, these would focus on 3D Animation.
Wavefront, Santa Barbara.
Digital Productions, Los Angeles.
R. Greenburg Associates (RGA), New York.
Polygon Pictures, Tokyo.

1982 – ‘Tron’ was the first movie with over 20 minutes of computer animation in it. The movie was rubbish, but it helped contribute to the acceptance of CG animation in Hollywood.
In Star Trek II: The Wrath of Khan, the Genesis Effect sequence was the first fully computer-animated visual effects shot.


1983 – Bill Reeves at Lucasfilm published techniques for modelling particle systems.
Silicon Graphics Inc. opens. Also! Alias Research and Omnibus Computer Graphics opened in Toronto.

1984 – ‘Still Life Etude’ was an early simulation of light, fog, rain and skies. This was created at Hiroshima University.
Porter and Duff at Lucasfilm published a paper on digital compositing using an alpha channel. This paved the way for effectively combining live action and CG Imagery.
Also! Pixar opened.
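The core of the Porter and Duff idea can be sketched in a few lines. This is my own simplified illustration of alpha blending (the "over" operation) using straight, non-premultiplied alpha; the function name `over` is made up for the example.

```python
# Alpha compositing: result = alpha * foreground + (1 - alpha) * background,
# applied per colour channel. alpha is the foreground's coverage, 0..1.

def over(fg, bg, alpha):
    """Composite a foreground colour over a background with coverage alpha."""
    return tuple(alpha * f + (1 - alpha) * b for f, b in zip(fg, bg))

# A half-transparent red CG element over a blue live-action background:
print(over((255, 0, 0), (0, 0, 255), 0.5))  # -> (127.5, 0.0, 127.5)
```

Storing that per-pixel alpha alongside the colour channels is what lets a rendered CG element be dropped convincingly onto live-action footage.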

1985 – At OSU, Girard and Maciejewski published a paper describing the use of inverse kinematics and dynamics for animation. Their techniques have become an accepted part of 3D animation today.
Ken Perlin of NYU also published a paper on noise functions for textures, and applied the technique to add realism to character animations.
‘The Last Starfighter’ was the first live action feature film to use realistic computer generated models in place of traditional models.
Abel and Associates produced ‘Brilliance’, a commercial which featured a sexy female robot which had convincing realistic motion.


1986 – ‘Luxo Jr’ by John Lasseter was nominated in the AMPAS Animated Short Category.
Softimage opened in Montreal and VIFX in Los Angeles.


1987 – John Lasseter at Pixar published a paper describing traditional animation methods.
Blue Sky Studios opened in New York.
Side Effects Software opened in Toronto.
Stanley and Stella: Breaking the Ice by Symbolics Graphics and Whitney Demo Productions contains early flocking animation.


1988 – ‘Locomotion’, a short film by Pacific Data Images was an early example of squash and stretch.

1989 – ‘Knickknack’, a short film by Pixar.
‘Don’t Touch Me’ by Kleiser-Walczak was an early character animation using motion capture techniques.

1991 – Terminator 2 was the first mainstream blockbuster movie to use multiple morphing effects and simulated natural human motion. This set the standard for CG animation and effects to follow.

1992 – ‘Aladdin’ is Disney’s first film to contain a fully computer animated character.
‘The Seven Wonders Of The World’ by Electric Images highlights a new way to produce architectural visualisation by pushing back the boundaries.

1993 – Jurassic Park sets the standard for using Inverse Kinematics in creating realistic living creatures.
Babylon 5, the sci-fi TV series is produced with entirely off the shelf computers and software using Lightwave 3d and Amiga computers.

1994 – The wildebeest stampede in Disney’s ‘The Lion King’ is a spectacular integration of 3d computer animation flocking systems with traditional animation.

1995 – Toy Story was released to huge commercial and critical success and was the first full length 3D CG feature film.

1996 – ‘The Fight’, a short film by Acclaim Entertainment proves the viability of motion capture for character animation.

1997 – Pixar’s ‘Geri’s Game’ by Jan Pinkava is modelled with subdivision surfaces.

1998 – ‘Antz’ by Dreamworks/PDI
‘A Bug’s Life’ by Disney/Pixar
‘Bunny’ by Chris Wedge at Blue Sky Studios.
‘Bingo’ by Chris Landreth created using Alias/Wavefront’s new software, Maya.

1999 – ‘Toy Story 2’
Warner Brothers ‘Iron Giant’ uses computer animation to great effect in animating the title character.
‘Fishing’ by PDI presents a watercolour rendering effect.
‘Fiat Lux’ by Paul Debevec is a landmark in image based rendering.

2000 – Pixar’s ‘For the Birds’ by Ralph Eggleston
Victor Navones ‘Alien Song’ proves incredibly popular and is widely viewed on the web
The debut of PS2, XBOX and Gamecube boosted the quantity and quality of 3D animation being produced.

2001 – Pixar short ‘Mike’s New Car’ by Pete Docter and Roger Gould.
DreamWorks’ ‘Shrek’ won an Oscar in the newly created Best Animated Feature category. ‘Jimmy Neutron’ and ‘Monsters, Inc.’ were runners-up.
Lord of the Rings: The Fellowship of the Ring was released pushing the boundaries on 3D animation and compositing.
Final Fantasy: The Spirits Within is the first fully 3D animated feature film to contain CG actors. Although it was technically brilliant, it failed at the box office.

2002 – The Lord of the Rings: The Two Towers created a unique character in Gollum by using a combination of performance capture and keyframe animation.
Blue Sky Studios released ‘Ice Age’.

The Cartesian Co-Ordinate System

3D software packages use the Cartesian coordinate system to create the illusion of working in three-dimensional space.
This system is the same coordinate system used for teaching algebra.


It was a French mathematician called René Descartes who first developed the Cartesian coordinate system, in 1637.
He created this system in an effort to merge algebra and Euclidean geometry.
His work has played an important role in the development of analytic geometry, calculus and cartography.

The 2-dimensional Cartesian system is commonly defined using two axes, perpendicular to each other, as you can see in the picture below.
The horizontal axis is labelled X.
The vertical axis is labelled Y.
These axes form the X-Y plane.

The point where the X and Y axes meet is known as the origin.
This is labelled O.
This origin represents the centre of the coordinate universe.

The Z axis was added in the 19th century.
The Z axis is called the depth axis; it runs at right angles to the X-Y plane and also extends forever in both directions. This third axis enables us to locate any point in three-dimensional space.
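Locating a point, then, is just a matter of giving its (x, y, z) triple, and Pythagoras applied along each axis gives the straight-line distance between any two points. A small sketch (the `distance` helper is my own example):

```python
import math

# A point in the Cartesian system is an (x, y, z) triple. The distance
# between two points is sqrt of the summed squared differences per axis.

def distance(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

origin = (0, 0, 0)   # the point O where the axes meet
point  = (3, 4, 12)  # 3 along X, 4 along Y, 12 along Z (depth)

print(distance(origin, point))  # -> 13.0
```

Every transform a 3D package performs, moving, rotating or scaling an object, is ultimately arithmetic on coordinates like these.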


Dreamworks CG Pipeline

The Dreamworks Animation team use the same stages of production as Pixar when making a film.
I suppose it’s a worldwide standard technique that is used to create CG films.


Every aspect of this picture above was created by computer, every blade of grass, every leaf, every strand of fur and every ray of light. A lot of hard work and sleepless nights go into an animated film and this is why it can take over four years to make one.

Stages of Production


Films start off with ideas and concepts. Some of these ideas are completely original, while others are drawn from various sources: children’s fairy tales, songs, films from the past, childhood memories and many more. Once an idea has been settled upon, a script needs to be written; this is the blueprint for the film and gives everyone a view of what will happen.

This is a picture of the script from the film ‘Shrek’.

When a script is prepared and ready it is handed to the studio’s Storyboard Artists. This can be a fairly hard job, as they have to translate the words into pictures. They do this by making a series of sketches to bring the story to life, which has similarities to a comic book. There could be hundreds or even thousands of storyboards drawn through the pipeline.

This is a picture of a storyboard from the film ‘Shrek’.

Visual Development
Once the script and storyboards are set, the studio’s Visual Development Department plans how the film will look. The department develops the look and feel of each and every sequence. Everything in the entire film has to be designed, ranging from the main characters to the most minuscule prop. This stage consists of drawings, sculptures, paintings and blueprints, all forms of concept art.

Concept art from Madagascar.

Once the studio has the storyboards and characters designed and ready to go, voices are needed to record the characters’ lines in the film. Casting for animated movies is very different from casting for live-action films, as the studio picks the actor for the way they sound rather than the way they look. The studio will record the actors before they start to animate, and will sometimes videotape the recording sessions, making sure they capture key expressions and reactions.

Picture of an actor recording sound for animated film ‘Madagascar’


From the initial character designs the studio’s Modelers create a 3D model that will later be used for planning and animation. The models can be created in programs such as 3D Studio Max.

Model of character from ‘Madagascar’

The modelers start with an armature, a wire frame around the character’s model. Armatures break down character models into workable geometry, giving the animator the ability to move the 3D figure in any way they want. This is called ‘rigging’.
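The payoff of rigging can be sketched with a tiny forward-kinematics example (my own illustration; `limb_positions` is not a real studio tool): once the joints are set up, the animator only changes joint angles, and the limb geometry follows automatically.

```python
import math

# Two-bone limb in 2D: a shoulder joint and an elbow joint. The animator
# sets the two angles; the rig works out where the elbow and hand end up.

def limb_positions(shoulder, upper_len, lower_len, shoulder_deg, elbow_deg):
    """Return the elbow and hand positions for the given joint angles."""
    a1 = math.radians(shoulder_deg)
    elbow = (shoulder[0] + upper_len * math.cos(a1),
             shoulder[1] + upper_len * math.sin(a1))
    a2 = a1 + math.radians(elbow_deg)  # elbow angle is relative to the upper arm
    hand = (elbow[0] + lower_len * math.cos(a2),
            elbow[1] + lower_len * math.sin(a2))
    return elbow, hand

# Arm held straight out along X, then the elbow bends 90 degrees upward:
elbow, hand = limb_positions((0, 0), 2.0, 1.0, 0, 90)
```

Animating becomes a matter of keyframing two numbers per joint instead of repositioning every vertex of the mesh by hand.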

This is an image of the above model with an armature on it.

Basic Surfaces
Once the armature has been set up the studio adds basic surfaces to the character. This is the state the studio needs the character to be in for their next step.

Model above with basic surfaces added.

This step is down to the studio’s Layout Artists, who use rough renditions of the characters to put together the movement of the character(s) in the scene(s). The layout stage is where the studio draws up the blueprint for camera movements, character placements, spacing, lighting, geography and scene timing. (A good example of this, which you should all remember, was when Mike showed us the video of how the movie ‘Antz’ was made. It had a clip where the ants were moving in a limited way and it looked very basic. That was the layout stage.)

Layout stage from the film ‘Shrek 2’


Character Animation
Once the Layout Artists have everything working well, they hand it over to the Animators. The animators start bringing the characters to life using the many controls created in the character rigging phase, and they also synchronise the characters to the voice recordings. The characters look pretty impressive at this stage, but not as impressive as they will when completely finished.

This is an image of the Character Animation stage from ‘Shrek 2’

After the character animation stage, the next step is adding lighting and visual effects. With live-action films it is fairly easy to capture things like nature and people’s expressions, but in computer animation these all have to be designed and created by the Effects Artists. The effects artists start with what the character animators give them and turn it into something special.

This is an image of a roughly animated character from the film ‘Madagascar’

What the effects artists do next is add basic effects to the scene like the transparency effect in the water, like in the picture above.

Next the effects artists will add further, more detailed effects onto the scene. In the picture above you can see how there are reflections and shadows in the water. All of these are made by the computer.

Then the effects artists complete the scene with even more effects. The artists add foam to the surface of the water which is a realistic effect when waves occur in the sea and they also add bubbles under the surface. They then finish it off by adding spray and splashes to the water. When all the elements have been composited it is then sent to the Lighting Department for the final textures.


Finishing Touches
The end of the process is here: the studio is now ready to add the sound effects, add the final score, mix the soundtracks, correct the colour and release the film worldwide.
As you can see, the production pipeline for Dreamworks Animation doesn’t differ that much from Pixar’s. This is mainly because the big animation studios use techniques that are standard worldwide. I believe the only things that differ between studios are the people they employ and the skills and personal techniques those employees use when making a film. I believe each studio would take a similar amount of time and follow the same processes.

Making a film is like people following a recipe to cook a meal, they would each have to follow the same recipe but some people would add their own personal preferences.

That’s just my hypothesis though. =|

Pixar Pipeline Research

When the people at Pixar make a film they take it through four stages, these are: 

  • Development
  • Pre-production
  • Production
  • Post-production

Development is where they create the storyline of the film. Somebody will present an idea for a film and, if it is good, storyboards are created and concept art is drawn up. Along with this a script will be looked at and maybe changed and manipulated into a form that is more suitable for audiences.

Voice talent is also recorded; temporary voices, often recorded by the Pixar artists themselves, are used for storyboard reels to give the film a bit of “feel”. Later in the production line professional actors come in and re-record the voices, though many times the artist’s rendition is good enough to be kept in the final film.

Reels are then made; these are essential for validating the sequence and are the first instance in which the timing is understood.

The art department then creates the characters’ look and feel, capturing the emotions and characteristics of each character, often doing this quite well. They work from the storyboards and their own initiative to create a character. They also design sets, props and visual looks for surfaces.

Models are then sculpted and/or created in 3D software and are given controls called “avars” which allow the character to move. These work like hinges.

After this the sets are then dressed allowing the director to encapsulate the look and feel for the film.

The shots are then laid out; the layout crew choreographs the characters in the set and uses a visual camera to capture the shot for the scene. They often produce multiple versions of shots to give the editorial department a choice for cutting the scene, often maximising storytelling effect. Once the scene has been cut it is released for animation.

The animators then take over choreographing movements and facial expressions in each scene using the characters “avars” at their disposal.

Lighting completes the look and feel of the film, creating different scenes and making the film feel and look realistic.


Here you can see a picture taken of some storyboards for the film Finding Nemo. A storyboard consists of rows upon rows of sketches, all annotated and pinned to a board. This gives the makers an idea of what is supposed to happen and when; they can rearrange anything they want and change the images as they wish. Storyboards can take up to six months to create.




This is a picture of a rather large piece of conceptual art from the latest movie Ratatouille. Concept art is used to create a visual representation of a design idea. Concept art is used in films, games and comic books.


These are two models of characters used in the Pixar films “A Bug’s Life” and “Finding Nemo”. Models are used to show what a character will look like and to convey their emotions.


This is a piece of conceptual art depicting a scene in the film. You can see how the picture shows facial features and the artists and animators will be able to see what is going on in the scene.


This image is the above concept art turned into the finished scene. You can see it has the lighting, materials and textures all added into the scene.




This is a computerized zoetrope with characters from Toy Story 2 on it. It would spin around and create a 3D animation of the characters. You could manipulate the characters and then see what a walk cycle or action would look like.