The world of 3D is huge. The sheer number of industries, software packages and tools involved in learning 3D can be daunting. Before you ever drop money on the best 3D software, you need to know exactly what you want to learn how to do. In this article you'll get a crash course in some of the most popular fields related to 3D and the steps within the pipeline. That way you can find out exactly what interests you most, what programs you'll need, as well as some great first steps to starting out on the right path toward learning 3D.
What do you want to do?
If you're reading this, you probably already have an idea of what you want to do.
For your convenience, you can use the links below to jump to the sections
relevant to you.
I want to make movies or TV shows
I want to create product designs
I want to make games
I want to create my own art
I want to create visual effects
I want to make movies or TV shows
The process of making an animated movie can take years; three to five years is common for a typical feature. TV shows generally don't take as long simply because of their shortened production schedules, but both movies and TV shows can take hundreds of well-trained artists to reach the final product. Needless to say, making a movie or TV show is no small undertaking.
Working in the television or film industry as a 3D artist can mean working on animated movies like Toy Story, live-action movies with integrated CG like The Avengers, or hit TV shows like Doctor Who. Regardless of whether the project is a fully animated feature or a mixture of live action and 3D, the disciplines are the same; only the creation process varies between the two.
Using an animated movie as an example: when a company like Pixar makes one, they first need to come up with the story and create the concepts for the characters and the environments those characters will be interacting with. Once that has been completed, the process of actually creating the 3D world begins. This is where the 3D artist shines, as they get to model, texture, animate and render all of the virtual worlds and characters.
When working on a TV show, the process is similar to working on feature films and the same disciplines are involved. Everything is planned outside of the 3D world in much the same way, but with TV shows the deadlines are much tighter. Typically a TV episode will be released once a week, whereas a movie can take years to create.
Finding where you fit into the pipeline would be the next step. Use the links below
to jump to more detail about how to get started with that step of the pipeline.
Modeling is the process of taking the 2D concepts and building them in 3D. Once
a model is created, they need to have things like color and materials applied to
them. This is the process of texturing. Before motion can be added to a model, it
needs to have a skeleton built during the rigging step. Once rigged, a model can
be brought to life through animation. After the model has been animated, it needs
to be rendered to turn it into a series of still images that will eventually turn into
the final movie.
Last, but not least, while compositing isn't really considered a step of the 3D pipeline, it is important enough for making movies and TV shows that we wanted to include it because, in some instances, it can include some 3D work. Compositing is the process of taking the rendered images and adding polish, whether that's integrating them into live footage (VFX) or adding the final touches to an all-3D animation.
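At its core, compositing boils down to blending layers of pixels using opacity (alpha) values. A minimal sketch of that blend for a single pixel (the function name and values are illustrative, not from any compositing package):

```python
def composite_over(fg, bg, alpha):
    """Blend a foreground pixel over a background pixel.

    fg, bg: (r, g, b) tuples with channels in 0.0-1.0
    alpha: foreground opacity, 0.0 (transparent) to 1.0 (opaque)
    """
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

# A half-transparent white element over a pure blue plate:
pixel = composite_over((1.0, 1.0, 1.0), (0.0, 0.0, 1.0), 0.5)
# pixel == (0.5, 0.5, 1.0)
```

Tools like NUKE and After Effects apply this kind of blend, per pixel, across entire layer stacks.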
Because many aspects of creating game art are very similar to creating art for movies, it is common for an artist working in the movie industry to switch over to the game industry, or vice versa. Learning any discipline within the pipeline can help prepare you for either the game or the movie industry. It's more about which one appeals to you most.
Finding where you fit into the game art pipeline would be the next step. Use the links
below to jump to more detail about how to get started with that step of the pipeline.
Just like movies, every game asset needs to be modeled. This is the process of taking the 2D concepts and building them in 3D. Because games need to run smoothly on a gamer's computer or console, there are restrictions on how much resolution the models can have. As a result, a lot of extra detail is faked for game models in the texturing process. Once the model is looking good, it needs to be rigged before motion can be added to it. After that, a model needs to have a number of animations created for it. For example, a character would need both a walk animation and a run animation so that when the gamer tells the character to go from walking to running, the game engine can make the character actually look like it is running as it moves faster.
Texturing
Once a 3D model is built, the computer doesn't know what sort of surface is being created, so the model starts out with a single, flat color. For example, should the wall of a house have wallpaper on it, or is it painted? These sorts of decisions are made in the texturing step.
Texturing is required in all of the fields mentioned above. In the film industry, texture artist is often a dedicated job within the pipeline, while in the game industry a 3D modeler will often be required to create textures for the models they create. Just like modeling, Maya, 3ds Max, CINEMA 4D, and Softimage are some great applications you can use to get started with texturing in 3D.
Ironically enough, though, jumping into a 3D application really isn't the best first step for an aspiring texture artist. Before jumping into a 3D application it is incredibly important for you to be familiar with an image editing application like Photoshop, simply because a lot of textures start off from a photograph, or at least a photographic reference. To get up to speed in Photoshop, follow along with this learning path.
If you're already familiar with Photoshop, you can jump ahead and start learning
how to texture in your favorite 3D application:
Texturing in Maya
Texturing in 3ds Max
Texturing in Softimage
Rigging
In the film and game industries there are artists who are responsible for creating the skeletons for the characters. These skeletons are called rigs. In order for an animator to be able to bend and deform a 3D asset, a rigging artist, or technical director as they are often called, will first need to set up a rig. This is done by creating all of the control points on the 3D model that an animator will need in order to bend and deform the model to create the animations.
In the film industry a character rig will most likely have hundreds of controls for an
animator to be able to manipulate. These rigs can take weeks to finish and a lot of
technical prowess is required for this.
The same process is required for video games, but the complexity of the animations that need to be done determines how complex the rig needs to be. For example, a character in a game may never need to speak, so there would be no need to create a facial control rig.
To get started with rigging, the first step is to determine which software you want
to use. Software such as Maya, 3ds Max, CINEMA 4D, and Softimage are all
capable of creating powerful rigs. Check out their interfaces and workflows in
some of our tutorials to get an idea of which one may interest you the most. Then
grab a demo and start to dig in!
Depending on which software program you like the best, here are some tutorials
to get you started off on the right path:
Quick Start to Rigging in Maya
Quick Start to Rigging in 3ds Max
Rigging in CINEMA 4D
Rigging in Softimage
Animation
The animator's job is to make the required 3D assets move in a believable way. For example, when a character in a 3D animated movie moves, every movement was created by an animator. Take a look at how the characters in Toy Story move: every movement those characters make had to be created by an animator. Animation is heavily utilized in most of the above fields, especially movies and video games, and each one differs a bit from the other.
For example, when working on a 3D animated movie like Tangled, the animators
typically create three to four seconds of finished animation every week. With
animation in video games, an animator might be tasked to create 20 seconds of
animation in a single day.
Rendering
Rendering is most commonly done toward the end of production. In the same way a 2D artist finishes a piece by adding lighting and shading to their paintings or drawings, 3D rendering allows the 3D artist to incorporate shading and lighting into the 3D scenes.
Rendering in a 3D movie can be a lengthy process. For example, a single frame from Monsters University took 29 hours to complete.
Rendering in a game must be done in real time. This means a single frame can't take hours to render; it is up to the console's (or computer's) graphics card to produce the render, and it must be done while the viewer is playing. The reason movies typically look much better graphically is that each frame is pre-rendered, so as much time as needed can be devoted to getting the final quality.
Like many of the other steps in the pipeline, the first step in learning how to create beautiful renders is to determine which software you want to use. Software such as Maya, 3ds Max, CINEMA 4D, and Softimage are all capable of creating beautiful renders. A great way to figure out which one interests you is to start by simply checking out the interfaces and workflows in some of our tutorials. This will let you see how to move around in the application and get an idea of which one may interest you the most. Then grab a demo of the one(s) that piques your interest and dig in!
Depending on which software program you like the best, here are some tutorials
to get you started off on the right path:
Rendering in Maya
Rendering in 3ds Max
Rendering in CINEMA 4D
Rendering in Softimage
Compositing
A compositor is utilized most commonly in the VFX field. A compositor can composite separate images into a live-action background to create the illusion that all the elements are there. For example, if you've ever seen a behind-the-scenes look at a sci-fi movie, you may have noticed the actors standing in a strange green room. In the finished movie, it looks as if the actors are in a sci-fi city or some other location. It's the compositor's job to add all the separate elements to give the illusion that the actors are in another universe.
Compositing is an important role within the VFX pipeline; without compositors there would be no way to integrate 3D elements into a live-action scene.
Some of the most important software applications for compositing are NUKE and
After Effects. A great first step to becoming familiar with these powerful tools is to
follow along with one of these learning paths:
Final thoughts
Once you've figured out the path you want to pursue, it's up to you to become great at whichever path you choose. Great artists aren't born overnight; it can take years to truly master one of these subjects. Don't get discouraged if what you're making doesn't look like what Pixar is producing. Remember that their movies take hundreds of experienced artists years to create. Practice, practice, practice, and with anything you will become better at it.

One of the best ways to become better is to get feedback from fellow artists, so find a community where you can share your work. It's important that you can take constructive criticism well; don't take it personally. Visit our forums to post your work in progress and get feedback from fellow artists.
When you're ready to dive in, make sure to take advantage of training on every step of the 3D pipeline. Keep learning more about jumping into 3D by choosing your first 3D application and getting a better understanding of the 3D pipeline.
Modeling: Assets to be used in anything from animations to VFX shots are all
modeled from scratch, or adapted from other models, using a variety of
techniques to meet project requirements. This phase takes all of the concepts
from pre-production and starts bringing them to life. Assets are usually modeled
with the style or concepts set forth in the pre-production phase. If you have a
sculpting application, like ZBrush or Mudbox, you'll also be able to digitally sculpt
assets or add more detail to your models.
Modelers often start with a completely empty 3D scene and build up the 3D
geometry to look like anything from simple props, or environments, to complex
characters. A 3D model is made up of a series of points called vertices that are
connected to form a mesh. These vertices have all been meticulously placed by a
3D modeler. It's one of the first and most important steps in the 3D pipeline
because it is essentially the creation of the assets that all of the other steps in the
3D pipeline will use.
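The vertex-and-face idea described above can be sketched with plain data structures. This is an illustrative simplification, not any particular application's file format:

```python
# Minimal sketch of how a 3D application stores a mesh: a list of
# vertices (points in 3D space) plus faces that index into that list.
# Here, the eight corners of a unit cube:

vertices = [
    (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),  # bottom four corners
    (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1),  # top four corners
]

# Each face is a quad listing four vertex indices; together the six
# faces form the cube's surface mesh.
faces = [
    (0, 1, 2, 3), (4, 5, 6, 7),  # bottom, top
    (0, 1, 5, 4), (2, 3, 7, 6),  # front, back
    (1, 2, 6, 5), (3, 0, 4, 7),  # right, left
]

def face_center(face):
    """Average the face's vertex positions to find its center."""
    pts = [vertices[i] for i in face]
    return tuple(sum(axis) / len(pts) for axis in zip(*pts))

# The bottom face sits on the ground plane, so its center has z == 0.
```

Every modeling operation, from extrudes to bevels, ultimately edits lists like these.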
Painting and Texturing: Your awesome models, or assets, reach the next step
where color and textures will take over that gray look, known as a default shader.
This step is where you will learn about materials, shaders, textures, maps and all
of the ways you can add physical textures and color to your models.
A texture artist will work with what is referred to as a shading material that, when
applied to a 3D model, gives the artist the ability to control things like color,
reflectivity, shininess, and much more. This way, what was once a 3D model with
a solitary color can be transformed into a 3D model that looks a lot more realistic
with colors and materials applied.
If someone else is handling the rigging and animation, the model might be getting rigged at the same time you or someone else is texturing it.
Rigging: Your models are ready to move, but how do you get them ready to jump
across the screen? Rigging, like all steps of the pipeline, is an important step in
making this happen. By creating rigs for your characters and objects that move,
you or an animator can control the movement to create life-like or stylized
animations.
Before an animator can begin the animation process the computer needs to be
told how the 3D models can move. For example, should a 3D model that looks
like an arm be able to bend at the elbow like a realistic arm? Or can it be
stretched into wacky shapes like a cartoon character's arm? Setting this up and
telling the computer the range of motion for each part of the 3D model is what the
rigging process accomplishes. This is done by creating control points on the 3D
model that an animator can bend and deform to create the animations. Think of it
as the skeleton for the 3D model.
Animation: In the animation phase, rigged assets are animated using controls to
match the desired shot. A lot goes into creating seamless animations, but this is
where you really see everything come together and results you can show others.
Using a timeline, an animator will set movement in frames that play back as an
animation.
Animation can mean anything from adding motion to a piston for an engine, all the
way to creating complex character performances that you see in the latest 3D
animated movies. 3D animation is essentially a digital version of 2D animation and
uses the same concept as a flipbook. Except instead of creating a new
pose on each sheet of paper, 3D animators create poses on a series of still images
that are referred to as frames. By creating a series of poses and playing it over a
certain amount of frames you can create the illusion of movement. It is an animators
job to make the 3D characters and objects come to life in a believable way.
Dynamics: Closely tied to the animation step of the pipeline, dynamics allows you to create simulations of real-world forces, like a model shattering on impact with the ground. Dynamics saves you time, instead of hand-animating everything, and usually produces better results that you can fine-tune.
Lighting: As the name might imply, lighting is the step where you can control
most of the light elements of your scenes and shots (you may touch lighting in both rendering and even a 2D application like Photoshop after creating the render). Lighting lets you control everything from where the sun is in a shot to how much glow a light in the scene might have. While it sounds easy, and it will be with practice, lighting can add the exact feel you want a shot to portray.
Polygon geometry
NURBS surfaces
Subdivision surfaces
Modeling Terms
Starting your 3D modeling journey is an exciting and rewarding experience. As you begin to
learn and practice, there are essential terms you need to know and remember to grow your
modeling skills. Use this resource to learn key terms and as a reference when creating your
own 3D models.
Vertex
A vertex is simply a point in 3D space. Vertices are the building blocks a mesh is made of.
Face
A face is the surface enclosed by three or more connecting edges.
Edge
An edge is a straight line connecting two vertices, with faces on either side of them.
Topology
Topology describes how the vertices, edges and faces are laid out across a model's surface.
Triangle
A triangle is a face with three sides.
Quad
A quad is a face with four sides, and is generally the preferred face type for modeling.
N-Gon
An n-gon is a face with more than four sides. N-gons are best avoided because they can cause problems when a mesh is deformed or animated.
Extrude
Extruding is one of the primary ways of creating
additional geometry on a mesh. The extrude
command allows you to pull out extra geometry
from a face (polygon in 3ds Max), edge or vertex.
Edge Loop
An Edge Loop is a series of edges connected
across a surface, with the last edge meeting the
first edge, creating a ring or loop. Edge loops are
especially important for maintaining hard edges
in a mesh, and also for more organic models.
For example, in order for an arm to deform properly
there will need to be edge loops on each side of the
elbow joint so there is enough resolution.
Beveling
Pivot point
Normals
Instances
When working with a 3D set you will often need to create
duplicates of a single object, whether its hundreds of trees or
fence posts. Doing this, however, can greatly increase render time, because the computer has to calculate all of this new geometry. Instead of creating a duplicate of an object, you can
create an instance.
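The memory savings come from every instance referencing the same mesh rather than carrying its own copy. A minimal sketch of that idea (class names are ours, not from any particular package):

```python
# Sketch of why instancing is cheap: every instance shares one mesh
# and stores only its own transform, instead of duplicating geometry.

class Mesh:
    def __init__(self, vertices):
        self.vertices = vertices  # the heavy geometry data, stored once

class Instance:
    def __init__(self, mesh, position):
        self.mesh = mesh          # reference to the shared mesh
        self.position = position  # per-instance data stays tiny

tree_mesh = Mesh(vertices=[(0, 0, 0), (0, 5, 0)])  # stand-in geometry
forest = [Instance(tree_mesh, (x * 3.0, 0.0, 0.0)) for x in range(500)]

# 500 trees, but only one copy of the vertex data in memory:
all_same = all(inst.mesh is tree_mesh for inst in forest)
```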
Construction History
While you're working on your 3D models you will likely use a wide range of tools to get the desired result. For example, you may need to extrude many different faces or bevel the edges of the model to create a particular shape. Most 3D applications keep track of all of these operations as the model's construction history. This displays a list of every tool you've used on your 3D model, in the order that you used them.
Digital Sculpting
When creating 3D models in an application like Maya the process includes manipulating
vertices and edges to get the desired look. While this works, it can be hard to get fine detail
that is often required, especially in organic models. Digital sculpting works around this issue by
allowing you to create your 3D meshes in much the same way as a traditional sculptor would.
Base Mesh
A base mesh is a simple, low-resolution model used as the starting point for digital sculpting.
If you are new to the 3D texturing process, chances are you've heard some terms being tossed around that you might not fully understand. This article will go over some of the most common texturing terminology you are likely to encounter, so you will be more comfortable when deciding which map to use, and won't be left wondering what that term they just referenced means.
Texture Mapping
Shaders
TEXTURING TERMS
UV Mapping
A 3D object has many sides, and a computer doesn't know how to correctly put a 2D texture onto the 3D object.
Specularity
Normals
A UV map is basically the 3D model stretched out into a flat 2D image. Each face on your
polygon object is tied to a face on the UV map. Placing a 2D texture onto this new 2D
representation of your 3D object is much easier.
Transparency Maps
Transparency maps are grayscale textures that use black and white values to signify areas of transparency or opacity on an object's material. For example, when modeling a fence, instead of modeling each individual chain link, which would take a significant amount of time, you can use a black and white texture to determine which areas should stay opaque and which should be transparent.
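The black-to-white mapping can be sketched in a few lines; the function name and values are illustrative:

```python
# Sketch of how a grayscale transparency map drives opacity: white
# (1.0) keeps a pixel fully opaque, black (0.0) makes it fully
# transparent, and grays fall in between.

def opacity_from_map(value):
    """Map a grayscale sample (0.0-1.0) to an opacity value."""
    return max(0.0, min(1.0, value))

# One row of a chain-link fence texture: metal links are white,
# the gaps between them are black.
row = [1.0, 0.0, 1.0, 0.0, 1.0]
opacities = [opacity_from_map(v) for v in row]
# The renderer sees straight through every pixel with opacity 0.0.
```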
Bump Maps
A bump map gives the illusion of depth or relief on
a texture without greatly increasing render time.
For example, the raised surface on a brick wall
can be faked by using a bump map. The computer
determines where the raised areas on the image are by reading the black, white and grayscale data in the graphic. In other words, bump maps encode height information using black and white values.
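A shader recovers slope from those grayscale heights by comparing neighboring samples. A minimal 1D sketch of the idea (real bump mapping works per pixel in two dimensions and perturbs the surface normal):

```python
# Bump maps store height as grayscale; at render time the shader
# derives surface slope from neighboring samples to fake relief.

heights = [0.0, 0.0, 1.0, 1.0, 0.0]  # a raised "brick" in the middle

def slope_at(i):
    """Central finite difference between the neighbors of sample i."""
    left = heights[max(i - 1, 0)]
    right = heights[min(i + 1, len(heights) - 1)]
    return (right - left) / 2.0

slopes = [slope_at(i) for i in range(len(heights))]
# Flat areas give slope 0.0; the brick's edges give non-zero slopes,
# which the shader turns into lighting changes without new geometry.
```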
Normal Maps
A normal map creates the illusion of detail without
having to rely on a high poly count. For example, a
character can be detailed into a sculpting program
like ZBrush, and all the information can be baked
onto a normal map and transferred to a low poly
character, giving the illusion of detail without
increasing the actual poly count for the model.
Game studios utilize normal maps often because they need to stay within a tight polygon budget but still want their models to look as detailed as possible.
Baking
In your typical 3D scene you will want to shade, texture and light objects to create the exact
look that you want, and then you render. To shorten render times you can bake all the
materials, textures and illumination information into an image file. For instance, you could
bake all the lighting information directly onto an existing texture, render it once, and then
delete the actual lights used in the scene. This is great for games, because otherwise a light would need to be recalculated every new frame.
Now that you have familiarized yourself with these common texturing
terms, you're one step closer to building textures for 3D models. See
them in action in the CG101: Texturing tutorial before taking the leap into
any texturing tutorials.
Joints
Sometimes called bones, you can think of joints for rigging in the same way you think of joints in a human body.
They basically work in the same way. Joints are the points of articulation you create to control the model. For
instance, if you were to rig a character's arm you would want to place a joint for the upper arm, another joint for the
elbow and another joint for the wrist, which allows the animator to rotate the arm in a realistic way.
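The arm example above can be sketched as a simple 2D joint chain, where each joint's rotation is accumulated down the hierarchy (forward kinematics); names and numbers are illustrative:

```python
import math

# Each joint stores a length and a rotation relative to its parent;
# forward kinematics accumulates them to find world-space positions.

def chain_positions(joints):
    """joints: list of (length, angle_degrees) relative to the parent.
    Returns the world-space (x, y) position of each joint's end."""
    x = y = 0.0
    angle = 0.0
    positions = []
    for length, rel_angle in joints:
        angle += math.radians(rel_angle)
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((round(x, 6), round(y, 6)))
    return positions

# Upper arm straight out, forearm bent 90 degrees at the elbow:
arm = [(2.0, 0.0), (1.5, 90.0)]
# The elbow ends up at (2, 0) and the wrist at (2, 1.5).
```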
Driven Keys
To speed up the animation process for the animators, a rigging artist
can utilize driven keys when rigging a character. Driven keys allow you
to use one control or object to drive multiple different objects and
attributes. For example, we can use a driven key to control a fist, clenching all of the finger joints with a single driver attribute.
A driven key contains two parts: the driver and the driven. The driver
is the object in control of the animation. The driven is the objects and
attributes that are being controlled by the driver. Typically for regular
keyframes an attribute has values keyed to time in the time slider. For
a driven key, the attribute has values keyed to the value of the driving
attributes. The driver can be another object or, as in the fist example, a custom attribute on a control.
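The driver/driven relationship can be sketched as a value-to-value mapping with interpolation between the keys (a simplified illustration; real driven keys use the application's animation curves rather than straight lines):

```python
# The driven attribute's value is keyed against the driver's value
# instead of against time; between keys, the rig interpolates.

def driven_value(driver_value, keys):
    """keys: sorted list of (driver_value, driven_value) pairs."""
    if driver_value <= keys[0][0]:
        return keys[0][1]
    if driver_value >= keys[-1][0]:
        return keys[-1][1]
    for (d0, v0), (d1, v1) in zip(keys, keys[1:]):
        if d0 <= driver_value <= d1:
            t = (driver_value - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)  # linear blend between the keys

# A single "fist" driver attribute (0 = open, 10 = clenched) drives
# a finger joint's curl rotation from 0 to 90 degrees:
fist_keys = [(0.0, 0.0), (10.0, 90.0)]
curl = driven_value(5.0, fist_keys)  # halfway clenched -> 45.0
```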
RIGGING TERMS
IK (Inverse Kinematics)
Animate bones by going up the hierarchy.
Great for keeping the hand planted while
animating the upper body.
FK (Forward Kinematics)
Animate bones by moving down
the hierarchy, or forward. Great for
smooth arcing movements.
FK can take more time than IK, but it can give the animator much more control over the poses. Most of the time, riggers will incorporate both FK and IK into the rig to meet the animator's needs.
Constraint
Constraints are very important in both the rigging and animation processes. Typically your 3D application will have several options for constraining. Constraints limit an object's position, rotation and scale based on the attributes of the parent object. For example, by taking two separate spheres, applying a parent constraint, and deciding which is the parent and which is the child, you can move just the parent and the other sphere will follow.
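For translation only, the parent-follows idea can be sketched in a few lines (a real parent constraint also carries rotation and scale; names here are illustrative):

```python
# The child's world position is its own offset applied on top of the
# parent's position, so moving the parent drags the child along.

def constrained_position(parent_pos, offset):
    """World position of a parent-constrained object (translation only)."""
    return tuple(p + o for p, o in zip(parent_pos, offset))

parent = (0.0, 0.0, 0.0)
offset = (1.0, 2.0, 0.0)  # where the child sits relative to the parent

child = constrained_position(parent, offset)  # (1.0, 2.0, 0.0)
parent = (5.0, 0.0, 0.0)                      # animate the parent...
child = constrained_position(parent, offset)  # ...and the child follows
```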
Control Curves
Control curves are created by the rigger to assist the animator in manipulating joints within the rig. Typically a rig consists of many components that need to be manipulated to move the character into the desired pose. This can be very difficult to do without control curves, because the animator would need to hide the mesh to see the skeleton within the character and try to determine which joint manipulates the elbow, for example. Control curves are typically simple NURBS curves placed outside of the character so the animator can easily select the curve to position the character instead of the actual joint.
Deformers
Skinning
Weight Painting
Facial Rigging
When creating complex character rigs, the facial rig setup is often a whole different monster. A typical joint or bone setup doesn't work well for a facial rig, other than having a joint for the jaw, because facial movement often requires very stretchy and organic motion. Instead of a normal joint setup, facial rigging usually requires deformers (mentioned above) and blend shapes.
Blend Shapes
A blend shape, or morph depending on your 3D application, allows you to change the shape of one object into the shape of another object. When rigging, a common use for blend shapes is to set up poses for facial animation. These might be lip-sync poses or more complex expressions like a smile or frown. You can tie all these new poses into the original face mesh and have them operate on one control slider.

For example, if you want to raise an eyebrow, you can model a face pose with one eyebrow raised, connect it to a blend shape, and use a slider with a value of 0 to 100 to raise or lower the eyebrow. This is a great way for the animator to quickly make face poses without having to move individual facial controls around. There are some downsides to using blend shapes for facial poses, because their editability can be limited. Riggers will often give the animators both blend shape options and traditional control points so they can use them in conjunction.
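Under the hood, a blend shape is a per-vertex interpolation between the base mesh and the target pose, scaled by the slider. A minimal sketch using the 0-100 slider described above (vertex values are illustrative):

```python
# The rig linearly blends every vertex from the base face toward the
# target pose, weighted by the slider value.

def blend(base, target, slider):
    """base, target: matching lists of (x, y, z) vertices; slider: 0-100."""
    w = slider / 100.0
    return [
        tuple(b + w * (t - b) for b, t in zip(bv, tv))
        for bv, tv in zip(base, target)
    ]

neutral_brow = [(0.0, 1.0, 0.0), (0.5, 1.0, 0.0)]  # two brow vertices
raised_brow = [(0.0, 1.2, 0.0), (0.5, 1.3, 0.0)]   # sculpted "raised" pose

half_raised = blend(neutral_brow, raised_brow, 50)
# Each vertex sits halfway between the two poses: y is about 1.1 and 1.15.
```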
Modeling with
Quads or Triangles
What Should I Use?
As a modeler, you've probably already encountered your own internal debate on using quads or tris. The truth is, while it can come down to preference, there are some key advantages to using quads to create your models. It's also not the end of the world (or your model) if you use both, but it is recommended to use as few triangles as possible to save yourself some major headaches down the line. Let's cover why quads are so beneficial for your work.
Triangle
N-Gon
Quadrilaterals
QUADS VS TRIANGLES
Edge loops
Sculpting
Subdividing
Smoothing
Animation
It's always good to look at how other artists solve topology issues in their work. If you want to go through a series of exercises exploring techniques for creating the best topology in your own models, follow along with some of our popular courses:
Skill-Builder: Mastering Topology in Maya
Modelers Toolbox: Topology Tips
Retopology Techniques in Mudbox
Adding Facial Topology in 3ds Max
If you're just starting your 3D modeling journey or you need a refresher on important modeling terms, take a look at an article covering key 3D modeling terminology.
you'll be able to get up and running and comfortable with the software so you can spend more time animating.
Body Mechanics and Animation in Maya: Pulling Objects, and Creating Run
Cycles in 3ds Max.
Push Yourself
As an animator, you should have a willingness to learn and be eager to try new things. Animation is never something that's truly mastered; there are always new things to discover, and it's never good to become complacent. Find new ways to enhance your skills, whether it's animating a type of creature you've never tried before or taking on a more subtle acting shot you're not used to.
Get Experience
Your dream job or goal may be to work at a studio like Pixar or Infinity Ward, but that doesn't mean you should hold out until you get there. Just because a studio that's interested in you isn't Pixar doesn't mean you shouldn't work there. Any experience you can get is good experience.
Sure, it might not be your true dream job, but you're animating and you're improving your skills. The more on-the-job experience you get under your belt, the more appealing you'll be to the bigger studios.
Level Your Expectations and Exceed Everyone Else's
It's important to keep in mind that you might not get the most amazing shots at your first job. You'll likely need to prove yourself, so be willing to take on any shot, no matter how small it may be. The animation supervisors will want to see how you handle yourself and how well you do on these simpler shots before giving you more complex ones.

For example, your first animation at a studio may be a quick little 24-frame clip of something like a hand opening a door or a character turning to look at something. Sure, those might not be the animations the audience will gawk over, like a shot where you need to animate fighting robots, but you need to approach every shot you do like it is.
Don't look at those animations, the shots the experienced animators hand off to the newcomers so they don't have to do them, as boring. Approach each one like it's the most important shot in the film. With each shot you do, the supervisors will see your eagerness and skill. Eventually more complex and exciting shots will be sent your way, but most likely not right off the bat.
You also need to remember that when you start animating professionally you're not animating for yourself anymore. When you were learning, you were animating your own shots and your own ideas. When you begin working at a studio, especially in movies, you're animating for the director. You're bringing their project to life, and it's up to you to make their idea and vision a reality.

That's not to say you shouldn't bring your own ideas to the table, because you will, but you need to be able to take feedback and criticism well. If the director doesn't like one of your choices and tells you to go in a different direction, chances are you will need to.
Animation is an extremely fun and challenging path to take. Now that you have an understanding of what it takes to become a successful 3D animator, it's up to you to take the next step.
Join a community to share your work with other animators and start learning with
the many 3D animation tutorials available, including Maya animation tutorials, 3ds
Max animation tutorials and CINEMA 4D animation tutorials.
If you have any questions or need some more advice, share in the comments
below.
Keyframe
A keyframe records the value of an attribute, such as position or rotation, at a specific frame. The software interpolates between keyframes to create motion.
Timeline
The timeline displays the frames of your animation and the keyframes placed on them, letting you scrub back and forth through the shot.
Frame rate
The frame rate is the number of frames per second. It's important to find out what frame rate your animation needs to be at before starting any animation, so you can be sure your animation will be timed right. For example, in film the frame rate is 24 frames per second, meaning 24 different images are displayed over the course of one second. As one of the more common frame rates, 24 frames per second is a great default if you're not sure what frame rate your project needs.
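The frames-and-seconds arithmetic is a simple multiply or divide, sketched here:

```python
# At a fixed frame rate, frames and seconds convert directly.

FPS = 24  # the common film frame rate discussed above

def seconds_to_frames(seconds, fps=FPS):
    return seconds * fps

def frames_to_seconds(frames, fps=FPS):
    return frames / fps

half_second = frames_to_seconds(12)   # 12 frames at 24 fps -> 0.5 seconds
shot_length = seconds_to_frames(5)    # a 5-second shot -> 120 frames
```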
ANIMATION TERMS
At 24 frames per second, 12 frames = 0.5 seconds.
Poses
A pose in animation
represents how the
character is positioned. You
can think of a pose the same
way a statue is posed. Except
in animation there are
many poses that make up
the animation. If you were
to freeze an animation at
any point in time, whatever
position that character is in
could be considered a pose.
Line of Action
The line of action is an invisible line that can be drawn along a character's pose. Typically there are a few main lines for a pose: a C shape, a backwards C shape, and an S shape. When posing your character, you'll want to ensure you have a strong line of action that resembles one of these shapes to help you establish a dynamic pose. An unappealing line of action would be a simple straight line that flows from your character's head to their feet.
Blocking
Blocking is an animation technique where the most important storytelling poses are
created to establish the placement of a character or object and how they will move in
the scene. This technique is used very early in the animation process and helps tell
the story of the animation. Blocking is often the first step in pose-to-pose animation.
Inbetweens
Not to be confused with a breakdown, an inbetween fills in what is happening between
the breakdowns for pose A and pose B. In computer animation the inbetweens are often
created by the computer. In traditional 2D animation, assistants to the animators
called inbetweeners often did this fill-in work.
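The computer-generated inbetweens mentioned above are, at their simplest, interpolations between two keyed values. A minimal sketch in Python, using a single rotation channel and linear interpolation only (real software also offers eased curves):

```python
# Sketch of how a computer generates inbetweens: linear interpolation
# between two keyed poses, reduced here to one rotation value per pose.
def inbetween(pose_a, pose_b, t):
    """t runs from 0.0 (pose A) to 1.0 (pose B)."""
    return pose_a + (pose_b - pose_a) * t

# Five inbetweens spread evenly between a 0-degree and a 90-degree key:
frames = [inbetween(0.0, 90.0, i / 6) for i in range(1, 6)]
print([round(f, 2) for f in frames])  # [15.0, 30.0, 45.0, 60.0, 75.0]
```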
Polish Pass
The polish pass is the very last step in an animation. This is the point when an animator
adds the very small finishing touches to the work: eye movements, finger adjustments,
tracking arcs, and so on are all usually animated in the polish pass, after all of the
main movements have been finalized.
Gimbal Lock
In computer animation, gimbal lock is the loss of one degree of rotational freedom in a
joint. It occurs when enough rotation has been applied that two of the three rotation
axes align, and the computer no longer understands which direction you want to rotate.
This will look like a very fast rotation hiccup when your animation is played back. To
avoid gimbal lock, choose a rotation order suitable for your animation; the rotation
order changes how each axis reacts as an object is rotated. Certain applications also
include tools that reinterpret rotation values causing gimbal lock, to help iron out
discontinuities in animated rotation data.
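The lost degree of freedom can be demonstrated with a little math. The sketch below is plain Python, not any 3D package's API: it composes Euler rotations in X-Y-Z order and shows that when the middle axis sits at 90 degrees, two different sets of angles produce exactly the same orientation.

```python
import math

def rx(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_xyz(x, y, z):
    """Rotation built as Rz @ Ry @ Rx, angles in degrees."""
    r = math.radians
    return matmul(rz(r(z)), matmul(ry(r(y)), rx(r(x))))

def same(m1, m2, tol=1e-9):
    return all(abs(m1[i][j] - m2[i][j]) < tol
               for i in range(3) for j in range(3))

# With the middle (Y) axis at 90 degrees, the X and Z axes align: two
# different sets of Euler angles give the same orientation, so one
# degree of rotational freedom has been lost.
print(same(euler_xyz(30, 90, 0), euler_xyz(40, 90, 10)))  # True
```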
Clean Deformations
This is probably one of the most important aspects of a great character rig. A rig
needs to deform properly and believably in every single area of manipulation. If a
character is bending over, the stomach and chest all need to deform as a real human's
would.
Bad deformations stick out like a sore thumb, and look bad not only at animation time
but also when rendered. To fix this, you'll need to make sure you're painting the
weights properly and in the right areas.
When the character comes down from the modeling department it should have enough
resolution to deform well, so it's up to you to ensure the characters you receive
have the right edge flow and topology to support a great rig. Before ever passing
your rig on, you should have a strong testing phase to confirm that all the weights
have been properly painted for realistic deformation, so your rigs won't be kicked
back by an animator due to deformation issues.
Clear Control Curves
For an animator to move the individual joints on a character, they'll need access to
control curves, which make the selection process much easier. The placement of your
control curves should be clear on the rig; the animator should be able to tell exactly
what a curve will influence on that part of the character without having to select it
first.
The control curves should also be big enough for the animator to see and select
easily. Clear control curves ensure the animator spends less time figuring out what a
control does and more time actually animating.
GUI Picker
A GUI (Graphical User Interface) picker is a great feature to include on a rig with
hundreds of controls. While clear control curves are vital, the sheer number of
controls needed on a character rig can make the selection process extremely difficult
for the animator. A complex character rig can sometimes reach thousands of controls,
so having one place where all these controls can be displayed is very beneficial.
GUI pickers are especially good for facial controls, because often hundreds of
controls populate one small area of the face, making them extremely hard to find or
select directly in the viewport. Giving the animator a representation of where each
control lies on the character, and letting them select it in a separate window, can
greatly speed up the animator's workflow.
However, keep in mind that the GUI picker should be simple to use. Try not to have
numerous submenus for one character, each with different controls. A GUI picker should
speed up the control selection process, not slow it down.
Great Eye Controls
When it comes to character animation, the eyes are extremely important. The rig may be
used in subtle acting shots, where the eyes play a key role in selling the character's
mood and emotions. There should be more than just an open and close control for the
eyelids, because the animator will most likely need to do more with the eyes than
blink.
Having several eye shapers along the lids will help push the expression of the eyes
and sell the overall emotion the animator is trying to achieve. You can also
incorporate squash and stretch controls in the eyes and lids to help push the
fleshiness of a blink and to create more exaggerated animations.
IK and FK Options
In any character rig you create there should always be the option to switch between
IK and FK, in both the arms and the feet. A great character rig gives animators
options, because not all animators like using IK for every shot, and as a rigger you
can never predict the types of animations your rig will be used for.
While the feet will use IK most of the time, there are still times when an animator
may want FK to achieve the right look. Having IK and FK options for both the feet and
arms eliminates any chance your rig will be sent back because an animator prefers to
work in FK.
Versatility
You never know what type of animation your character rig will be used for, so you
should design your rig to be versatile. The rig should perform well for very subtle
acting shots, as well as realistic body mechanics or more cartoony animations. You can
incorporate stretchy limbs and squash and stretch controls where needed, giving the
animators the ability to exaggerate their animations when required. Not every shot
will be cartoony or subtle, but a character rig should be able to perform well and
achieve the look wanted no matter what type of shot it is.
Of course, depending on the project there will be times when your character rig is
used only for very realistic movement and never needs to achieve a cartoony style. So
it's important to have clear communication with the other departments about the
limitations that need to be set for a rig, and how much freedom you have.
Full Finger Controls
The finger controls are an extremely important aspect of a character rig that
sometimes gets overlooked. The controls on the fingers should do more than simply
open and close in one rotation axis. The hands are used to express emotions, and
getting the right posing is extremely important.
A great character rig allows the animator to rotate each individual finger joint in
the X, Y, and Z rotation axes. This lets the animator incorporate things like drag,
lead, and follow into the finger animations, and gives them a great deal of control
over the fingers so they can get the perfect pose.
You can even take this a step further and add squash and stretch controls for the
fingertips, to get the compression that occurs when a finger is pressed against
another object, like a table.
Global Scaling
Having the ability to scale a character rig is great for ensuring the character will
look right in any environment. The animator should be able to completely scale the
entire rig, including the controls.
This gives the animator as much control over a shot as possible. For example,
animators are often required to cheat certain things to the camera to achieve the
look they want, and scaling a character down can be perfect for establishing
perspective between two characters in a shot.
Breathing Controls
Breathing controls are a great feature to incorporate into any character rig: a
living, breathing character will obviously need to appear to breathe, which means it
needs to be animated to give that illusion. By including breathing controls you can
speed up the animation process.
You'll give the animator the ability to quickly and easily create a breathing
animation without having to move and scale the chest controls to simulate the effect.
A simple slider controlling the character's breathing means the animator can get a
realistic result in a fraction of the time by keying that one slider.
Pickwalking Ability
Pickwalking lets you use the arrow keys to step through different controls on the
character rig. For instance, if you select the wrist control you can pickwalk up to
the elbow, then the arm.
While this might not be as vital as some of the other features listed above, it's
still a great feature to include in your character rig. Even these very small
additions are extremely important to the animator and can help speed up their
workflow.
Taking the extra bit of time to support pickwalking will not only create a more
intuitive rig, it will also ensure the animators are able to spend more time
animating. Note that depending on your program, pickwalking may be called something
different.
Now that you know the key features every character rig should include, start
incorporating them into your next project. Whether it's a simple addition like
pickwalking or a more vital aspect like clean deformations, they all play a part in
making your character rig the best possible for any animation. If you want to learn
more about character rigging, check out Introduction to Rigging in Maya, and continue
learning with the hundreds of other 3D rigging tutorials.
Chances are the last blockbuster movie you saw used dynamics to create some of its
special effects. Without dynamics, most of the jaw-dropping 3D effects you see
wouldn't be possible. Working with a complex physics engine may seem daunting, and an
understanding of physics might seem paramount, but in actuality you do not need to be
a physics genius to create these types of simulations.
Dynamics are a type of animation simulation, but they differ in how they are
calculated by the computer. Typically dynamics are calculated frame to frame: the
position of an object in each frame is derived from its position in the previous
frame. This differs from keyframe animation, where the object's position is
determined by key values set at different frames. You can, however, bake your
simulations into regular keyframe animation, which lets you edit the simulation with
keyframes on the timeline.
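The frame-to-frame versus keyframe distinction can be sketched in a few lines of Python. This is a deliberately simplified model (one falling value, semi-implicit Euler stepping, linear key interpolation, hypothetical names), not how any particular engine works:

```python
FPS = 24
DT = 1.0 / FPS
GRAVITY = -9.8

# Dynamics: a falling object's height is stepped frame by frame;
# each frame's value depends only on the previous frame.
height, velocity = 10.0, 0.0
dynamic_frames = []
for frame in range(FPS):          # one second of simulation
    velocity += GRAVITY * DT
    height += velocity * DT
    dynamic_frames.append(height)

# Keyframing: the same motion would instead be authored as explicit
# keys, with the software interpolating any frame directly from the
# key values (here, keys "baked" from the simulation's endpoints).
keys = {0: 10.0, 24: dynamic_frames[-1]}

def keyframed_height(frame):
    t = frame / 24
    return keys[0] + (keys[24] - keys[0]) * t   # linear interpolation

# Both approaches agree at the keyed frames; in between, the dynamic
# version accelerates while the keyed version moves linearly.
print(round(dynamic_frames[-1], 2), round(keyframed_height(24), 2))
```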
If you've ever had the challenge of animating a collision, or any object that needs
to feel like it's obeying the laws of physics, you know it can be extremely
difficult. Something as simple as a die rolling across a table can take hours of
tweaking keyframes to look natural.
Dynamics can quickly and easily simulate this type of animation with what is called a
rigid body, whether it's a line of dominoes falling or a wrecking ball demolishing a
brick wall. Animating each crumbling brick or falling domino by hand would be a giant
task, but with rigid bodies it can be simulated by the computer, realistically, in a
fraction of the time. Rigid bodies are great for simulating animations that would
otherwise take far too long with traditional keyframing.
Most 3D applications have built-in effects that are great for quickly dropping in an
effect that produces very nice results. For instance, Softimage, Maya, and 3ds Max
have built-in fire effects that can be emitted from any polygon or NURBS object. Play
around with these effects in your 3D application to see how they work; each one has
different attributes that can be fine-tuned to adjust everything from fire strength
to emission and direction. There are also many other pre-built effects, like smoke
and lightning.
Another great feature of dynamics is the ability to simulate cloth. In Maya you can
quickly simulate a cloth material from any polygon object with nCloth. Whether you
want to create clothes that move and flow properly around your character, or a
tablecloth for a dining room, this can be achieved with nCloth. In 3ds Max and
Softimage, this feature is simply called Cloth. Keep in mind that a cloth simulation
can use a lot of computing power: even with a fast computer, cloth dynamics at a high
level of accuracy can take a very long time to process, so you may need to lower the
simulation to a reasonable level.
One of the most powerful features in a dynamics system is particles. Particles can be
used to replicate fire, explosions, smoke, water, fog and more. While the built-in
effects that come with most 3D applications are great, particles let you fine-tune
the effects and have complete control over your dynamic simulation. Particles can
also be used to create things like grass and fur.
Particles are controlled by an emitter, which acts as their source; the emitter has
many different attributes attached to it, like particle emission rate, velocity, and
many other settings that can be tweaked. Unlike the built-in effects, particles do
not produce the desired look right at the start, and will need to be adjusted to
create the effect you want.
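The emitter idea can be sketched as follows. All names here are illustrative, not any real package's API; the point is simply that an emitter's rate and velocity settings drive per-frame particle creation and movement:

```python
import random

# Minimal emitter sketch: spawn particles each frame, then advance
# every live particle by its own velocity.
class Emitter:
    def __init__(self, rate, speed, spread):
        self.rate = rate        # particles born per frame
        self.speed = speed      # base upward velocity
        self.spread = spread    # random sideways variation
        self.particles = []     # each particle is [x, y, vx, vy]

    def step(self, dt):
        for _ in range(self.rate):
            vx = random.uniform(-self.spread, self.spread)
            self.particles.append([0.0, 0.0, vx, self.speed])
        for p in self.particles:
            p[0] += p[2] * dt
            p[1] += p[3] * dt

emitter = Emitter(rate=5, speed=2.0, spread=0.5)
for _ in range(24):             # simulate one second at 24 fps
    emitter.step(1.0 / 24)
print(len(emitter.particles))   # 120 particles after 24 frames
```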
What to Expect When Working With Dynamics
When working with dynamics there is inevitably going to be a significant amount of
trial and error. The result you are looking for will not be achieved with the click
of a button. Even with pre-made effects like fire, you will most likely need to edit
the effect's attributes to get exactly what you are looking for. The typical workflow
with dynamics is tweak and test, so don't get discouraged when you aren't getting the
results you want right off the bat.
What you see in the viewport isn't always what you get. When played back, complex
dynamics can give undesirable results or be slow altogether, because the computer has
to calculate everything on the fly. To get a better representation of the simulation,
you can do a quick playblast or animation preview, depending on your software.
Dynamics are an amazing tool to have at your disposal, and can make many tasks much
easier for you. Whether you need to simulate an object ripping through geometry or a
wall being demolished, dynamics can achieve it quickly and believably. To learn more
about dynamics, check out these in-depth tutorials: Introduction to Dynamics in Maya,
Introduction to MassFX in 3ds Max, and Beginner's Guide to ICE in Softimage. Find
more training options with more 3D dynamics tutorials.
The most common way to achieve proper three-point lighting is by using three
different spot lights in the scene. Setting up each light the correct way will allow
the subject to be illuminated without deep shadows and be seen properly in the
camera view.
The first and most important light is the key light. As the name suggests, this light
is vital to establishing the overall lighting of the scene. It should have the
highest intensity of the three lights and should highlight the form and dimension of
the subject. The key light is typically set up to the right of the camera at a
45-degree angle.
Once the key light has been properly set up, the fill light should be created. The
fill light's purpose is to fill in the deep shadows that the key light inevitably
casts onto the subject. The fill light is usually set up opposite the key light.
The last spot light used is the rim light (sometimes referred to as the back light).
It illuminates the subject the least, because it is typically placed directly behind
the subject, facing the camera. The rim light's purpose is to add a very slight glow
to the back of the character. If you were to hide the key and fill lights, you would
see the subject darkened all around, except for a small outline of light around the
edges.
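As a rough illustration of these classic placements, the following sketch (a hypothetical helper, with the subject at the origin and positions on a flat circle around it) computes where each light would sit relative to the camera:

```python
import math

# Illustrative only, not a specific package's API: key light 45 degrees
# to the camera's right, fill light opposite it, rim light directly
# behind the subject.
def light_positions(camera_angle_deg=0.0, distance=5.0):
    cam = math.radians(camera_angle_deg)
    key = cam - math.radians(45)
    fill = cam + math.radians(45)
    rim = cam + math.radians(180)
    def pos(a):
        return (round(distance * math.sin(a), 2),
                round(distance * math.cos(a), 2))
    return {"key": pos(key), "fill": pos(fill), "rim": pos(rim)}

print(light_positions())
# {'key': (-3.54, 3.54), 'fill': (3.54, 3.54), 'rim': (0.0, -5.0)}
```

Adjust `camera_angle_deg` and `distance` per scene; as the text above notes, these angles are a starting point, not a rule.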
Key Light
When setting up the key light, the first step is getting it in the proper place. As
mentioned above, it is typically placed to the right of the camera at a 45-degree
angle. While this works most of the time, it really depends on your scene and what
you need to illuminate. Play with different angles and positions until you are happy
with the result. Remember, most of your light will come from the key light, so a
higher intensity may be needed.
Fill Light
The position of the fill light depends on where the key light casts its shadows. It
should be placed where it can illuminate those dark spots on the subject. It's
important to remember that the fill light should not be as bright as the key light.
A common mistake is setting the intensity much too high, as in the example image
above; this can cause the subject to get blown out. Instead, you want it just bright
enough to illuminate what the key light isn't reaching.
Rim Light
When positioning the rim light, make sure it doesn't really provide any frontal
illumination to the subject; it should just create a very slight outline of light.
While the rim light is typically placed directly behind the character, don't be
afraid to adjust the angle to achieve the look you want.
Depending on the subject you are illuminating, and whether there is a background, you
may want to adjust the settings in your 3D application so the spot lights in your
three-point lighting setup only illuminate the subject and not the background. The
reason is that the spot lights can have unappealing results on the background, as
shown in the image above.
Now that you are familiar with what goes into three-point lighting, a great next step
is to jump in and try it out on your next project. Keep in mind that three-point
lighting is not the be-all and end-all lighting setup for every project. However, it
is a great place to start and build from when you're first learning to light your
scenes.
Never use motion blur to hide hitches in your animation. The first reason is that it's
better to fix a problem than hide it; the second is that a recruiter will notice!
Before you use motion blur, you also need to make sure that your animation curves are
cleaned up and there are no problem areas, like gimbal lock. Messy curves can cause
problems in the motion blur calculations, resulting in glitchy motion blur when
rendered.
A crash mid-render can be frustrating, especially if your animation was just a few
frames from being completed, or if your render takes several hours to finish, which
it most often does.
Instead, render to an image sequence: JPG, PNG, or whichever format you prefer. If
your computer crashes, all the previously rendered frames are saved, and you can pick
up from where you left off. Just about any compositing software can import image
sequences, so all you have to do is bring them in, make sure they play back at the
frame rate you animated in (most likely 24 frames per second), and render to a video
format from there.
If you're ready to render your animations for your demo reel, incorporate some of
these tips into your workflow to speed up the process and produce results that
enhance your animations. Remember that your render shouldn't hide or distract from
the animation, but complement it.
Blending modes
Another way you can composite images together is through blending modes. There are a
lot of different blending modes to choose from, but boiled down to their core
functionality, they are simply mathematical operations that combine the color
information of two images. One of the most common is the Multiply blending mode.
Before I explain the awesomeness of this blending mode, you first need to understand
the color scale within the RGB channels. Each channel runs on a scale from 0 to 255,
with 0 being black and 255 being white.
So let's say I have a beautiful nature scene and an image of a red leaf. I want to
place the leaf on top of the nature scene and use the Multiply blending mode to unite
them. When you change the blending mode of the leaf to Multiply, every color value in
the nature scene below the leaf is multiplied by the corresponding color value of the
leaf, then divided by 255.
Following the above process, you'll essentially create an effect where the whitest
pixels in the top image become transparent, while everything else darkens the image
so that the pixels underneath show through. There are a lot of other blending modes
with similar operations that can be chosen depending on the look you want. In fact,
every union of images has a blending mode; if you don't set one, it defaults to
Normal.
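The multiply operation described above is easy to verify directly. A minimal sketch, using illustrative RGB values:

```python
# Multiply blend per RGB channel: result = (top * bottom) / 255.
def multiply_blend(top, bottom):
    return tuple((t * b) // 255 for t, b in zip(top, bottom))

red_leaf = (200, 30, 30)
forest = (120, 180, 90)
print(multiply_blend(red_leaf, forest))          # (94, 21, 10)

# Pure white in the top layer leaves the bottom pixel untouched
# (255 * b / 255 = b), which is why the whitest pixels effectively
# become transparent; everything else darkens the result.
print(multiply_blend((255, 255, 255), forest))   # (120, 180, 90)
```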
Node vs. Layer-based compositing
Up until this point, we've been looking at the compositing process from a
software-agnostic standpoint, which is best if you still haven't chosen which
software to learn. However, when you do choose, you'll have two main categories to
pick from: node-based software and layer-based software.
Layer-based software relies on information being placed in a stack: if you want an
image to go on top of another image, you place it on top of the stack. Node-based
software instead relies on input pipes to choose which image is on top: if you have
an A and a B pipe, you hook the B pipe up to the background and the A pipe up to the
foreground.
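The two paradigms can be caricatured in a few lines. This is a conceptual sketch only (real compositing packages are far richer): the same "A over B" result is expressed once as a layer stack and once as a merge node with input pipes.

```python
# "over" just keeps the foreground pixel where its alpha is 1,
# otherwise the background pixel shows through.
def over(fg, bg):
    return [f if a else b for (f, a), b in zip(fg, bg)]

bg = ["sky", "sky", "sky"]
fg = [("ship", 1), ("x", 0), ("ship", 1)]   # (pixel, alpha) pairs

# Layer-based: position in the stack decides what is on top; the
# stack is composited from the bottom up.
stack = [bg, fg]                    # fg sits above bg
result_layers = over(stack[1], stack[0])

# Node-based: an explicit merge node with A (foreground) and
# B (background) input pipes expresses the same relationship.
merge = {"A": fg, "B": bg}
result_nodes = over(merge["A"], merge["B"])

print(result_layers)                 # ['ship', 'sky', 'ship']
print(result_layers == result_nodes) # True
```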
Not only is Star Wars a film that still impresses the VFX industry today, but '70s
films like The Hindenburg, The Poseidon Adventure and the horror classic The Exorcist
brought together many different techniques, like matte painting, a technique still
heavily used in the VFX industry. The horrifying effects in The Exorcist, like the
360-degree head rotation and many other grotesque effects, made the '70s a very
impressive decade for visual effects.
80s
The '80s saw a massive leap forward in visual effects with movies like Blade Runner,
Raiders of the Lost Ark and E.T. the Extra-Terrestrial. Blade Runner featured a
beautiful futuristic city with flying cars, floating advertisements and more.
Everything you'd expect to see in a futuristic city, right? The movie took place in
2019, so we're about four years away; hopefully we'll get our flying cars soon!
Ray Harryhausen also showed off more of his considerable skill with Clash of the
Titans, which features some amazing stop-motion work.
Beyond these great movies of the '80s, there were even larger advancements that led
to what visual effects are today. The '80s introduced the first computer-generated
images in a movie: Star Trek II: The Wrath of Khan was the first film to feature a
completely computer-generated scene. Right after it, Tron took this a step further
and featured extensive sequences created entirely in the computer.
After that, more and more movies in the '80s featured various CGI elements, like The
Last Starfighter with its detailed 3D models, where before, this type of spaceship
was created with miniature models, as in Star Wars. The first ever 3D animated short
film was released in 1984 under the title The Adventures of André and Wally B. If
you're familiar with the history of Pixar, you may recognize this as one of the
original pieces created by John Lasseter and his team.
90s
After the introduction of CG in the '80s, more and more films began using the
technique. If the '80s were the spawn of CG in movies, the '90s were the explosion.
You can probably think of a few game-changing feature films that many VFX artists
cite as the reason they got into the industry; Jurassic Park is one of them.
Spielberg's team of experts combined CG with animatronics to create several
breathtaking sequences that gave a new look into what is possible with CG.
There were also many other advancements with CG in the '90s, including the first use
of motion capture technology in film, for a very short X-ray sequence in Total
Recall. Terminator 2: Judgment Day featured many distinctive visual effects shots, as
the liquid-metal terminator could morph into any character. Shots like the terminator
shattering into pieces and those pieces reassembling were just a few of the amazing
VFX sequences in the film.
Of course, likely the biggest advancement in terms of CG was the first feature film
created entirely in CG: Toy Story. This led to the success of Pixar and the rise in
popularity of fully 3D animated films. Not only that, but the technology used to
create these films also helped push the quality of the CG elements integrated into
live-action features.
The Matrix achieved numerous innovative visual effects that make up a large portion
of the film; the iconic bullet-dodging scenes used various techniques to achieve
their effect.
2000s
As we inch closer to where visual effects are today, the quality keeps increasing.
Films like The Lord of the Rings took motion capture technology to a totally new
level with the creature Gollum. As one of the first films to heavily use motion
capture, it was able to infuse an actor's performance onto an entirely CG creature.
This led many other movies to perfect the technique even further, like The Polar
Express.
Pirates of the Caribbean: Dead Man's Chest also pushed motion capture with its
award-winning visual effects on Davy Jones, using facial motion capture technology to
capture the actor's performance and realistic movements. The technology was pushed
yet again in James Cameron's Avatar, with advancements in facial and body motion
capture.
2010-Present
The world of effects in film has come a long way, from practical special effects to
the now-dominant realm of visual effects. In the past few years we've seen movies
constantly pushing the boundaries, striving for effects so realistic and believable
that they hold up next to real actors without the audience knowing the difference.
To get a great glimpse of how far we've come in just the past decade, compare Gollum
in The Lord of the Rings: The Two Towers to Gollum in The Hobbit: An Unexpected
Journey. As the technology and tools used to create these out-of-this-world
characters advance, so does the quality of what ends up on screen.
The recent release of Dawn of the Planet of the Apes features extremely realistic
apes, with many advancements in motion capture and in the visual aesthetics of the
apes, like the rendering of the fur. Its predecessor, Rise of the Planet of the Apes,
was one of the first films to use motion capture on location rather than in a
purpose-built motion capture studio.
More films are being shot largely on green-screen stages, leaving the rest of the
frame up to the VFX artists. VFX is as much a part of blockbusters like The Avengers
or Pacific Rim as the actors themselves. While VFX is often seen as icing on the cake
of a film, it's becoming more of a centerpiece. If you want to share some of the
films that inspire you as an artist, whether through practical effects or visual
effects, post them in the comments below!
DEMO REELS
Aspiring Modelers
Creating your own concepts is great, but your focus is building those concepts in 3D,
not designing them, so feel free to use designs by professional artists. If the
design is abstract, be sure to include the concept art. Make sure to include
wireframes to show your model's topology.
Even though you won't be animating your characters, take the time to learn how your
character would move and behave; this will help you as you build your character.
Aspiring Animators
Rather than quick shots of random animations, try creating a couple of vignettes that
tell a story.
Take the time to get to know your character well. Learn your character's likes and
dislikes and how they would react to the situation, then animate accordingly.
Aspiring Compositors
Not all compositing has to be visible right away. In fact, the best composites are
the ones that don't stand out.
Include before and after shots so it's easy to tell what you've changed.
Softimage
Softimage is another of Autodesk's 3D applications; sadly, however, the 2015 release
will be the program's last. After that there will be no updates to the software as
Autodesk phases the program out. It's a sad time in the 3D community to see the
beloved program go, but there are still plenty of 3D applications to choose from. If
you still want it, you can download the 30-day trial or the three-year free student
license.
Softimage (30-day trial) NOTE: Softimage 2015 will be the last release, per an
Autodesk announcement.
CINEMA 4D
There are several different versions of CINEMA 4D, each focusing on a specific area
of the production pipeline. CINEMA 4D Prime costs $995 and is aimed at graphic
designers. CINEMA 4D Broadcast costs $1,695 and focuses on the motion graphics side
of things. CINEMA 4D Studio combines every package into one, costing $3,695.
MODO
MODO is one of the more inexpensive 3D applications out there, costing $1,495. Even
though it costs less, it's still a program that packs a punch. You can also download
a free 30-day trial.
Houdini
Houdini offers a few different versions of the program, each focusing on specific
areas of a pipeline. Houdini costs $1,995 and focuses on animation, modeling and
lighting. Houdini FX focuses specifically on the visual effects side of things, like
particle simulations, fluids and dynamics; it costs $4,495. You can also get Houdini
Indie, which costs only $199 and is aimed at animators and game creators, though it
comes with a limited commercial license.
Blender
When it comes to pricing, Blender takes the cake. Blender is an open-source 3D
application, meaning it doesn't cost anything. While it's free, that doesn't mean
it's not powerful. It's very popular among hobbyists and indie developers, and it has
been growing in popularity in recent years as updates and improvements keep coming.
Blender (Download Page)
In terms of pricing, all the programs are very similar, with the exception of
Blender, of course. Maya and 3ds Max offer the most appealing pricing packages with
the three-year student version, which gives you access to both complete applications
at no cost.
In essence, all of the tools are there in each application. Some may have more
powerful features in certain areas, but that is usually something you can learn in
any software down the line. Think of it as a tool belt: all the tools are in each
application's belt, just organized differently. Some advanced features, like the
MoGraph toolset in CINEMA 4D or quick dynamic simulations in Houdini, may be
software-specific, but learning the main skills and tools will help you take those on
when you're ready.
Spend a week in each software you're contemplating learning and find which one you
are most comfortable working with. You can even start watching free beginner training
with a Digital-Tutors demo account, or jump right in with a full membership and watch
entire introductory courses.
to get re-familiarized when switching software it just takes time to get used
to the new setup.
When learning a piece of software you'll essentially be learning new skills
too. These skills transfer from application to application, so make sure that
the overall picture is the focus. Low-poly modeling for games is low-poly
modeling no matter which application you are using to get there. Learning the
essential skills, and advanced skills along the way, will help you conquer
your first application and then any other that may stand in your way.
Game Art
From your research you may hear this area is dominated by 3ds Max (and
previously Softimage), but Maya is right up there with it. With the recent
release of Maya LT, which focuses solely on game development, Maya is slowly
trying to bridge the gap between 3ds Max and Maya as a game art application.
That's why it's so important to focus on those skills you'll be able to
transfer to any application.
We also love working with Softimage, but its popularity hasn't taken off quite
as much, even though it does have some unique, powerful tools and a user-
friendly UI. While 3ds Max and Maya are known as great game art
applications, MODO, CINEMA 4D and Blender are also popular for game
asset creation. Blender in particular, since it's a free 3D application, is
popular among indie developers. If you want to learn more about the pros and
cons of each application in terms of game art, read the 3ds Max, Maya LT or
Blender: Which 3D Software Should I Choose for Asset Creation? article.
3D Modeling
If you want to get into game art, you'll likely be modeling, creating assets,
characters, etc. However, 3D modeling goes much deeper than that, and
modeling for movies can be different than modeling for games. While the
techniques are the same, the guidelines can be different. Some of the most
popular applications for 3D modeling are 3ds Max and Maya. You can
probably start to see the trend here: 3ds Max and Maya really are the two
industry-standard applications. Both feature some really great 3D modeling
tools that will help you get the job done.
MODO, CINEMA 4D and Blender have also seen a rise in popularity in terms
of 3D modeling. CINEMA 4D boasts a fairly easy-to-grasp workflow that many
new artists enjoy for its ease of use, and MODO also has powerful 3D modeling
tools. An important thing to keep in mind is that while each 3D application may
have slightly different tools, for the most part they are all similar. You'll
be able to create the exact same 3D model in application A as you could in
application B; only the techniques and workflows you use to get there will
differ depending on the application. That's why it's vital that you test the
waters first and find out which 3D program's workflow you find most
comfortable.
3D Animation
It's hard to put any limitations on software for this one. If you look to
studios and schools for what they're using, and our most viewed animation
training, it goes Maya, 3ds Max, CINEMA 4D, Houdini, MODO, and Softimage. Even
prompting our animation tutors and modeling tutors with the question, they too
think it all comes down to your skills and applying them to whatever software
you need to use to make your scenes, characters and animations happen.
When it comes to animation, the 3D application actually plays a very small
part. Since creating animation mainly consists of creating keyframes, as long
as the application can do that, that's all that really matters. If you can
animate in program A you can very easily animate in program B without any real
learning curve. If you want to get into animation, it's more important that
you focus on the principles of great animation than on the application you're
going to use to create it.
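To make that point concrete, here is a minimal, purely illustrative sketch (not taken from any particular package) of the one thing every application above can do: store keyframes as (frame, value) pairs and interpolate a value between them. Real software uses spline curves with editable tangents; plain linear interpolation is enough to show the idea, and all names here are hypothetical.

```python
def evaluate(keyframes, frame):
    """Linearly interpolate an animated value at a given frame.

    keyframes: list of (frame, value) pairs, sorted by frame number.
    """
    # Hold the first/last value outside the keyed range.
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the two keys surrounding the frame and blend between them.
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# A ball's height keyed at frames 0, 12 and 24:
keys = [(0, 0.0), (12, 5.0), (24, 0.0)]
print(evaluate(keys, 6))   # halfway between the first two keys
```

Whether the keys live in Maya's Graph Editor or Blender's Dope Sheet, this is the underlying model, which is why animation skills carry over so directly.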
Rigging
Maya is really known as the go-to application for rigging because of its
powerful scripting language, MEL, which allows artists to create complex rigs
and build custom tools. 3ds Max also has some great rigging tools, like CAT,
which can greatly speed up the process of rigging and is great for game
projects. However, all of the 3D applications listed here have the ability to
create complex rigs. Again, it really comes down to personal preference and
which program you find more comfortable to work with.
Broadcast
CINEMA 4D is a growing name in this area and has seen an even bigger
spike with its new integration with After Effects using CINEWARE. Maya and
3ds Max still also play nicely with After Effects, including state sets in 3ds
Max. CINEMA 4D's affordability and ease of use on a Mac have also helped it
grow in popularity. Depending on the intricacy of the project, though, even
something like Houdini could be used for its VFX tools. Check out this
in-depth post breaking down some of the differences between 3ds Max and
CINEMA 4D for motion graphics. CINEMA 4D Broadcast also gives you
the option of a cheaper version of the program if the only thing you
want to focus on is motion graphics and design.
Visual Effects
Most of the applications we've been discussing, possibly with a mixture of
other secondary applications, are going to be able to create stunning visual
effects. You'll hear about some powerful tools and dynamics systems, such as
Softimage's ICE, or Houdini as a whole. With The Foundry's purchase of
Luxology, visual effects workflows between NUKE and MODO should continue to
evolve too, though the workflow between NUKE and other applications is also
still very popular.
When it comes to visual effects it usually comes down to a mixture of different
applications to create the finished work. For instance, Houdini is known as
the go-to app for simulations, NUKE for compositing, and Maya, CINEMA 4D,
MODO, etc. for rendering.
Product Visualization or Design
Most applications discussed will help you create product and architectural
visualizations, anything from tiny handheld devices, to vehicles, to large
building developments. For visualizations, learning rendering applications
like V-Ray, Arnold Renderer and Maxwell Render will also be helpful. When
it's time to create specific product designs, though, CAD software like
Inventor, Rhino or SolidWorks might be more fitting. You'll also want to
spend time focusing on things outside the software, like the usability of the
product and the overall user experience.
All 3D applications mentioned have the tools youll need to create product
visualizations since they are complete 3D packages with modeling, animation,
texturing and rendering tools.
Investigate and Ask
If you have a dream job or even think you know your first stepping stone into
the creative industry, there is absolutely nothing holding you back from
hunting down more specifics. Look to things like job listings for the software
they mention, Google "what does X Studio use?" or even "how did X Studio
create that shot?", and then there is always the friendly email to the company
that can work too.
Knowing what a company is using can help you get that foot in the door, but
again, don't limit yourself to one application. There are many studios that use
their own proprietary software. For instance, Pixar uses Presto for animation,
and DreamWorks uses Premo for all their animation. That's why the most
important thing is that you master the skills you'll need, and not necessarily
the 3D application.
Hopefully your head is now in the right place for making the software decision,
whether it's Maya, 3ds Max, Softimage, CINEMA 4D, MODO, Houdini,
etc. Don't forget that it's not a forever commitment; should you want to
change, not all will be lost. Focusing on high-level skills and techniques that
transfer to any application will set you up for a successful career; that, and
hard work, patience and perseverance.
See how you can jump ahead from the start with introductory training on
today's most popular creative software. Learn and grow skills with the same
training used in top schools, studios and companies all around the world.
For more info on comparing and contrasting 3ds Max and Maya, be sure to
check out a newer article about the differences in these popular applications.
Whether you are creating a character rig for yourself or the animators within
your pipeline, you need to take into account some of the important things that
go into creating a great character rig. Let's go over five tips that can help
you create the best rig possible.
You can also create an IK and FK spine rig or a blend of both together to get more
control over the chest movements. An IK and FK spine is a great option for creating
very stylized cartoony animations.
For example, you might add a squash and stretch attribute within the head
control. This still gives the animators a good level of flexibility but
without too many control curves to keep track of.
Additionally, you should be cleaning up the attributes for your controls. If an
animator doesn't need to rotate or scale a control, or if doing so could have a
negative effect on the rig, then lock it and hide it. That way the animator
won't have to worry about it or accidentally do something they didn't intend.
Add Special Controls as Needed for Your Rig
Depending on what your rig is going to be used for, try incorporating those
extra controls that can push your rig to the next level. If you are familiar
with animation you probably know what is beneficial to add to a rig. Putting
in a squash and stretch control for the head or the hips can help with those
shots that need a bit more cartoony flair. Incorporating a shrink and stretch
control for the legs can also help when creating a more stylized animation,
and is great for reducing knee pops in walk or run cycles.
Go one step further and create a control for the chest that can quickly
simulate a character breathing by simply tying an attribute into the existing
chest control. If you are not sure what kind of extra things you can add to
your rigs, ask some animators. They'll be sure to throw some ideas at you.
12 PRINCIPLES OF ANIMATION
Explore the 12 principles and start mastering them in your own work to create captivating
animations. If you ever get stumped on a principle or your animation needs some help, use
this as a reference.
Squash and stretch can be implemented in many different areas of animation,
like the eyes during a blink, or when someone gets surprised or scared and
their face squashes down and then stretches. Squash and stretch is a great
principle to utilize to exaggerate animations and add more appeal to a
movement.
The easiest way to understand how squash and stretch works is to look at a bouncing ball.
As the ball starts to fall and picks up speed, the ball will stretch out just before impact, and
as the ball impacts the ground, it squashes, and as it takes off again it stretches.
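One common way this is implemented on a bouncing ball is to scale it along its direction of travel based on speed while keeping its volume constant. The sketch below is purely illustrative; the function names and numbers are made up for the example, not taken from any particular package.

```python
def stretch_scale(speed, amount=0.05, max_stretch=1.5):
    """Stretch factor along the travel direction, growing with speed
    but clamped so the ball never deforms absurdly."""
    return min(1.0 + amount * speed, max_stretch)

def ball_scale(speed, squashing=False):
    """Return (along, across) scale factors for the ball.

    squashing=True represents the moment of ground contact: the ball
    flattens instead of stretching, again preserving volume.
    """
    s = stretch_scale(speed)
    if squashing:
        s = 1.0 / s          # invert: flatten along the travel direction
    return s, 1.0 / s        # keep along * across == 1 (volume preserved)

# Fast fall just before impact: stretched tall, thinner across.
print(ball_scale(8.0))
# On impact at the same speed: squashed flat, wider across.
print(ball_scale(8.0, squashing=True))
```

Keeping the product of the scale factors at 1 is what makes the deformation read as a soft, flexible object rather than one that is growing or shrinking.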
#3 Anticipation
Anticipation is used in animation to set the audience up for an action that is
about to happen. Not only is anticipation needed to prepare the audience, but
it's also required to sell believable movements.
Overlapping action is very similar in that it means different parts of the body
will move at different times. For example, if a character raises their arm up
to wave, the shoulder will move first, then the arm, and the elbow and hand may
lag behind a few frames. You may have also heard this referred to as "drag" or
"lead and follow." You can even see an example of overlapping action in
something like a blade of grass: the base moves first, and the rest of the
grass follows behind at different rates, giving you that waving motion.
In real life everything moves at different speeds and at different moments in
time, and that is why follow through and overlapping action are so important
for capturing realistic and fluid movement.
Follow through is the idea that individual body parts will continue moving
after the character has come to a stop. For example, as a character comes to a
stop from a walk, every part of the body won't stop at the exact same time;
instead, the arms may continue forward before settling. This could also be
articles of clothing that continue to move as the character comes to a stop.
#6 Arcs
Everything in real life typically moves in some type of arcing motion, and in
animation you should adhere to this principle of arcs to ensure your animation
is smooth and moves in a realistic way.
The only time something would move in a perfectly straight line is if you're
trying to animate a robot, because it's unnatural for people to move in
straight lines.
For example, if a character is turning their head, they will dip their head down during the
turn creating an arcing motion. You also want to ensure the more subtle things move in
arcs as well, for example the tips of the toes should move in rounded arcing motions as the
character walks.
#7 Exaggeration
#8 Solid Drawing
#9 Appeal
This principle can really come down to adding more appeal in many different
areas of your animation, like appeal in posing. However, the most obvious
example is appeal in the character design: you want to have a character that
the audience can connect or relate to. A complicated or confusing character
design can lack appeal.
Secondary action refers to creating actions that emphasize or support the main action of
the animation; it can breathe more life into an animation and create a more convincing
performance.
It's important to remember that the secondary action should typically be
something subtle that doesn't detract from the main action, and it can be
thought of as almost a subconscious action. For example, with a character
talking to another character in a waiting room, the two of them talking would
be the main action, and if one character began tapping their fingers nervously,
that would be the secondary action.
A character walking down the street while whistling could be another example
of secondary action. Or picture a person leaning up against a wall talking to
some people at school: the main action is the character leaning against the
wall and talking, and putting in an action of them crossing their arms would
be the secondary action.
#12 Staging
Staging is how you go about setting up your scene, from the placement of the
characters to the background and foreground elements and how the camera angle
is set up. The purpose of staging is to make the message of the animation
unmistakably clear to the viewer. This could mean ensuring the camera is set
up in a way that communicates the character's expression clearly, or setting
up two different characters so that both of them are easily viewed from the
chosen angle.
Now that you know the meaning and purpose behind each principle, be sure to
implement these 12 key principles in all your animations, and you'll be sure
to create stunning work!
Explore animation tutorials on Digital-Tutors
You want to keep the focus on the purpose of the shot and what you want to
communicate so the audience doesn't become confused.