
April 2009, Volume 32, Number 4
www.cgw.com
Innovations in visual computing for the global DCC community
Features

COVER STORY

10 Super Sunday's Best
Advertisers made the most out of their 30 seconds of airtime during this year's Super Bowl, using digital techniques to create some incredible spots. The most memorable has to be the SoBe "Lizard Lake" spot, which was presented in stereo.
By Debra Kaufman

24 Getting Unreal
A former game developer is using the tricks of his former trade, the Unreal Engine 3, to streamline the content creation process for his new project, Chadam, an online series he hopes will find its way to television.
By John Gaudiosi

30 Without Bounds
What do you get when you mix art with augmented reality? In all likelihood, one of digital artist Maurice Benayoun's unique installations.
By Barbara Robertson

38 Picture Perfect
Automotive photographers begin to embrace the advantages that CG can offer them as they ply their craft.
By Bob Cramblitt

Departments
Editor's Note
Stereo Vision
Stereo is growing up, and growing fast. 3D is invading movie theaters at an incredible pace, and it is now taking on the home-entertainment arena, infiltrating video games and television.

4 Spotlight
Products: Autodesk's 3ds Max Design 2010. Image Metrics' new service levels.

6 Viewpoint
Making mattes is still a difficult, time-consuming process, but there are ways to make matte magic work to your advantage.

46 Back Products
Recent software and hardware releases.

CGW In 3D
Put on your 3D glasses and check out the following pages, which are presented in stereo:
Editorial: The cover and "Super Sunday's Best" (cover, pages 10-23)
Ads: iZ3D, American Paper Optics (inside cover, page 13)

42 Gone to the Dogs
Grad students at Ohio State University's ACCAD sink their teeth into a unique project: Using complex 3D tools, they re-created humorist James Thurber's line drawings of dogs and unleashed them during a musical suite.
By Karen Moltenbrey

SEE IT IN
John DesJardin discusses The Watchmen.
VFX for television.
Outfitting your workstation.
Canadian post studios push their talent, not location.

ON THE COVER
Leaping Lizards! CGW joins the 3D trend with its first stereoscopic issue, starting with the cover, which features an image from the 3D Super Bowl commercial "Lizard Lake." We would like to extend a big thanks to Digital Domain, which not only was responsible for the cutting-edge commercial (pg. 10), but whose artists spent countless hours perfecting this 3D image to help us in our endeavor.

Editor's Note

Stereo Vision

If you took a good look at the cover of Computer Graphics World, you may have rubbed your eyes a bit or blinked a few times to make sure your vision was in check. Your eyes are fine. The reason the image appears somewhat distorted is that it is rendered for 3D viewing. That's right: CGW has printed its very first stereoscopic cover, and what better time to do so than now, in time for NAB.
During the past year, stereo has taken on a life of its own. At last year's NAB, stereo had become far more than a buzzword; it was the hot topic. The Content Theater featured a day of stereoscopic topics and offered a look at two 3D projects. This year, the Content Theater has expanded its stereo focus. Last summer, the SIGGRAPH Computer Animation Festival devoted two days to stereoscopic research, applications, and entertainment, curated by Rob Engle of Sony Pictures Imageworks.
As CGW contributing editor Barbara Robertson pointed out in her feature article "Rethinking Moviemaking" (November 2008), the proponents of stereo films believe that the move to stereoscopic 3D is just as profound as the introduction of sound and color. And indeed, the revolution is well on its way.
A few years ago, stereo was mainly used in theme-park attractions, which used special equipment to project stereo images. Movie theaters were another story. But once companies like RealD got onboard, stereo took off quickly, and in a big way. As recently as 2007, stereo 3D was still a novelty. Today, we can expect several 3D movies, including Pixar's Up, Fox/Blue Sky's Ice Age: Dawn of the Dinosaurs, and Imageworks' G-Force, and that's before the fall season begins. By the end of the year, the number of 3D screens is expected to double, just in time for the highly anticipated 3D live-action Avatar. In fact, starting with Monsters vs. Aliens, released at the end of last month on an estimated 2000 3D screens nationwide, all DreamWorks Animation's films will be authored in stereo 3D.
Stereo 3D is also moving into other media with the same swiftness. It has already captured the concert crowd. And this past February, television viewers who tuned in to watch the Super Bowl were treated to a stereo 3D extravaganza. Prior to the game, viewers were urged to pick up their stereo glasses at local retailers where Pepsi and PepsiCo's SoBe are sold. Just a few hours before kickoff, I ran to the local grocery store to get several pairs. Then, at halftime, we were told to don the glasses. First, we were treated to a Monsters vs. Aliens trailer in 3D, where the colorful creatures really popped. The stereo continued with the 60-second commercial "Lizard Lake," featuring the very cool SoBe lizards from last year, only this year they were leaping off the screen. In fact, this month's cover image is from that commercial. (For more about the making of the spot, see "Super Sunday's Best," pg. 10.) The stereo event culminated the following evening with a 3D viewing of the television series Chuck.
With stereo invading theaters at such an aggressive pace, will it soon be knocking at your own door? The answer is yes. Three-dimensional gaming is already taking hold, as vendors such as iZ3D offer affordable monitors that work with passive polarized glasses to turn favorite interactive 2D titles into 3D adventures. Among others, Nvidia is getting into the home stereo game with an HD 3D solution that works with a number of monitors and projectors (see "Game-Changing Technology," March 2009).
So are films, games, and television shows more exciting in stereo? Yes. Filmmakers and others are recognizing that stereo is no longer a gag; it's a tool to enhance the storytelling experience, and they are using stereo for just that purpose. So sit back and enjoy your trips to the next dimension.

Karen Moltenbrey
Chief Editor
karen@cgw.com

The Magazine for Digital Content Professionals

EDITORIAL
Karen Moltenbrey, Chief Editor
karen@cgw.com  (603) 432-7568
36 East Nashua Road, Windham, NH 03087

CONTRIBUTING EDITORS
Courtney Howard, Jenny Donelan, Audrey Doyle, George Maestri, Kathleen Maher, Martin McEachern, Barbara Robertson

William R. Rittwage
Publisher, President and CEO, COP Communications

SALES
Lisa Black, National Sales Manager
Classifieds, Education, Recruitment
lisab@cgw.com  (877) CGW-POST [249-7678]
Fax: (214) 260-1127

Kelly Ryan, Classifieds and Reprints
kryan@copcomm.com  (818) 291-1155

Editorial Office / LA Sales Office:
620 West Elk Avenue, Glendale, CA 91204
(800) 280-6446

PRODUCTION
Keith Knopf, Production Director, Knopf Bay Productions
keith@copcomm.com  (818) 291-1158

Michael Viggiano, Art Director
mviggiano@copcomm.com

Chris Salcido, Account Representative
csalcido@copprints.com  (818) 291-1144

Computer Graphics World Magazine is published by Computer Graphics World, a COP Communications company. Computer Graphics World does not verify any claims or other information appearing in any of the advertisements contained in the publication, and cannot take any responsibility for any losses or other damages incurred by readers in reliance on such content. Computer Graphics World cannot be held responsible for the safekeeping or return of unsolicited articles, manuscripts, photographs, illustrations, or other materials.
Address all subscription correspondence to: Computer Graphics World, 620 West Elk Ave., Glendale, CA 91204. Subscriptions are available free to qualified individuals within the United States. Non-qualified subscription rates: USA, $72 for 1 year, $98 for 2 years; Canadian subscriptions, $98 for 1 year and $136 for 2 years; all other countries, $150 for 1 year and $208 for 2 years. Digital subscriptions are available for $27 per year. Subscribers can also contact customer service by calling (800) 280-6446, opt 2 (publishing), opt 1 (subscriptions), or by sending an email to csr@cgw.com. Changes of address can be made online at http://www.omeda.com/cgw/ by clicking on customer service assistance.
Postmaster: Send address changes to Computer Graphics World, P.O. Box 3551, Northbrook, IL 60065-3551. Please send customer service inquiries to 620 W. Elk Ave., Glendale, CA 91204.


Autodesk Announces 3ds Max Design 2010 Software

Autodesk is readying 3ds Max Design 2010, the latest version of its modeling, animation, and rendering software for design professionals. The 2010 release includes new features and enhancements that accelerate design iteration and help achieve more sustainable designs.
3ds Max Design lets architects, designers, engineers, and visualization specialists explore, validate, and communicate their creative ideas, from initial concept models to final, cinema-quality presentations. The product offers digital continuity with Autodesk's AutoCAD, Revit, and Inventor software, enabling design data to be reused for visualization.
At least 350 new features have been added, including render-like effects in the viewport display, for near-photoreal quality to speed decision-making. A new Graphite modeling and texturing system, with approximately 100 new creative tools, helps designers explore and rapidly iterate their ideas. A real-time lighting analysis solution for validating the effect of direct lighting, along with the newly certified Exposure technology, helps users create more sustainable designs. And an extensive library of particle effects and flickerless rendering improvements help users communicate design intent using the latest developments in game and film technology.
3ds Max Design 2010, priced at $3495, is expected to be released later this spring, while 3ds Max 2010 software for entertainment professionals, which is priced at $895, is available now.

PRODUCT: DESIGN
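As background to what a direct-lighting analysis computes, the governing photometric relationship is the inverse-square and cosine law. The snippet below is a generic Python illustration of that law only, not Autodesk's Exposure technology, and the example numbers are made up.

import math

def illuminance_lux(intensity_cd, distance_m, incidence_deg):
    # E = I * cos(theta) / d^2 for a point source of intensity I (candela)
    # at distance d (meters), striking the surface at angle theta from its normal.
    return intensity_cd * math.cos(math.radians(incidence_deg)) / distance_m ** 2

# A 1000 cd source 2 m away, hitting the surface at 30 degrees:
# illuminance_lux(1000, 2.0, 30.0) -> roughly 216 lux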

Image Metrics Offers New Service Levels

Image Metrics has introduced four new facial animation service levels for film and game customers.
Due to throughput innovations in Image Metrics' proprietary technology, as well as improvements to its in-house facial animation processes, the new service levels now offer even better results at more competitive prices. "In the past, CG storytellers have been unable to get the performances they wanted from their CG characters because animating the face has been cost-prohibitive. In games alone, prices for adequate facial animation average roughly $5000 per minute," points out Michael Starkenburg, Image Metrics CEO.
The four new service levels range from facial animation for background videogame characters to photorealistic digital doubles for live-action movies. All the offerings use Image Metrics' proprietary facial animation technology to analyze an actor's facial performance and transfer it, with all of its subtleties and nuances, to a 3D facial rig.
The new Value service, at about half the cost of standard facial animation, automatically generates large volumes of high-quality facial animation for secondary characters in games or for previsualization purposes. The Pro service, which offers more subtle facial movements on advanced game rigs, is geared for in-game scenes and cut-scenes. The Premium level, for high-quality faces, provides greater creative control with pore-level analysis of facial movement and more fine-tuning of the performance. Finally, the Elite offering is for believable digital faces, and utilizes all the features of the technology.
All the service levels include a final output of animation curves that work with the leading 3D software applications. Pricing varies according to project.

PRODUCT: FACIAL ANIMATION
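Image Metrics' pipeline is proprietary, but the idea of delivering "animation curves" can be illustrated generically. The Python sketch below is my own simplification, not the company's API: it turns per-frame facial-control weights into sparse keyframed curves, dropping keys that barely differ from the last one kept.

def weights_to_curves(frames, fps=30.0, tolerance=1e-3):
    # frames: a list of dicts, one per frame, mapping control name -> weight
    # returns: control name -> list of (time_in_seconds, value) keyframes
    curves = {}
    for index, frame in enumerate(frames):
        time = index / fps
        for control, value in frame.items():
            keys = curves.setdefault(control, [])
            if not keys or abs(keys[-1][1] - value) > tolerance:
                keys.append((time, value))
    return curves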


PostProduction

Making Mattes
By Gergely Vass

When integrating CG and real elements for visual effects, the captured images often need to be segmented into foreground and background regions in order to achieve realistic occlusions. This kind of separation of picture areas is usually defined by a matte, or mask image, where brightness corresponds to the transparency or the opaqueness of the given layer.
If computer-generated objects are in the foreground, we simply need to let our render engine produce the masks, which are often stored as the fourth, alpha, channel. However, it is a bit more complicated to composite something behind the real elements.
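As a concrete illustration of that relationship, here is a minimal sketch of the standard "over" operation in Python with NumPy. It is my own example, not code from the article, and the array names and shapes are assumptions.

import numpy as np

def composite_over(fg, bg, matte):
    # fg, bg: float images of shape (H, W, 3) with values in [0, 1]
    # matte:  float array of shape (H, W); 1 = fully opaque foreground,
    #         0 = fully transparent, so the background shows through
    alpha = matte[..., np.newaxis]          # broadcast over the color channels
    return alpha * fg + (1.0 - alpha) * bg  # matte brightness drives the blend

Compositing a CG element behind a live-action plate is the same operation with the roles swapped: the plate becomes the foreground, and its matte must describe which real objects sit in front.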
The creation of a proper matte image for photographs, video, or film is something artists and scientists have been struggling with since the birth of photography. In fact, compositing, the combination of visual elements from different sources, is not a digital concept. Long before digital image processing, or even early video technology, filmmakers utilized the double-exposure (or multiple-exposure) technique. By exposing the film twice, images could be overlaid on top of each other. If only one part of the frame was recorded for the first pass, then by using a holdout matte, the film could be rewound and the blank, unexposed area could be recorded with an inverse mask. If the separate elements were lit properly and the camera perspective matched, the resulting image would appear perfectly integrated.
That kind of compositing did not need to be created in-camera, since filmmakers could use optical printers during the postproduction stage. These machines could project film frames onto another filmstrip while applying different filters, pan and scan, or masks. Through the use of the multiple-exposure technique, not only could live-action frames be blended, but also miniature and painted scenes.

Using the rotoscope, patented in 1917 by Max Fleischer, animators could trace the contours of real actors frame by frame.

A former Maya TD and instructor, Gergely Vass eventually moved to the Image Science Team of Autodesk Media and Entertainment. Currently, he is developing advanced postproduction tools for Colorfront in Hungary, one of Europe's leading DI and post facilities. Vass can be reached at gergely@colorfront.com.

Matte Magic
Painted backgrounds, or matte paintings, were, and still are, used extensively to save production costs by negating the need to build or visit large, distant, or expensive sets. By limiting live action to occur in a well-defined window inside the picture frame, static mattes could be used. However, when moving objects were to be masked out, traveling mattes were needed. These give filmmakers more freedom but are harder to accomplish, especially with traditional chemical and optical techniques.
The most laborious way to create traveling mattes was, and still is, through rotoscoping, invented in the early 1900s by Max Fleischer (Fleischer Studios). The original rotoscope was a piece of equipment that projected prerecorded film frames onto a glass panel, where an animator could precisely redraw the projected shapes. While rotoscoping was an effective way to create realistic animated characters, like Disney's Snow White (1937) and Cinderella (1950), it



was also a convenient approach to creating precise traveling mattes by tracing the outline of actors or moving objects.
Rotoscoping is not only a very time-consuming task, but it also requires great animation skills. A poor and inconsistent hand-drawn mask is very disturbing to see on screen, as contours may move and flicker. Color keying, or chroma keying, introduced in the 1930s, is a well-known process for creating traveling mattes of moving actors or objects shot in front of a uniform blue or green background. The holdout matte for the foreground, or the male matte, was chemically developed by rerecording the frames on an optical printer using filters to block out the tinted background. Similarly, filtering out all non-blue colors created the female matte. Using these matte images, the foreground and background layers could be composited together seamlessly. Alas, the bluescreen technique had several drawbacks: Actors were not allowed to wear blue clothes or makeup, and even so, there was noticeable color spill on the edges of the matte, in particular at semi-transparent regions, such as the hair.
An alternative process, developed by the Walt Disney Company, was the sodium vapor process, whereby the actor, wearing any color, was filmed in front of a white screen. A strong sodium vapor lamp, which has a very narrow spectrum not picked up by the sensitive layers of standard color film, was used to light the scene. With a custom-made camera recording every frame on two separate filmstrips simultaneously, a second black-and-white film (sensitive to the sodium light) was then exposed. On this second film, a neat female matte was created. At the time of its use (during 1940-1960 in movies like The Birds or Mary Poppins), the sodium vapor process yielded cleaner results than bluescreen.

Keying Concepts Today
As technology advanced in the past several decades, greenscreen and bluescreen technology, and keying in general, has been perfected in many ways. With custom materials (such as retro-reflective fabrics), advanced lighting kits, and modern cameras, it is easier than ever to create an evenly lit screen for keying. Hardware and software solutions are now able to pull a perfect key by maintaining the original edge quality and removing color fringing/spill, even if strong film grain or video (especially DV) artifacts degrade the images.
Rotoscoping is still a commonly used technique to deal with shots where no automatic keying/matting is possible. Current roto applications use a wide range of tools, including advanced freeform curves with variable softness, realistic motion blur, automatic tracking of planar shapes, and paint functionality, to make this still-laborious task easier and the resulting matte look better. Today's popular matte-creation tools, based on keying or skilled animators' frame-by-frame tracing, are essentially built upon the principles of the chemical and optical processes developed half a century ago.
Matte creation still is, however, an actively researched field with fascinating new results. Natural image-matting algorithms are designed to extract a matte of an object (even a hairy one) in front of an arbitrarily textured background. To guide the process, one has to provide an initial segmentation of the image: foreground, background, and unknown regions. This input image, called the trimap, does not require significant time to create; sometimes even a few brushstrokes are sufficient. It is the software's job to figure out the transparency value for each pixel in the unknown region.
Almost all the current natural-image matting techniques are based on the assumption that in each small image window near the object boundary, the color of both the foreground and the background is smooth. Hard edges on the input image are due to opacity change at the object's boundary. If this statement does not hold, the user has to manually adjust the matte to correct the poor areas of the resulting matte. Thus, by computing not only transparency, but also the clean image layers, free from any color spill, we can composite the extracted foreground object onto another, perhaps computer-generated, background seamlessly.

Matte generation with a few brushstrokes (Levin, Lischinski, Weiss: "A Closed Form Solution to Natural Image Matting").



These innovations are certainly finding their way into software. However, we have to wait a bit longer for easy and quick, high-quality matte generation for film-resolution image sequences based on a few brushstrokes. Until then, color keying and advanced tracking solutions may help us out. Having a nice matte for all our image layers, however, does not guarantee a perfectly realistic end result. It is the compositor's job to deal with the issues of non-uniform grain structure and video noise, differences in color, contrast, and exposure, or motion artifacts when merging images from different sources. Anyone interested in integrating rendered CG elements into film or video should not only learn the nuts and bolts of compositing, but also how to pull a perfect matte effectively.


Broadcast: Stereo

Super Sunday's Best
By Debra Kaufman

Super Bowl XLIII, which pitted the Pittsburgh Steelers against the Arizona Cardinals, was a nail-biting game that ended in the Steelers beating the underdog Cardinals. The Steelers also secured a place in Super Bowl history, earning their sixth world championship win and sole possession of the record for the most Super Bowl wins.
The game also took place in the midst of the US's most profound economic downturn since the Great Depression. But that only increased viewership in a nation looking for diversion: This year's Super Bowl became the most watched yet, with 98.7 million Americans tuning into the game on the NBC network.
A total of 32 companies presented 84 commercials throughout the event, and NBC aired 45 minutes, 10 seconds of advertising (including NFL messages and NBC promotional plugs) between the opening kickoff and the final whistle. The top four advertisers in terms of total ad time were PepsiCo, Anheuser-Busch InBev, General Electric, and Viacom.
Several commercials, including those from CareerBuilder, Monster, and Cash4Gold, reflected the country's anxious mood. But the spots that really rocked integrated what Super Bowl spots always do best: innovation, laughs, and a strong story.
As always, upcoming theatrical releases were a top category, with promos for nine movies (one of them the 3D animated Monsters vs. Aliens). Car manufacturers had less of a presence due to their economic woes, although one of our favorite commercials came from tire manufacturer Bridgestone ("Hot Item"). Meanwhile, Coca-Cola and Pepsi battled on the airwaves, each of them with strong contenders. We're fond of PepsiCo's SoBe "Lizard Lake" and Coca-Cola's "Avatar," both of which combined a strong CG sensibility with clever stories and impeccable execution.
And what's the Super Bowl without the Budweiser Clydesdale horses? Super Bowl 2009 featured two spots with the iconic equines, and our favorite was "Circus," a romance about the power of love between a Clydesdale and his ladylove Lucy, a circus horse.
General Electric, parent company of broadcaster NBC, also scored with "Scarecrow," a wired take on The Wizard of Oz that made a spot-on match between the character and the product in a breathtaking example of computer-generated imagery.
We hope you enjoy the making-of as much as we enjoyed the commercials.

Circus
Budweiser

Director: Joe Pytka
Agency: DDB Chicago
Production company: Pytka, Venice, CA
CG company: Filmworkers Club, Chicago

A Clydesdale nuzzles his ladylove Lucy, a horse in a circus, but when the circus leaves town, it seems that the romance is over. Our Clydesdale hero jumps fences and gallops across America to reunite with Lucy, who bucks off a bareback rider and astonishes the clowns as she and her mate burst through the circus tent together.
Filmworkers Club first heard about the concept for the spot back in November from the DDB Chicago creative team that was responsible for last year's successful "Clydesdale Team" commercial (see "Fan Favorites," April 2008).

Using bi-cubic warping within Smoke, Filmworkers Club made it appear as though this horse was jumping over a canyon rather than a fence.

The exquisite views of America, as our horse hero gallops to find Lucy, are based on super-high-resolution (8k and 10k) stock footage from iStock. Filmworkers Club then manipulated those stock images to create the most spectacular shots possible. "Our idea was to create a heightened sense of beauty and reality," explains visual effects director Rob Churchill. "We wanted America to look beautiful, so the countryside scenes are more saturated, heightening the reality."
Colorist Michael Mazur took the process halfway with a first color pass for compositing, and then he completed the color after the composite. "That way, the whites aren't blown out and the edges are crisp," says Churchill.
The commercial was essentially created in compositing, using a combination of live-action elements, stock footage, and lots of paint, relighting, and digital elements. Filmworkers Club assigned a team of five compositors (Heidi Anderson, Chris Ryan, Rick Thompson, Jen Paine, and Churchill) who worked not just on "Circus," but on a total of nine Super Bowl spots in-house.

The live-action spot "Circus" required quite a bit of digital work, and not all of it involved making the horses do super feats. In the scene above, the inside of the circus tent was shot live (left); compositors later added the clown and performers to the plate.

According to Churchill, the most challenging scene was when the Clydesdale
jumps over the canyon. "He was shot jumping over a fence," he explains, "but we had to fly the horse across the entire canyon." Using bi-cubic warping, a feature in Autodesk's Smoke, Churchill was able to make the horse look as if it was jumping outward rather than over.
"It's a detailed stretch, but you can only stretch parts of the picture," Churchill says.
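The idea of stretching only part of the frame can be illustrated with a small sketch in Python using NumPy and OpenCV. This is my own simplified example of a region-limited warp with bicubic resampling, not the Smoke feature the artists used; the function, its parameters, and the file name are hypothetical.

import numpy as np
import cv2

def stretch_region(img, center, radius, shift_x):
    # Warp only the pixels near `center`; the rest of the frame is untouched.
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Smooth falloff: 1 at the center of the region, 0 beyond `radius`.
    dist = np.sqrt((xs - center[0]) ** 2 + (ys - center[1]) ** 2)
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0) ** 2
    # Each output pixel samples from a position shifted back by the local
    # offset, resampled bicubically.
    map_x = xs - shift_x * falloff
    map_y = ys
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_CUBIC)

# plate = cv2.imread("horse_plate.png")                 # hypothetical input frame
# warped = stretch_region(plate, (900, 400), 250, 80.0)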

Lizard Lake
SoBe

Director: Peter Arnell
Agency: The Arnell Group
Production company: The Arnell Group
CG company: Digital Domain, DreamWorks Animation

In the weeks leading up to the Super Bowl, TV viewers were tantalized with more than the competition on the field: the chance to see a SoBe commercial in 3D, as well as a trailer from DreamWorks Animation for Monsters vs. Aliens. PepsiCo's SoBe joined its CG lizards with the stars from DreamWorks Animation's upcoming theatrical release and NFL players Matt Light (New England Patriots), Justin Tuck (New York Giants), and Ray Lewis (Baltimore Ravens). What brings them together is a highly unlikely event: They're all participating in a ballet performance of Swan Lake.
Directed by Peter Arnell, the spot opens on sheet music, where the "Swan" in Swan Lake is crossed out and replaced with "Lizard." As the curtain opens, we see the NFL players in white uniforms, cleats, and tutus attempting their best to plié and jeté across the stage.

"There were tons of rotoscoping to get rid of the fence that was blocking part of the horse's leg, which I made up with bits of cloning, tracking, and paint," Churchill adds. He also relit the horse to make it match the background, using mattes and 3D lights inside Smoke.
In the first, early-morning scene, Thompson relit the barn to make it look like morning and, to sell that further, added little digital insects flying around the horses. Ryan, meanwhile, painted the outside of the barn.

When Ray Lewis steps out for a bottle of SoBe Lifewater, the stumbling dancers send the bottle hurtling through the air. The characters from Monsters vs. Aliens and the SoBe lizards invade the stage, and the ballet quickly turns into a wild dance party.
In charge on the CG side was Digital Domain visual effects supervisor Jay Barton, who also led the studio's team on last year's SoBe Super Bowl spot starring supermodel Naomi Campbell. Due to the limited amount of time with the NFL players, the filming occurred in two days: one with the hero characters and a second day for the secondary dancers and stunt doubles. Production company 3ality handled the 3D shoot, using Sony HD cameras on a stereoscopic rig.
"In a regular 2D spot, you've got one camera to shoot two different people on greenscreen," says Barton. "You can do a lot of fudging to get them to work together in a scene. It becomes quite a different matter to do that in stereo 3D. You not only have to make them work in the correct size and screen position, but now, also, in correct depth."


Filmworkers Club also added the circus tents. The artists started with stock footage as the base structure, and then used Smoke to paint in the tent. Later, they relit the tent and Ferris wheel elements in the opening scenes. The group also crafted flags that would rustle in the wind, and built a matte painting of the barn door for when the Clydesdale decides to leave the barn. The group relit the barn exterior as the horse runs out, and the establishing shot of the barn as well, using CG sun and sunbeams and adding CG insects fluttering around for good measure.
"Jen [Paine] repainted all the foreground trees but left room [for viewers] to see the Ferris wheel, which she put in the background," says Churchill. "All of that work makes it say circus."
The crew also replaced many of the backgrounds. The inside of the circus tent was shot greenscreen, and Thompson composited the clowns and other circus performers with a new inside-the-tent background composed of stock footage that was digitally enhanced and manipulated. In the scene where the silhouetted Clydesdale is running toward the circus in the distance, Filmworkers Club used a shot of the horse inside the barn running toward the exit, stripped him out of that scene, and put him in this scene. Then, they added a nighttime CG/matte painting/live-action exterior.
The compositors even replaced the skies. "The idea was to have the spot start in the morning, go through the day, and end at night," Thompson says. "You couldn't plan to shoot at all the right times of day, which is why we had to replace the skies throughout." The sole use of Autodesk's Maya was to put digital bridles on the horses in the beginning sequence when they're nuzzling over the fence.
"This is the most exciting time in the yearly cycle," contends Churchill. "The agencies are giving it their all, and so are we."

"Lizard Lake," featuring the SoBe lizards, NFL players, and characters from Monsters vs. Aliens, was the second stereo event during the Super Bowl, following the 3D trailer of Monsters vs. Aliens. The following evening, an episode of Chuck was shown in stereo, as well.

Avatar
Coca-Cola

Director: Smith & Foulkes
Agency: Wieden + Kennedy, Portland, OR
Production company: Nexus Productions, London
CG company: Nexus Productions

CG avatars share the scene with live actors in this ad from Coca-Cola.

Coca-Cola can always be counted upon to create a playful commercial for the Super Bowl, and 2009 was no exception.

"We talked about where the characters all belong in that 3D space. Everyone was savvy enough that we had the language to know where the characters were in the 3D space and where the convergence would be. Some of our phone calls probably sounded like we were speaking Greek."
"Shooting live action and compositing CG characters is already difficult," notes DreamWorks global stereoscopic supervisor Phil "Captain 3D" McNally. "To add the third dimension, we had to coordinate our stereo settings, the z axis, as much as we could in advance. We all sat around a table and agreed between us what the ballpark was we'd be working in, so 3ality could set up its cameras and we and Digital Domain could set up our virtual cameras and coordinate."
3ality's camera rig provided metadata with camera settings, zoom information, and the interocular distance (between the left and right eyes) and point of convergence. "For all intents and purposes, it's a motion-control rig," describes Barton. "They provided us with that metadata, and we used it to set up our initial 3D cameras in Track, our in-house tracking system. Based on that initial data, we fine-tuned the two cameras to lock to the plate. Then we transferred that data onto a rig that was used by us and DreamWorks to render out [Autodesk] Maya scene files of our characters and CG environment."
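To make the interocular and convergence metadata a little more concrete, here is a rough Python sketch of how screen parallax falls out of a simplified parallel-rig model with a horizontal sensor shift. It is my own illustration, not 3ality's rig math or Digital Domain's Track system, and the example numbers are invented.

def screen_parallax(depth, interocular, convergence, focal_length):
    # Disparity (in sensor units) of a point at `depth`, for parallel cameras
    # separated by `interocular` whose images are shifted so that objects at
    # the `convergence` distance have zero disparity. Positive values read as
    # "in front of the screen," negative as "behind" it.
    return focal_length * interocular * (1.0 / depth - 1.0 / convergence)

# With a 30mm lens, a 6.5cm interocular, and convergence set 4m out, an object
# 2m from the camera sits in front of the screen:
# screen_parallax(2.0, 0.065, 4.0, 0.030) -> about +0.00049 (meters on the sensor)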



(Remember 2008's aerial struggle between the Stewie and Charlie Brown balloons in the Macy's parade?) This year, Coca-Cola played with the idea of computer avatars, young adults, and Coke as the beverage that brings people together. In a charming, free-flowing spot, people transform into self-absorbed avatars as they move about their daily business in New York City. A young man navigates the lonely crowd and enters a diner, where he reaches for a Coke and transforms the avatar next to him into a lovely young lady. "Coke opens happiness" is the tagline.
Nexus Productions shot the live action and produced all the CG avatars, while Framestore CFC London did the compositing. Ben Cowell, Nexus head of 3D, first looked at the agency brief for the commercial in August, and spoke to Nexus directors Alan Smith and Adam Foulkes about the look. "They described the spot as though it were all shot for real, as though we really found a part of a city somewhere in the world where everyone was plugged into their devices," says Cowell. "The camera would be a passive observer, much like our hero."
Smith and Foulkes also described how the people would transform into their avatars through the use of 3D pixels, or voxels, while Cowell, Matt Clark, and Luis San Juan Pallares wrote the software that would achieve that. "That way, we could then animate this effect over time to create the transformation," explains Cowell, who describes how the real people turn into game avatars.

The spot "Avatar" incorporates approximately 60 CG characters, each a reflection of the world in which it resides, into live plates. They were lit with HDR maps captured on set.

The first step was to do the scene registration and obtain the camera data for the shots that DreamWorks would work on. "When you have all these layers shot at different times, we had to come up with a common scene file, to put everything in the correct place," says Barton. "It had to work for everyone, and we created a 3D scene that we could all work on for all the characters." McNally reports that Digital Domain took the camera data from 3ality and gave it to him. "We then put in our characters, The Missing Link, Dr. Cockroach, and B.O.B., and sent it to Digital Domain," he says.
DreamWorks' main animator for the movie Monsters vs. Aliens is David Burgess, and he took on the "Lizard Lake" shots as the sole animator, working for a two- to three-week period, with lots of back and forth with Digital Domain.
Both Barton and McNally note that working with director Arnell isn't a typical experience. "Every time we'd see a new shot, there would be 10 more lizards in it," says McNally. "Luckily for us, Digital Domain was in control of the SoBe lizards. We had five shots that our characters were in, so we had relatively light work. Digital Domain did the heavy lifting."
In addition to its 20 shots, Digital Domain also created 3D backgrounds for every shot, which boiled down to rendering elements for nearly 40 shots total. "We continued creating our 3D scenes and comps to make sure all the live-action characters fit in the stage, while we developed what that stage would be," he explains. "The director has a real designer's eye and was particular about the values we used in the background."
The lizards were also updated from last year's Super Bowl and "Summer Magic" spots with Naomi Campbell. "We used new technologies," says Barton. "It was more about the way we created our displacement maps. We've always had subdivision-surface characters with displacement maps for texture. For this spot, we gave more details by repainting the displacement maps. We also added new wardrobe: the whole football uniform with cuffs, shoulder pads, and helmets for the football-playing lizards. We also did the cheerleader uniforms and pompoms, and a tuxedo for the conductor lizard."
All of Digital Domain's character animation was set up in Maya and exported out and rendered through NewTek's LightWave. "All our coloring, lighting, and texturing was done in LightWave," says Barton. "It was a combination of the right tool for the job and the preference of the available artist." Meanwhile, the team used Side Effects Software's Houdini for the fluid simulation of the drink flying out of the SoBe bottle. At the beginning when the curtains open, the artists employed Maya nCloth to move the curtains, after which they piped the imagery through Houdini to get even more breakup and wrinkle action after the fact.
Compositing software Nuke also played an instrumental role.



Smith and Foulkes put together a storyboard, which they transformed into a 2D animatic that was hand drawn based on photographs; then they scanned it into Adobe's Photoshop, cleaned it up, and imported it into Adobe's After Effects for editing. "Normally with a project like this, much of the creativity happens up front in the 3D animatic, as it's far too expensive to make decisions on set," says Cowell. "However, with this project, much of the camera work would be a reaction to the environment and characters. We decided to go on location to Buenos Aires [Argentina] early and spend a week or so scouting and taking photographs and imagining how this could be pieced together."
That decision proved fruitful, maintains Cowell: Every night the group created a fresh 2D animatic using the new photographs. From Framestore London, visual effects supervisor Mike McGee also attended the shoot. "That got us thinking about lighting and how these shots would link together, long before the shooting began," he says. On set, they collected HDR maps, which helped tremendously in the lighting of the CG characters.
"Patrick Krafft and Maelys Faget did a fantastic job of getting these characters to fit into their environment," says Cowell.
Meanwhile, back in London, Nexus senior 3D lead Dave Fleet ran the character modeling and design process, which relied on Autodesk's 3ds Max and some Pixologic ZBrush work. "Given how many games, chat systems, and online environments we were hoping to evoke, it was quite a challenge to create the 60 or so characters required," says Cowell. "Each needed to feel that they came from their own world, so a lot of work went into the design and modeling phase to ensure they didn't end up looking too generic."

Artists added digital scuff to make it seem as if the avatars didn't really belong in the real world.

In Lizard Lake, the stereo SoBe lizards move and groove, and change
colors, thanks to Digital Domain and assistance from DreamWorks.
Originally developed by Digital Domain, Nuke is now distributed by The Foundry, which has created Ocula, a suite of tools specifically for dealing with 3D imagery. "Those tools were absolutely vital to what we needed to do," says Barton. "They streamlined the process and made 3D accessible on every desktop with regard to combining left- and right-eye images so everything in the composite happens to both sides equally and correctly. We also used them to warp, stretch, and tweak the live-action plates as necessary to make them stereo-perfect." The Ocula tools were also used to warp the live-action characters, place them in space, and put them in the correct stereoscopic distance.
Digital Domain checked its work daily in the 3D screening room with shutter glasses. But each artist could also view stereo 3D, with anaglyph glasses, while working in Nuke. "They could adjust the stereo while they were doing it," says Barton. "In the old-school way of doing 3D, you'd make your best guess and then view it after the fact. These new tools that enable people to view real time in stereo are pretty cool!"
Just before delivering the project, McNally helped Barton and his team tweak the entire spot to make sure the stereo worked in the best possible way throughout. "We got his Captain 3D stamp of approval," says Barton. McNally demurs: "We were just there to provide our assets at the beginning and help DD with anything we could in terms of ideas or suggestions about the stereo," he says.
The Monsters vs. Aliens trailer that ran during the Super Bowl was, of course, done entirely at DreamWorks Animation. "Obviously, most of the content comprises shots from the movie," says McNally (see "Monsters of the Deep," March 2009). "What we added was a transition from the DreamWorks logo, and then we added the glasses."
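The anaglyph preview the artists relied on is easy to reproduce outside Nuke. The sketch below is a generic red/cyan composite in Python with NumPy, my own illustration rather than the studio's setup: the red channel comes from the left eye, green and blue from the right.

import numpy as np

def anaglyph_preview(left, right):
    # left, right: (H, W, 3) float RGB images in [0, 1]
    out = right.copy()
    out[..., 0] = left[..., 0]   # red carries the left-eye view
    return out                   # green and blue carry the right-eye view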




Each character needed some evidence of their chat or online interface, and this was designed in the same way Nexus Productions tackled the characters, making sure each one felt as though it came from a different world. These were all animated in Adobe's Flash by Kwok Fung Lam.
Once the edit was locked, animation began. "This is always the fun part of the project, as it's the chance for directors to insert as many details as they can," says Cowell.
At Framestore, visual effects supervisor William Bartlett, assisted by Darran Nicholson, did the compositing within Autodesk's Flame. Bartlett notes that the company's chief contribution was making the avatars appear as if they didn't quite belong in the real world. They did so by adding digital scuff, or a bit of noise and compression.
"We have various features in [Autodesk] Flame that can mimic what JPEG compression looks like, as well as scanlines and RGB color shifts, to make [the characters] appear as avatars you'd see on a computer screen," says Bartlett. "We were doing the finishing touches, but Nexus led the job."
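The "digital scuff" treatment can be approximated outside Flame as well. The following is a rough Python/OpenCV illustration of the same idea (heavy JPEG blocking, an RGB offset, and scanlines), not Framestore's actual setup; the quality and shift values are made up.

import cv2
import numpy as np

def digital_scuff(frame, jpeg_quality=15, shift=2):
    # frame: 8-bit BGR image as loaded by OpenCV
    # Heavy JPEG compression introduces the familiar 8x8 block artifacts.
    ok, encoded = cv2.imencode(".jpg", frame, [int(cv2.IMWRITE_JPEG_QUALITY), jpeg_quality])
    scuffed = cv2.imdecode(encoded, cv2.IMREAD_COLOR)
    # Nudge the red channel sideways for a cheap chromatic-offset look.
    scuffed[..., 2] = np.roll(scuffed[..., 2], shift, axis=1)
    # Darken every other row to suggest scanlines.
    scuffed[::2] = (scuffed[::2] * 0.85).astype(np.uint8)
    return scuffed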

Scarecrow
General Electric
Director: Traktor
Agency: BBDO New York
Production company: Traktor,
Venice, CA
CG company: Framestore CFC NY

Selling something as seemingly abstract as a smart grid sent General Electric to Oz. BBDO New York and Traktor brought the famous scarecrow back to life, and he, in turn, brought GE's product to life with some warm and fuzzy thrown in for good measure.
"It was an interesting challenge for us," says David Hulin, Framestore CFC NY visual effects supervisor. "We do character and creature animation, and scarecrow was a humanistic, naturalistic character."
To create a humanistic character, it made sense to shoot a real person, and that's what the group did.

When editing 3D content, McNally says he always keeps a close eye on how the shots fit together. "It's hard for your eyes to keep up with depth jumps, such as when a shot that's deep cuts to one that's close," he notes. "We smooth out the transitions by adjusting the stereo across the cut. It's invisible to the viewer but makes it easier to watch."
McNally continues: "When you cut a trailer, the pieces are now rearranged and trimmed tighter than before, so the blending pass becomes much more important. A trailer is more like an action sequence, so we have to craft the transitions to keep up with the depth jumps."
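Smoothing a depth jump amounts to easing a horizontal image shift on one eye over a handful of frames around the cut. The sketch below is a bare-bones Python illustration of that blending pass, not DreamWorks' pipeline; the easing choice and frame counts are arbitrary.

def eased_shift(frame_index, ramp_frames, shift_before, shift_after):
    # Horizontal image translation (in pixels) applied to one eye so perceived
    # depth glides from the outgoing shot's setting to the incoming shot's
    # setting instead of jumping at the cut (frame_index 0 = the cut).
    t = min(max(frame_index / float(ramp_frames), 0.0), 1.0)
    smooth = t * t * (3.0 - 2.0 * t)          # smoothstep easing
    return shift_before + (shift_after - shift_before) * smooth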
McNally also brought on the Danish company Color Code 3D, which provides a technical process to ensure color fidelity in 3D; the general 3D process often results in less brightness and muddier colors.
"Color Code took the images, put them through the software process, and then checked them with the glasses," explains McNally. "The colors need to be pure. In cinema, the colors are pretty much how we want them. In the 3D version, we do a slight color correction to deal with the tint of the glasses. In that sense, when you go to the 2D and 3D versions of the movie, you see full color as we've designed it."

A dancer, wearing a blue suit with markers, stood on a 20-foot platform and performed the movements, some of the more acrobatic ones requiring him to swing on wires. This capture wasn't motion capture in any traditional sense of the word, but the blue suit and markers were a way to get a bit of a leap on the painstaking rotoscoping that would be required to replace the actor's body with a CG scarecrow body.
"We did it a bit like mocap, without the mocap," says Framestore NY lead Flame artist Murray Butler. To add to the realism and physicality, a practical scarecrow head was built, which the dancer wore during the filming. After the shoot, a team rotoscoped the actor by hand, "a huge job," notes Butler.
Senior TD Theo Jones, who was CG lead for the commercial, notes that the spot ended up being 90 percent CG. That's because, ultimately, the physical scarecrow head worn by the actor didn't work as planned, mainly because of issues with articulation and lip sync.
The crew had to composite CG characters into live action, coordinating the stereo settings as much as possible ahead of time. Transitions
from deep shots to closer shots had to be gradual.
"The idea is to preserve as much of the color as we can while gaining as much depth as we can."
It was important to have the trailer look fantastic since, after all, it's a little taste to get viewers interested in the movie. "3D is now a completely different experience in the movie theater, where it's more controllable than it was in the 1950s," says McNally. "Jeffrey Katzenberg is behind 3D and wants to show that we're behind it. Seeing Monsters vs. Aliens in the theater will be like being at the Super Bowl rather than seeing it on TV."
Debra Kaufman



The GE scarecrow, which comprises thousands of wires instead of the traditional straw, was
rigged using Maya nCloth. Hand animation and other subtle motions sold the concept.

Suddenly, it became apparent that Framestore would also be replacing, in nearly all the shots, the scarecrow's head. Fortunately, Hulin, who attended the shoot, was able to bring the physical scarecrow head back to Framestore to be photographed and then accurately modeled. "It was already designed and built, which was good," Hulin says.
In the end, the practical scarecrow head was retained in three shots; in all the other appearances, it was CG.
Creating a believable scarecrow body was also a challenge. Jones struggled with the best way to create a rig that would realistically represent the thousands of wires that compose the scarecrow. After trying Maya Hair from Autodesk and thinking about writing an in-house system, he built the rig using Maya nCloth. "It turned out to be a very dynamic rig," says Jones. "We made strips of nCloth and used its collision to simulate the collision in every frame, then cached it out. It took five minutes a frame to calculate all the dynamics."
For both rotoscoping and animating the CG scarecrow, the crew relied on information gathered by two extra cameras on the set that were placed at 90 degrees from the principal camera. "We had three viewpoints to rotoscope from and animate," says Jones. Tricky bits included the shot where the scarecrow/dancer walks a tightrope on electrical lines.
"We shot the dancer on wires because he was doing acrobatic moves," adds Hulin. "But because he was on wires, he didn't have the weight and impact of a 170-pound guy. We had to spend time to animate those physics back in, to feel that he was really metal and wire. We didn't want him to feel like an animated character but a real-world wire man."
Framestore also built a set in CG and augmented that with matte paintings for the hills and sky, using Maya and Adobe Photoshop, and composited all the elements in Apple's Shake before the final beauty composite. "In the final shot, the gate and cobblestones at the end were live action, but everything else was CG," Hulin says. Details that sold the scarecrow included the scarf he wears around his neck and loose wires wrapped around his waist and upper arms. "Every single shot has some hand animation," says Jones. "It was a huge animation job."
Butler, meanwhile, did the composite in Autodesk's Flame. "The biggest challenge was blending everything together to create an overall look," he says. "We gave the skies a Maxfield Parrish blue/yellow/pink look: cool, but not overwhelming. And, of course, we had to get rid of the bluescreen dancer in every shot. That was painstaking and had to be done before we started the comp."
Jones reports that, despite the massive amount of work, the job went remarkably smoothly. "This was a good spot to work on," concludes Jones. "It's nice to see a script with a creative idea you can get your arms around."

Hot Item
Bridgestone

Director: Daniel Kleinman
Agency: The Richards Group
Production company: Rattling Stick/Epoch Films
CG company: Framestore CFC NY

A dune buggy, a UFO, and House of Pain come together in the rollicking comic "Hot Item" for Bridgestone tires. To the tune of the dance-floor hit "Jump Around," the spot requires no dialog as we see two space-suited men collecting rocks on a moon of Saturn, and returning to their buggy in time to see a UFO fly away with their Bridgestone tires.
"There was a design on paper," recalls Framestore TD David Mellor, who was the CG lead. "But it was challenging; the concept was quite loose. We had to develop it and make it work with the buggy, which was shot on location. As a result, we became involved with the design."
The commercial was shot by director Daniel Kleinman in the Mojave Desert in California, and not just for the otherworldly terrain. "We knew the moon would be lit by a single source, and in the Mojave, you just had the sun in a big blue sky that could be that single source," says Mellor.
Framestore senior producer James Razzall and compositor Murray Butler attended the shoot where, over two very hot days, Kleinman filmed the actors wearing spacesuits and careening around
in a stripped-down dune buggy over the desert. The desert's stark terrain was transformed into the surface of a Saturn moon, with matte-painted elements and some still photos of unusual rock formations found in the Mojave, the latter shot by Framestore London shoot supervisor William Bartlett.
Except for the wheels and suspension, the real-world dune buggy was entirely replaced with a CG version. That made tracking a big challenge: The crew shot from a camera car that followed the dune buggy, so they couldn't place any markers on the actual dune buggy, yet the wheels and tires had to stay in motion to capture all the appropriate dust elements.
"We built a rig in the computer that matched the real buggy and tracked it in [Autodesk's] MatchMover," says Mellor. The group used the exact movement as a starting point, and then added tweaks with hand animation if something didn't quite work as planned. Then they added final digital tweaks, such as jet boosters and decals.
The wireframe and all the 3D modeling and animation were done in Autodesk's Maya. The smoke trails and jets were created in Side Effects Software's Houdini and rendered in Mantra.
Aside from the wheels and suspension, the buggy used in the commercial is digital.

All the lighting information was captured at the shoot with HDR images. The team created a still of a spacescape, using stars, planets, and nebula patched together in 2D using Apple's Shake. "We used those to create our own HDR image, and replaced the original sky," says Mellor. "We used those HDR images of the environment to light the buggy. It was rebuilding an alternate environment."
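Image-based lighting of this kind boils down to treating every pixel of the captured environment as a small light source. The numpy sketch below is illustrative only and not part of Framestore's pipeline; it shows one common back-of-envelope use of such a map, estimating the brightest, sun-like direction in an equirectangular HDR image, and it assumes the image is already loaded as a floating-point array.

import numpy as np

def dominant_light_direction(hdr):
    """Estimate the brightest direction in an equirectangular HDR environment map.

    hdr: float array of shape (height, width, 3), latitude-longitude layout.
    Returns a unit 3D vector (x, y, z) with y up.
    """
    h, w, _ = hdr.shape
    # Per-pixel luminance (Rec. 709 weights).
    lum = hdr @ np.array([0.2126, 0.7152, 0.0722])

    # Spherical coordinates for each pixel center.
    theta = (np.arange(h) + 0.5) / h * np.pi          # 0..pi, top to bottom
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # 0..2pi around
    theta, phi = np.meshgrid(theta, phi, indexing="ij")

    # Solid-angle weighting so pole pixels don't dominate.
    weight = lum * np.sin(theta)

    # Luminance-weighted average of the per-pixel direction vectors.
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    d = (dirs * weight[..., None]).sum(axis=(0, 1))
    return d / np.linalg.norm(d)

# Toy example: a map that is dark except for a bright "sun" patch.
env = np.zeros((64, 128, 3), dtype=np.float32)
env[10:14, 30:34] = 50.0
print(dominant_light_direction(env))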
Adding to the reality of the spacescape were some 3D elements, including floating rocks, and planets and rings of Saturn that rotated. "It gave the frame a bit more life, so space didn't look so static," Mellor points out.
Creating realistic CG reflections in the helmets on the spacemen was another challenge. "We had to track the helmet movements and render the reflection in the helmet so it would match up with the environment," explains Mellor. Likewise, the UFO was created in 3D and rendered in a similar manner, as a reflection to be placed in the helmets.
"That was quite fun, and was one of the last things we did," Mellor adds.
Compositor Butler says he had six weeks to do the entire tricky composite. "We had to clean up the edges of the steel cage and make sure the spacemen were behind the CG," he says. "We had to make daylight disappear by cutting mattes for all the rocks and edges very precisely and replacing it with a night sky. A bit of lens flare and luminance from the stars also helped to sell it."
In all, seven animators and seven TDs worked on the spot.
"It was definitely a fun job to do," concludes Mellor. "We had a great bit of creative involvement, and it's space and rockets!"

Debra Kaufman is a freelance writer for numerous entertainment industry publications. She also covers video and other entertainment content for the mobile platform at www.MobilizedTV.com. She can be reached at dkla99@verizon.net.




A filmmaker experiments with a new CG pipeline based on the Unreal Engine 3 while creating Chadam. By John Gaudiosi

When it comes to the global $50 billion video game industry, there's no game engine technology more ubiquitous than Epic Games' Unreal Engine 3 (UE3). The same technology that powers Epic's own Gears of War 2 and Unreal Tournament 3 is being used for a vast array of games, including Electronic Arts Digital Illusions Creative Entertainment's (EA DICE) first-person adventure Mirror's Edge, Ubisoft's voice-controlled Tom Clancy's EndWar, and Midway Games' TNA Impact! wrestling title.
Leave it to a former game developer, turned Hollywood executive, turned filmmaker to utilize UE3 in a whole new way. Jason (Jace) Hall, founder of Hollywood entertainment company HDFilms, has spent the past year or so creating a pipeline that he hopes will change the way CG entertainment is developed for the Web and television.
A short time back, Hall had overseen the creation of the Lithtech engine at Monolith Productions, developed in conjunction with Microsoft, and was the primary person responsible for licensing the technology to other developers. "Because of my background in developing games, I have a lot of experience understanding how in-game cinematics come together, what the limitations can be, and once you set the limitations, how rapidly you can make changes or get someone up to speed who has limited experience in constructing cinematics," explains Hall. "I thought, in theory, you could take game technology and create a one-hour story."
Today, Hall is putting that theory to the test, two times over. First, he is working on Chadam, an original 10-part Web series featuring characters by Alex Pardee that he created for the band The Used. More than a year ago, Warner Bros. Television Group (WBTVG) announced plans to offer the series through its Studio 2.0 production division. According to Hall, that should occur sometime this year. Hall's second project is still undecided, though he says he would like to try something less violent and less dark than Chadam, suggesting a sugary girl-focused story with music as one possible idea. The concept would be to show two diverse examples of how the development pipeline he's laying, on top of the UE3 foundation, can be implemented for any type of CG project. Hall is especially interested in bridging the gap between linear and interactive entertainment. After all, he got his start developing first-person shooters.

Intended as a Web and television series, Chadam is a CG project whose production pipeline is built on top of the Unreal Engine 3 game engine.

Hall founded development studio Monolith in 1994, which went on to create games such as Shogo, No One Lives Forever, FEAR, and The Matrix Online. Seven months after Warner Bros. Interactive Entertainment (WBIE) hired Hall as its senior vice president in January 2004, WBIE purchased Monolith. During his tenure, Hall oversaw a number of game hits (FEAR, Batman Begins) and misses (Catwoman, Superman Returns). In February 2007, he left the executive world to seek out new creative opportunities at HDFilms.
The story that Hall and his team are telling through HDFilms focuses on the character Chadam's battle against Viceroy, a serial killer who's out to destroy the world. The whole concept revolves around the use of the imagination and an overarching story about how someone's imagination can be so powerful that it can affect the lives of those around that person.

Laying the Pipeline

Hall has an in-house core team of eight people working on world construction for Chadam. The art pipelines were designed similar to the way games are built today, except the team didn't have to be concerned with gameplay or frame rate. Instead, it's all about telling a story and pushing resolution. "Our goal wasn't to create the most realistic, detailed world, but more on developing a story," says Hall. However, some of the sets built for Chadam are pushing well over hundreds of thousands of triangles, along with some textures at 4096x4096 pixels. "Since our concerns were renders rather than frame rates, our limitations, as far as quality was concerned, is pretty much, at the moment, limited to our computer specs."
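A quick back-of-envelope calculation shows why textures of that size are comfortable for this kind of offline-minded production but would be a tall order inside a shipping game. The figures below assume uncompressed 8-bit RGBA and a full mip chain, which are assumptions on our part rather than details Hall provided.

# Back-of-envelope memory cost of a single 4096x4096 texture,
# assuming uncompressed 8-bit RGBA (4 bytes per pixel).
width = height = 4096
bytes_per_pixel = 4

base = width * height * bytes_per_pixel
with_mips = base * 4 / 3          # a full mip chain adds roughly one third

print(f"base level:   {base / 2**20:.0f} MB")       # ~64 MB
print(f"with mipmaps: {with_mips / 2**20:.0f} MB")  # ~85 MB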


HDFilms is working with Exigent,
former game developer Paul Steeds company, to re-create the original character
models designed by Pardee for use in the
Unreal engine. Paul Steed is well known
for being the key character modeler in top
video game franchises, such as Quake,
says Hall. Taking two-dimensional characters that were not designed for 3D space
has its challenges, and it is important to
have someone building your models who
understands and brings an essential creative component to the mix to help make
it all work.
A good portion of the work is actually
done outside the game engine. The artists
build and animate the characters mainly

OC3 Entertainments FaceFX.


For some action sequences, the group
turned to motion capture, provided by
House of Moves (HOM). The goal was to
shoot everything in one day, but in the end,
an additional day was required for pickups.
The main stage at HOM is equipped with
64 Vicon MX-F40 cameras for capturing
body and basic finger articulation. For the
Chadam project, the team focused solely
on the movements of the characters bodies,
which was done inside the 70x40x22-foot
volume. Later, the data was applied to the
characters with Autodesks MotionBuilder.
Hall says collaboration has been a key
component of this project, even when it
came to outsourcing the mocap work.
Rather than walking in and asking for a

Former founder of game developer Monolith, Jace Hall is trying to bring game techniques to
the Hollywood stage in an attempt to revolutionize and streamline the production process for
television and online projects.

using Autodesks 3ds Max. Because many


of the characters have odd designs and
proportions, keyframing offers the most
suitable animation option. In addition
to 3ds Max, the group also is doing some
keyframing within Unreals Matinee. According to Hall, theres a lot that can be
done in Matinee that helps development
on the fly; for instance, the team is able to
review many of its changes in real time,
rather than waiting for new renders within 3ds Max.
For the facial animation, the artists used
26

April 2009

list of animations, he instead requested


that the HOM team read the Chadam
script to determine which actions would
be handled more efficiently and creatively
through motion capture as opposed to
keyframing.

Within the Chadam World

Digital ToyBox, led by Landon Pascual, is filling in the environments and creating the special effects. The artists use primarily 3ds Max, along with Pixologic's ZBrush and Ryan Clark's CrazyBump, for the environments. Most of the special effects, on the other hand, are handled by the particle system within the Unreal engine.
The lighting, meanwhile, also is handled within Unreal. "We're using a heavy amount of dynamic lighting and high-quality ambient occlusion. There's no need to optimize based on what we're doing. The results are in real time and look beautiful," Hall explains. "What you're going to see will be in-game renders."
The size of the Chadam team is actually a little smaller than those Hall assembled in the early days of Monolith's game development. When creating a game and cinematics for in-game presentation, developers don't have the advantage of controlling exactly what the player will see at all times because of the interactivity. Yet, they have to live within the game-specific environments because [the scenes] will run from a level to an in-game cinematic. In contrast, HDFilms operates much more like a film studio. There doesn't have to be a back to the buildings in the world of Chadam. The crew has the luxury of knowing it can go back and edit these UE3-rendered scenes and switch back and forth from camera angle to camera angle with a nonlinear editing system.
All the camera work is handled in Unreal through Matinee. Once the scenes are set up, the renders are dumped as single images and composited in Adobe's Premiere.
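Dumping a scene as numbered single frames brings some unglamorous housekeeping with it, such as making sure no frames are missing before the sequence goes to editorial. The short Python sketch below illustrates that kind of check; the file-naming pattern and folder are hypothetical and have nothing to do with HDFilms' actual conventions.

import os
import re

def missing_frames(folder, pattern=r"shot010_(\d{4})\.png$"):
    """Report gaps in a numbered image sequence (the naming scheme here is made up)."""
    frames = []
    for name in os.listdir(folder):
        m = re.search(pattern, name)
        if m:
            frames.append(int(m.group(1)))
    if not frames:
        return []
    frames.sort()
    expected = set(range(frames[0], frames[-1] + 1))
    return sorted(expected - set(frames))

# Example usage (hypothetical path):
# print(missing_frames("renders/ep01_sc04"))   # e.g. [1042, 1043] if two frames failed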
"We're designing our show around whatever the limitations are for game technology," explains Hall. "That makes the size of the team manageable because we don't need 50 people or an army of programmers. When you're doing a game, you're trying to make something that surpasses what's previously been done with the Unreal Engine and distinguish yourself that way. In this case, [we don't believe anyone has] ever done anything like this with the Unreal Engine."
That said, Hall emphasizes that he's not out to create the next Shrek, nor does he have the budget to do so. (Confidentiality agreements prohibit Hall from revealing the sum.) Visually speaking, there's a certain level of realism in Chadam, a look that is in step with the story: Chadam, whose strong imagination can physically alter his environment, lives in the hyper-stylized, metropolitan island city of Vulture. All the imagery is being captured in high definition at 1920x1080. Hall reasons that the Web series should be an improvement over traditional television content done in CG because there will be more content generated with HDFilms' game-development process than a traditional production at the same budget.

Artists are using primarily 3ds Max to model and animate the characters, and Max and ZBrush are the tools of choice for the backgrounds.
Hall offers this analogy to support his point: "If I only give you $10 for gas, you will go much farther in a Prius than a Ferrari. So, when you are trying to get the most distance with limited money, you would use a Prius because it processes fuel differently," he explains. "If you had as much gas as you need no matter what, or if you need to go fast, well then, you would use the Ferrari."
Even so, Hall acknowledges that in terms of his analogy, a Prius is not supposed to replace a Ferrari. "We are living with a controlled budget and building a machine that can get the most distance while still looking great. We are not trying to negate the need for [traditional 3D software] or traditional 3D productions."

Some Bumps

Like with any project, the Chadam crew has encountered its share of bumps along the road. One of the major limitations was not due to a lack of realism, contends Hall. Rather, it was the number of sets and the total number of animations being made.
"With any 3D animated project, artists essentially have an infinite amount of resolution in terms of how detailed their animations, textures, and models are going to be," explains Hall. "Part of the genius of what we're trying to do is control the overarching budget. The trick is figuring out what you can accomplish with a fixed budget, and how compelling you can make that content. It's hard because it is a subjective scale. For every piece, whether it's the special effects, the models, or the level of detail in the animation, they can go on forever."
As of press time, Hall and his team were still tweaking and learning as they fine-tuned their creation and the new pipeline. WBTVG has given the project plenty of time to gestate in the hope of getting it right the first time.

An Unreal Editor

Selection of UE3 was an economical move from the beginning. One of the reasons Hall chose to buy the UE3 license was because Epic Games releases its complete UE3 tool set to gamers through PC titles like Gears of War and Unreal Tournament 3. So, in effect, he got the kit for free. Hall then partnered with a graduate from The Art Institute who had a UE3 background to lead this project and assemble a team, thus further controlling costs.
"You have this tool set and all these people who know how to use it," says Hall. "Unreal has become pervasive. It turns out you only need a few experts in terms of payroll, and they can guide others who are newer to this, and you can create something like Chadam."
Epic Games designed UE3 this way, and the key tool that has opened a door once reserved for programmers is Matinee, which HDFilms is using extensively for keyframing and camera work. This software within UE3 also gives artists control over lights and particles, and it offers the ability to edit between shots.
"Anyone who has worked on any type of software in Hollywood, whether it is Adobe After Effects or Premiere, Apple Final Cut Pro, or Avid Media Composer, can easily jump in and see how Matinee works," says Greg Mitchell, digital cinematographer at Epic Games. "Hollywood is used to working in a system whereby scenes are rendered out shot by shot. In one particular film scene, there can be 20 shots numbered individually. UE3 has the director track in it, which allows you to plot out the entire scene in the engine and then actually cut your cameras from shot to shot. You can have an entire scene in one file and then maintain it in the one UE3 scene, instead of having to break it up into small scenes."
Although obviously biased, it is Mitchell's opinion that the traditional Hollywood method of progressing from a moving storyboard or animatic phase to a final master offers a road map with little flexibility. With Matinee, creators can look at things with extra camera shots and then lock down what they want, as opposed to having it planned out from the beginning. It's this on-the-fly nature that has allowed Chadam to progress so smoothly, even if this isn't exactly Hollywood, yet.
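Mitchell's description of the director track can be pictured as a simple timeline of camera cuts. The sketch below is plain Python, not UE3's actual data format or API; it only illustrates the idea of keeping an entire scene in one place and resolving which camera is live at any given frame. The camera names and frame numbers are invented.

# Illustrative sketch (not UE3's data format): a "director track" as an
# ordered list of cuts, each naming the camera that is live from a given frame.
cuts = [
    (0,   "cam_wide"),
    (96,  "cam_closeup_chadam"),
    (180, "cam_over_shoulder"),
    (260, "cam_wide"),
]

def active_camera(frame):
    """Return the camera that the director track selects at this frame."""
    current = cuts[0][1]
    for start, camera in cuts:
        if frame < start:
            break
        current = camera
    return current

print(active_camera(0))    # cam_wide
print(active_camera(200))  # cam_over_shoulder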
"As we've seen in game development, when you give artists tools that allow them to make modifications instantly and see results instantly, they have an entirely higher level of productivity," claims Tim Sweeney, founder of Epic Games and creator of the Unreal Engine technology. "Even if our tools are simpler and less powerful than [traditional modeling and animation tools], they allow artists to do cool stuff faster. I think in many cases that's better."
Sweeney continues: "We're about to see a complete inversion of that marketplace, where real-time game tools take over. With artists' time being so valuable, they don't want to wait around for a day for previsualization of their scenes; they want to see it right now. If they can see it right now, they don't need as many special effects as they would have had in the offline render."
Nevertheless, even HDFilms did not completely go real time for Chadam, and the artists continue to use traditional 3D modeling and animation software for the core work. Rather, the gaming tools are supplementing these packages.

A Virtual Backlot

That does not negate any of the advantages of HDFilms' new system, including the ability to create a virtual backlot of character models, environments, and animations that can be fine-tuned and ported to television or video games in the future. The fact that Chadam was created inside a video game engine makes the transition to interactive entertainment natural. According to Hall, all the money that is being spent creating these assets translates almost directly into a video game production.
"You come out ahead of the game producing something like this and dovetailing into a game, and vice versa," contends Hall. "If you create a game, you can go into the kind of production we're doing with Chadam," Hall says, but notes that you'll have to make some changes. "In games, you're living with certain memory constraints concerning texture sizes and the number of animation types, and we're not," he explains. The artists can load up a scene with huge textures, knowing that's the only place viewers are going to see, and the PC is not going to have to render a whole city behind it.
Also, the group can be extremely detailed and specific in its animations; they know they are just going to use the animation for that scene. "You have to run a balance in a game," Hall adds. "However, there's no question that it's a huge advantage moving laterally."

UE3's Matinee enables the artists to review changes in real time. The group also used the tool for some keyframing, camera work, lighting, and particles.

Hall certainly hopes Chadam finds an online audience, whether broadband or wireless, and hopes to bring these characters to both television and video games in the future, as well. But the ultimate goal with this project, outside of entertaining an online audience, is to prove that UE3 can be used for much more than creating blockbuster games.
"Our goal is to set up a way of producing these kinds of animations that are incredibly cost-effective and offer opportunities for beginning to intermediate artists to get involved and produce something that's great under the supervision of an advanced artist," explains Hall. "Essentially, we'll have created a cost-effective way to create a type of 3D animation."
Hall continues: "Although you can do cost-effective animation with [traditional DCC software], the issue is how it looks at a budget of, let's say, $100,000. If you do the same thing using Unreal and our process, I believe our version will almost always contain more content under that compressed budget. We're not trying to replace [those other tools]; it's just a different avenue for a controlled budget to produce these things for the Internet or television."
So far, Hall has had an open informational policy regarding this process, but he cautions that it may turn out that everything his team has learned becomes so significant that he won't want to give away the golden goose. Even if he does decide to share this new CG process in more detail, it will be a while before he can actually articulate this to others. The team is still learning, not just with Chadam, but with its second test case. Hall believes that Chadam will open Hollywood's eyes to what Unreal can do. And he's happy to be leading the new wave of CG storytelling using this game engine technology.

John Gaudiosi has been covering the video game business for more than 15 years for outlets such as The Washington Post, Wired Magazine, Reuters, and AOL Games. He can be reached at JGaudiosi@aol.com.

Digital artist Maurice Benayoun explores the new media world
By Barbara Robertson

When Paris-based artist Maurice Benayoun created the animated TV series Quarxs with Belgian comic-book artist François Schuiten in 1990, people considered the use of CG animation as new media: The HD-resolution series, which starred CG creatures that bent the laws of physics, biology, and optics to explain the world's imperfections, pre-dated even ReBoot. In the years since, the award-winning artist's exploration into digital media has bent artistic perceptions beyond the world of animated films as he's moved through virtual reality, on into augmented reality, and out the other side.
In January 2008, the European School of Visual Arts (ESI) in Poitiers, France, organized a 15-year retrospective exhibition of 10 Benayoun installations, including his famous 1995 "Tunnel under the Atlantic" and the award-winning, cave-based "World Skin, A Photo Safari in the Land of War," as well as the more recent "Emotion Vending Machine," co-produced with the ESI in 2006.
With Tunnel, as participants in Paris's Pompidou Centre and Montreal's Museum of Contemporary Art watched images, proprietary software developed by Z-A, the production studio Benayoun cofounded in 1985, continuously selected new images based on such criteria as how long participants looked at image areas. Tunnelers met when their images matched.
For World Skin, Benayoun put people wearing 3D glasses inside a VR cave, where they focused cameras on sliding layers of Bosnian War photos. Each camera click blanked out a photographed area. Now, for a series of installations called the Mechanics of Emotions, he's taking snapshots of the world's emotional state by using data from the Web. For example, in 2006, his Emotion Vending Machine let people create downloadable cocktails of images and sounds from emotions captured in real time via the Internet.

For Still Moving, an installation at the Grand Palais, Benayoun used Internet data and rapid-prototyping tools to sculpt an emotional map of the world. Visitors touching the deflated globe feel vibrations from unheard music mapped to the emotions by composer Jean-Baptiste Barrière.

Artist Maurice Benayoun's design for a new permanent exhibit inside Paris's Arc de Triomphe includes high-definition displays with constantly changing images placed at an oblique angle within the space.

In 2007, with help from his longtime collaborator, composer Jean-Baptiste Barrière, Benayoun turned a map of world emotions into a music score that played at the Palazzo Strozzi in Italy. Last December, he installed Still Moving, a large, interactive sculpture of emotions, in the main entrance of Paris's Grand Palais.
We caught up with Benayoun in his home/office/studio in Paris's densely populated and trendy 11th arrondissement, where he lives with his wife and young daughter. He designed the modern structure, which wraps around an airy atrium between buildings, by working with Christophe Girault, an architect he also collaborated with to create a new permanent exhibition at the Arc de Triomphe.
"There was nothing here," Benayoun says of his high-ceilinged live-work space. "There was a restoration of the buildings on either side, and this was a completely open space with no walls, nothing. We created it from scratch."

Rain beats down on the glass ceiling of the atrium. Benayoun makes strong coffee, and we settle on a large, black-leather couch. He looks like a cross between a professor and an avant-garde artist, and indeed he is both, having taught at the University of Paris 1 (Pantheon-Sorbonne) since 1985. The kitchen is to our left, stainless steel and modern. In front of us, floor-to-ceiling bookcases line a wall behind a long glass table, and a staircase leads to the family's private living quarters. Behind us, a large area overflows with computers, printers, files, and storage cabinets.
"In the 17th century, playwrights told stories about the world as an illusion," Benayoun says. "Even movies from the 1950s were about the world being exactly what comes out of your mind. With The Matrix, the idea was that fiction is becoming our reality. But now, the real world is becoming fiction. That's different."
Benayoun points to reality shows and television news by way of example, noting that CNN presented the war in Iraq using titles created with 3D graphics that looked like the receding text at the beginning of Star Wars. "Once you're in a fiction," he says, "world-scale events touch you only like a movie. The separation between reality and fiction is not clear; it's just another story. What I am trying to do is not create illusion. Not make people think reality and fiction are the same, but awaken people."
Take, for example, the permanent exhibit Benayoun designed for a museum inside the Arc de Triomphe, which won a competition sponsored by the National Bureau of Monuments in France. "Can you imagine," he laughs. "I thought, if I do something in the Arc de Triomphe, I will be considered as an official artist, you know. So, I accepted only if I could say the things I wanted to say."

Benayoun's red IDWorms, installed during Shanghai's eArts Festival in 2008, captured participants' faces and converted them into barcodes that, together, formed an ever-expanding, huge image of barcodes dynamically extruded at the far edge to produce a constantly growing 3D city skyline.

Paris Plus

Napoleon ordered the Arc de Triomphe built after his victory over Russian and Austrian forces in 1805, and the massive structure, finished in 1836, has become a symbol of war and a tribute to fallen soldiers. "But, Napoleon said his war would be the last one," Benayoun says, "and on the Arc de Triomphe, there is a sculpture called Peace." So Benayoun titled his design Between War and Peace.
Visitors reach the museum tucked into the arch by climbing 284 steps inside one leg of the monument. Until the 2.2 million Euro renovation led by Benayoun, that museum had displayed the same collection of dusty memorabilia in a dark and dingy room since 1930. Now, visitors encounter 13 high-definition plasma screens that display historical images. The screens, which Benayoun had mounted on 65-inch-tall heavy, square, steel platforms set in straight lines, are positioned within each platform to create an oblique line that slashes through the space to break and connect parallels, virtual and real. Long, yellow benches line the walls.
"[The officials] asked me why the furniture is not blue and red, because of the French flag, of course," Benayoun says. "I wanted to answer, because not blue and not red. But I say, We need some sun inside."
However, at each end of the room stand two monitors that, at first glance, look like typical museum exhibits. But on one we can see red-tinted images; the other displays blue-tinted images. The red monitor shows images only of the monument's wartime uses; on the blue, images of the monument used in peacetime flicker past. In front of each monitor is a joystick mounted on a platform. Museum visitors can pick a joystick and control the display speed of the images. They can compete between war and peace. If they stop the display to focus on one picture, the tint disappears.


People visiting Centropolis, Benayoun's installation for France's year in China, saw a composite world city painted by 12 viewers looking into custom-designed, augmented-reality binoculars.

Benayoun laughs, "[The officials] saw the monitors and said, Oh, good. This is the red and blue. And I said, Yes. You understand everything."
Also inside the museum space is a small brass replica of the Arc with all the details of its sculptures created to scale. The brass replica is an input device. "People can feel the sculptures," Benayoun says. "As they move [the replica], they control a life-size video projection of the sculptures."
The most radical part of the exhibition, though, will be a few steps up: AR binocular telescopes installed on the roof. From the Arc's rooftop, visitors have a magnificent, 360-degree view of Paris. By looking into the telescopes, they will see an augmented version of that reality.
"We have a complete 3D model of Paris that we can integrate with the HD video from a camera inside the telescope," Benayoun explains. He expects to install the first applications in May.
"At the beginning, they will see something simple, like words printed on monuments," Benayoun says. "And, if they keep looking at the monument, they'll see more information. Another use will be to see the past, to see how an area looked one or two centuries ago, using 3D models that are fully integrated inside the view." With the 3D model, they can also increase the virtual light in the city for those looking through the telescope at night.
But Benayoun's vision for the telescope reaches far beyond historical data and architectural applications. Imagine, he says, that the weather in the sky over the city that you see through the telescope isn't related to actual weather, but to the emotional state of people in the city. "We could have people in districts say how they feel today on their cell phones and on the Web, and that would impact the kinds of clouds that form in the sky over their districts. We want to think more about living together and sharing; to learn more about how we can live together. So, we are creating new software and thinking about many uses."
"We" is a group of students, artists, and researchers working within a university center called CITU, cofounded in 2004 by Benayoun and Jean-Pierre Balpe, a professor at the University of Paris 8, sometime after Benayoun closed his production studio. Funded by the European Union, the Culture and Communication ministry (DRAC Ile de France), and the Ile de France region, the multi-disciplinary group blends artistic creation and scientific research in new media fields. Benayoun serves as the art director.

"I was obliged to have a company to do Quarxs and my VR installations," Benayoun says. "But now I can do all those things through CITU."
Benayoun first put CITU's AR telescopes to work in Cosmopolis, a giant immersive installation created in 2005 for the French Year in China and installed in Shanghai, ChongQing, Chengdu, and Beijing. Twelve four-by-three-meter screens hanging from a circular structure displayed video of constantly changing urban environments in 12 cities: Paris, Berlin, Barcelona, Chicago, Johannesburg, Cairo, Sao Paulo, Beijing, Shanghai, Chongqing, Chengdu, and Hong Kong. Twelve AR telescopes pointed at the screens ringed the outside of the circle; benches inside the circle allowed visitors to view the other side of the screens.
When participants looked through the telescopes, though, they saw five panoramas of one city. A custom system tracked their viewpoint, and custom software collected and combined the digital images to create one virtual city from all the intersecting gazes. Visitors inside the circle saw that assembled panorama.
"People painted a world city by mixing 12 different cities from around the world," Benayoun explains. "They painted with their eyes. If they looked very fast, the image was transparent. If they looked longer, they painted details."
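CITU's software is custom and unpublished, but the dwell-time idea is easy to picture: the longer a viewer's gaze rests on a region, the more opaque that region's contribution to the shared panorama becomes. A toy numpy sketch of that accumulation, with made-up image dimensions, radii, and rates, might look like this:

import numpy as np

H, W = 90, 160
opacity = np.zeros((H, W))   # how "painted in" each pixel of the shared panorama is

def register_gaze(cy, cx, seconds, radius=8, rate=0.5):
    """Accumulate opacity where a viewer's gaze dwells; longer looks paint more detail."""
    y, x = np.ogrid[:H, :W]
    mask = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    opacity[mask] = np.clip(opacity[mask] + rate * seconds, 0.0, 1.0)

register_gaze(40, 60, seconds=0.3)   # a quick glance stays mostly transparent
register_gaze(40, 60, seconds=3.0)   # a long look paints the area in fully
print(opacity[40, 60])               # 1.0 after the longer dwell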

Mapping Global Emotions

Benayoun's Mechanics of Emotions series, on the other hand, considers people's emotional view of the world by analyzing data on the Internet. His idea is that we could think of the Internet as a world nervous system, one that knows the planet's pain and pleasure. So, again using custom software, he queries the English-language data for specific words: fear, perhaps, or joy.

Benayoun created a tag cloud of hits on the Internet for the word "fear" to create an emotional world map that he projected onto a helium balloon. He titled the exhibition Sfear.

At first, Benayoun created a tag cloud with the result, overlaying text in various sizes on a map of the globe based on the number of hits. Then, he began creating terrain maps based on the number of hits for a particular word. One city's 630 hits on the word "nervous" would produce a little bump in the map; another city might build a mountain with 15,700 hits.
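Benayoun's mapping software is likewise custom, but the underlying step, turning per-city hit counts into relief heights, is simple to sketch. The snippet below reuses the two figures quoted above; the city labels, the ceiling value, and the logarithmic scaling are illustrative assumptions, not details from the artist.

import math

# Hit counts for the word "nervous", per city. The 630 and 15,700 figures come
# from the article; the labels, ceiling, and log scaling are made up here.
hits = {"city_a": 630, "city_b": 15_700}

def relief_height(count, max_height=10.0, ceiling=20_000):
    """Map a hit count to a bump height on the emotional terrain map."""
    return max_height * math.log1p(count) / math.log1p(ceiling)

for city, count in hits.items():
    print(f"{city}: {relief_height(count):.2f} units high")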
"I'm not trying to do something scientific," Benayoun says, walking to a wide filing cabinet with narrow drawers. "I'm just trying to see if I get a significant response. Then I create frozen feelings that are sculptures." He pulls open a drawer, and a tray with platters of emotions slides out from the cabinet. The platters look like thick clay plates with bumps. "These are dead emotions, of course," the artist says, "because they can't move." For his first exhibition with the dead emotions, Benayoun projected the relief map for fear onto large, transparent spheres hanging from the ceiling of a church in Italy, and below the sphere, presented platters of frozen feelings like religious relics.
In December, Benayoun gave the frozen emotions a little life with Still Moving, an interactive installation at the Grand Palais that he plans to install in other cities, as well. For this, he created a giant sculpture with three emotions in layers (nervous, anxious, excited), using blue, yellow, and red to represent each emotion. To create the thick, multilevel, flattened sphere, Benayoun and CITU used Stratoconception, an innovative rapid-prototyping method for creating large parts from layers of materials, developed by CIRTES, the European Center for Rapid Prototyping.
When people approach the sculpture, which is three and a half meters in diameter, they can feel a vibration that increases when they touch it. "They can feel music, which is also made with the level of emotions, with their bodies," Benayoun says. "They cannot hear it." City names appear on areas they touch.
Still Moving is one of 14 Mechanics of Emotions projects Benayoun has imagined, only some of which he's already implemented. "I have many new projects relating to emotions," he says. "Some are close to completion. But, it's always difficult to say too much until they are completed."

The Dump

It is not difficult at all, though, for Benayoun to talk about an ongoing project that he calls The Dump. It is a blog (www.thedump.net), a collection of his ideas. Hundreds of ideas. "Everyone can come and take them and do them if they want," he says.
One concept, for example, is that people might live rent-free in apartments wallpapered with catalog pages. Each month, the catalog company would replace the wallpaper with new pages of products appropriate for each room.
Another idea would help critics define contemporary art. "I tried to understand what it is that makes people say something is art," Benayoun explains with characteristic sardonic humor. "I think it has nothing to do with content, so I thought, maybe it's the smell. And, I found the smell. It's the smell of the white paint used for the wall because the galleries are newly painted for each exhibition. So I propose to create a new paint perfume that artists could dab onto their works. Then, the critics and journalists and curators would say, This is contemporary art."
Benayoun submitted The Dump as his PhD thesis and became, arguably, the first artist to use a blog to qualify for a doctorate degree. "I think The Dump is one of the most exciting things I'm doing," he says. "It's a way to go deeper and deeper inside to what art could be, what innovation could be, and what any action should be. You know, I can't think about art without thinking about the world. It's not about political correctness. So why should I decide to do something? Very often you only know what artists like to do because you see the results. You never know what they decided not to do, or couldn't do, or didn't know how to do. So, The Dump is also about that. It's about artistic intention."
Of all his work, The Dump is the simplest form of digital art, but also the most accessible and most interactive. It is neither virtual nor augmented reality, or perhaps it's both, but with it, Benayoun might have created the newest of all new media. At least, for now.

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.

If 18 months ago you would have told David Burgess that he would consider a 3D computer graphics tool brilliant and able to inspire incredible creativity, he might have laughed in your face. Burgess, an internationally known automotive photographer, is no Luddite. He was among the first wave of professional photographers to go digital, and calls that movement a revelation, offering possibilities unheard of in film. Until recently, though, Burgess had resisted the marriage of photography and 3D CGI, and, according to him, for good reasons.
"I explored CGI, looking into [Autodesk's] Maya and other programs, but found them to be nonintuitive and uncreative," says Burgess. "[CG] was the domain of technicians. I didn't want to do the equivalent of going back to a university in order to learn one of those programs."
Burgess's opinions changed approximately a year ago when he was working on a project for Ford Motor Company. Jennifer Flake, the company's director of global brand imaging and design for public affairs, wanted to show the media the Ford Inceptor concept car in a real-world environment. The only obstacle was that the car had not been built yet, a situation Burgess was unaware of when he arrived for the photo shoot in Las Vegas.
Burgess and his assistant shot backgrounds at a neon-sign graveyard outside the city. Then, using a spherical camera, they shot high-dynamic-range (HDR) images, which provided a 360-degree panorama of the location with complete lighting data. The duo brought the photos and data back to the hotel, where Burgess, using a rendering program from Bunkspeed, merged CAD models of the concept car with his just-captured digital photography files.
Within hours, Burgess had created five complete images, with the car fully integrated into the scenery, reflecting the surrounding environment and taking advantage of the captured light from the HDR images to generate highlights on the car and natural shadows. "It was staggering that I could do this so quickly with software I had never used before," says Burgess. "I had the freedom to exercise creativity and pursue my own style without compromising quality. I was able to render the images in hours on a laptop, something that might have ordinarily taken days using a renderfarm."

Automotive photographer David Burgess used Bunkspeed's HyperShot software to place a CAD model of the Ford Explorer America concept car in US location shots.

Fresh Look at CGI

According to Burgess and fellow automotive photographers Michael Lee, Nigel Harniman, and Vic Huber, the Bunkspeed software, called HyperShot, warranted a fresh look at how photography can be used with CGI.
Unlike traditional programs, originally intended for creating images and animations from scratch, HyperShot is designed exclusively for working with photos and HDR data. After shooting the background plate and HDR photos and loading them into HyperShot, the user imports a 3D car model from any standard 3D DCC software, including native data from popular CAD programs. Next, the person applies materials, color, and surface textures using a palette within the software. Then, he or she chooses an HDR map to provide the lighting, adds the backplate image, and adjusts the virtual camera for lens type, angle, rotation, and distance.
"I had spent a lot of time in the past looking to add CGI capabilities through Maya and 3ds Max," says Lee, who has done advertising work for Honda and Mercedes-Benz, among others. "It took a lot of time to modify materials, and a great deal of expertise to get the quality I needed. Even with hardware acceleration using 32 processors, rendering a single image typically took as long as 12 hours."
"[Now I can] quickly get to the fun part of the process for any photographer: lighting," Lee adds. "Photographers are often called upon to consult on lighting, so it's great to have this much control over that aspect. The rendering is optimized, so what took me 12 hours with 32 processors can be rendered in two hours at the same or greater quality on a Mac quad-core system."
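Taken at face value, Lee's before-and-after figures imply a large jump in per-core throughput, not just a shorter wall-clock time. The two-line check below assumes every processor was fully busy in both setups, which the article does not state.

# Lee's figures, assuming full utilization of all cores in both cases.
old_core_hours = 12 * 32   # 12 hours on 32 processors = 384 core-hours per image
new_core_hours = 2 * 4     # 2 hours on a quad-core Mac = 8 core-hours per image

print(old_core_hours / new_core_hours)   # ~48x more work done per core-hour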

Interactive Photorealism

HyperShot's architecture was designed from the start to take advantage of new technology and hardware developments. The engineers started from a clean slate, aided by Henrik Wann Jensen, Bunkspeed's chief scientist, who is known for his photon-mapping algorithms that make it possible to realistically simulate effects, such as caustics, diffuse interreflection, and natural phenomena, including smoke and fire.

Burgess used CGI to insert this Maserati GT Lisbon into a photograph of a racetrack.

"Our goal from the beginning was to create tools that would extend the ability of photographers to communicate the passion, vision, and emotional beauty of great automotive design," says Thomas Teger, Bunkspeed's director of marketing and strategic planning. "Having easy access to great materials and lighting, and the ability to quickly see the results of your work, enables faster, more insightful decisions that aid creativity."
Is the Time Right?

After years of false starts and retrofitted solutions, is the time finally right for the marriage between photography and 3D computer graphics? Obviously Teger thinks so, but so do Burgess and Lee.
Photographers see for the first time that there is little or no compromise associated with a plunge into CGI waters. The software mimics the way photographers work, allowing them to exercise and extend the signature styles they have developed over the years.
"You can create pictures that you couldn't conceive of before," says Burgess. "You can put the car in any position in almost any type of environment and make it look real."
On the practical side, Burgess is able to avoid some of the problems associated with retouching. "There are [unsightly] floors on almost any shoot staged in a studio. When you retouch the floors, you bleed the living daylights out of reflection and shadow details. Now, there's much less retouching. The paint is proper, with the right metallic effects, and there is no compromise with shadows and lighting."
For Lee, this takes away some of the pressure of having to capture everything during the photo session. "[I no longer have] to do everything behind the camera. Traditionally, you can only be creative when shooting. After that, you are limited by what you can do in [Adobe's] Photoshop," Lee says. "[Now I have] the freedom to try something different. If you have a good CAD model with a lot of detail, you're only limited by how creative you want to go."

Obstacles

As with any new form of technology, there are perceived and real obstacles to widespread adoption. "The photography business is shifting, and like any changing environment, people will have to adapt," Teger adds. "Traditionally, everything was outsourced from the manufacturer or the agency to different entities: the photographer, the CG house, the retoucher. Now, the photographer can do everything: take the pictures on location, insert the virtual object into the scene, and fine-tune lighting, colors, and materials. Rather than a step-by-step process with many players in a chain, it's one-stop shopping."
Burgess and Lee see parallels between the current situation with CGI and the obstacles faced by digital photography just six or seven years ago. "Initially, clients freaked out," says Lee. "You just have to keep proving that it is a better solution for the client."
In addition, there is the bedrock issue of obtaining good CAD models. "Companies can take 3D CAD models that have been engineered for manufacturing and apply them to advertising, promotion, and PR," says Teger. "However, it takes communication between technical and marketing people to make it happen."
Burgess and Lee see two obstacles when it comes to obtaining quality CAD data: Auto manufacturers are often nervous about giving out proprietary design data, and the details required to generate a great image are sometimes not in the models supplied to photographers. These aren't showstoppers, however, and should dissolve once there is a better understanding of the new process. Similar issues faced CAD itself in its early years, until auto manufacturers realized the time and cost savings of using a 3D model to drive the product development cycle.
"If the tool can help photographers do considerably more in less time, it is worth it to them," believes Teger.
A major consideration for Lee is that there is no need for proprietary hardware. "With some other systems, it's like buying a car that becomes dated once it leaves the lot," he says. "I don't want to have a big investment in hardware that I can only use occasionally for rendering."
For photographers such as Burgess and Lee, the question is not when this marriage of photography and CGI will take place, but when everyone else will realize that it's already happened.
If the likes of Burgess and Lee are correct, five years from now we'll look back and see photography mixed with CGI as natural and inevitable, similar to the way we see Google, the iPod, and the Prius today.

Bob Cramblitt writes about technology developments that fundamentally change the way we work and live. He can be reached at info@cramco.com.


James Thurber, a 20th-century writer/cartoonist from Columbus, Ohio, was known for his short stories, cartoons, and essays focused on people and animals. He is especially known for his amusing dogma: The humorist identified human traits in dogs, and brought them to light in many of his drawings and musings. Known as Thurber's Dogs, they, along with their master, became a national comic institution.
Recently, a team of graduate students at Ohio State University's Advanced Computing Center for the Arts and Design (ACCAD) re-created these old dogs, originally crafted through simple line drawings, by using new tricks: computer graphics technology. First, the nine students modeled the dogs using 3D software, and then brought them to animated life during an orchestra performance. But this was no ordinary musical selection: It was an encore performance of composer Peter Schickele's Thurber's Dogs: Suite for Orchestra, which had been commissioned a few years earlier by the Columbus ProMusica Chamber Orchestra in partnership with Thurber House, a literary center and museum that had been the writer's former home.
In all, the students animated six sequences, based on Thurber's famous dog drawings and stories, that accompanied six musical movements from the suite. The animations, which are between two and five minutes in length, were projected onto a large screen above the orchestra. When the suite debuted in 1994, slides of the original dog drawings were used. But for the recent encore to celebrate anniversaries of both ProMusica and Thurber House, the two groups were looking for more bite in the visual accompaniment, so they approached ACCAD.
"They asked if we would be interested in coming up with animations and stories based on Thurber's drawings and the musical piece, to bring Thurber's drawings to life," recalls ACCAD director Maria Palazzi. "It seemed like a perfect collaboration for our research center." ACCAD, which focuses on the study of CG across multiple disciplines, offers hands-on collaborative opportunities for scientists and artists to work on various projects such as this (see "Extracurricular Activities," pg. 45).
"Our goal was to create an aesthetically interesting experience for traditional classical music lovers," says ACCAD animation specialist Vita Berezina-Blackburn.

Animal Research

Setting up this project as it would any real-life production, ACCAD selected a team based on the students' individual expertise. "It was a great project for art and design students," Palazzi says. A lot of the initial work occurred during independent studies classes, where the students had the opportunity to incorporate personal research into the production. In the quarter prior to the show, the students worked together as a group, during class hours and beyond.
Under the direction of Berezina-Blackburn, the students sunk their teeth into the work, starting with research, as many of the students were unfamiliar with Thurber's work. The group formed last spring and spent numerous hours reviewing Thurber's drawings and talking to the folks who run Thurber House, getting their take on the intention and meaning behind the humorist's cartoons. "They were committed to creating a 3D animation that reflected the spirit of Thurber's drawings and the techniques he used," says Palazzi.

Graduate students at Ohio State University's ACCAD re-created writer/cartoonist James Thurber's line drawings using 3D tools, and then animated the imagery as a visual accompaniment during a special orchestra performance of a suite composed in honor of Thurber.

One student, Beth Albright, who now works at Pixar, says everyone in the group looked at all the Thurber dog drawings they could get their hands on, and read the related stories, too. "Although the music was inspired by specific drawings and not specific writings, the writings did give us story ideas that we could weave in," she says.
For the most part, though, the research focused on different ways and angles to draw the dogs, and "we found out very quickly that if you line [the dogs] up, they weren't the same," says Albright. "A Thurber dog has a specific look. It has specific characteristics but many other features, as well. So we designed our Thurber dogs to be true to all the references but to fit within our specifications, so they could be animated in the way we wanted to animate them."
Of the eight grad students who worked on the project, two of the students did the storyboarding and concept development. The others chose specific roles, as well, such as rigger, animator, or VFX artist, thereby assuring continuity in the style of movement throughout the segments.

Animal Development

Although the six movements of the suite would run approximately 20 minutes, the group initially planned to complete just one minute of animation for each movement, essentially adding visual "paws" and then "pause." However, as these artists became engrossed in their work, they produced more and more CG, to where animation played throughout most of the musical piece. "Some segments at the beginning of each music movement are left blank, and then the animation starts about 15 to 20 seconds into the movement," describes Berezina-Blackburn.
Albright, along with another student, storyboarded each of the six segments; later, each assumed responsibility for three of the boards and created animatics using Adobe's Photoshop and After Effects, and Apple's Final Cut Pro. Next came the modeling and rigging, which was done in Autodesk's Maya. The crew then used After Effects for the final rigging and postprocessing.
The team was already on the production for nearly two quarters when grad student Iuri Lioi joined them. Lioi, who graduates this year and has a position waiting for him at DreamWorks, modeled, rigged, and animated the adult dog. He spent 10 weeks on the production: two weeks modeling and the rest animating.
The project presented both technical and conceptual challenges. According to Lioi, the rig had to be flexible enough to support the changing proportion of the dog's front and back legs throughout the piece, "to make it look like a Thurber dog illustration being animated," he adds. The complex rig, built in Maya, supports FK and IK, and contains three layers of bones for each of the two main dog characters.
"We were trying to reproduce Thurber's style, which uses 2D lines and is organic, only we were doing it in three dimensions," says Lioi. "That was challenging."

Leashed Creativity

As Lioi explains, artistically, the students had to ask themselves, How would Thurber have animated his own drawings and illustrations? Then the group had to animate the art based on Thurber's style, not their own.
This was easier said than done. The group started with a source that was nontechnical: 2D and, in some cases, 1D drawings that are cartoony. "His is not a subject you would look at and immediately think you should do it in 3D," points out Albright. "For us, the challenge was using the tools we had available that allow us to do many things, yet remain true to the charm and the particular look of Thurber's work. It's very specific. People can identify his drawing style."
Palazzi agrees with the
students assessment. In the
beginning, the art looked
deceptively simple because
Thurbers drawings are deceptively simple, she says.
But once the students started studying the anatomy he
was drawing for the dogs,
they found that the anatomy changed from drawing
to drawing.
[Image: The Thurber dog animations were projected above the ProMusica Chamber Orchestra during the live performance.]
Thurber particularly focused on hounds, Scotties, and poodles. However, when inspecting the dogs closer, the students discovered that in one instance, Thurber had used a part from one dog and another part from a different dog, and combined them into a breed that didn't exist. As a result, the students had to interpret the anatomy of the unique dog and come up with different types of movement based on that anatomy.
At one point in his life, Thurber became blind in one eye, and many of the characters he drew were especially flat, with very little perspective and an orientation that favored one side, notes Berezina-Blackburn. As a result, the ACCAD students' imagery had to reflect those same qualities. Also, the artists had to proceed without orthographic views or maquettes as references.
"They didn't know what was on the other side of the dogs because [Thurber] didn't draw that, but the students had to include [the missing information] in their 3D models," Palazzi says. "Many of the students started out thinking, 'Oh, this shouldn't be too hard,' but once they really studied the drawings, they found the work to be more of a challenge than they had realized."
Another big challenge was simply producing the 20 minutes of animation, which is still laborious, no matter the subject. The models were rendered using the Maya software renderer, which supported Maya Paint Effects, used to create the black cartoon outlines for the subject matter. The students also created custom paintbrushes within Maya Toon shaders. "Technically, Paint Effects and Toon are rich tools, but there were many variables that were difficult to control," says Berezina-Blackburn. "You can get a lot of cool effects easily, but achieving something precise is difficult."
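The story does not document the students' exact shader setup, but the stock Maya toon workflow it references pairs a flat ramp-shader fill with a Paint Effects toon (pfxToon) outline. The sketch below is only an approximation of that default setup: the object and node names are hypothetical, and the pfxToon connection names are recalled from the Maya documentation of that era, so treat them as assumptions to verify.

```python
# Rough sketch of a Maya toon-style look: flat ramp-shader fill plus a
# Paint Effects toon (pfxToon) outline. Names are hypothetical; the pfxToon
# attribute/connection names below are assumptions to check against the docs.
import maya.cmds as cmds

def add_toon_look(mesh_transform='thurberDog_geo'):
    mesh_shape = cmds.listRelatives(mesh_transform, shapes=True, fullPath=True)[0]

    # Flat fill: a ramp shader assigned through its own shading group.
    fill = cmds.shadingNode('rampShader', asShader=True, name='dogFill_rampShader')
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name='dogFill_SG')
    cmds.connectAttr(fill + '.outColor', sg + '.surfaceShader')
    cmds.sets(mesh_transform, edit=True, forceElement=sg)

    # Black outline: a pfxToon stroke reading the mesh. These two connections
    # are the assumed manual equivalent of Toon > Assign Outline.
    toon = cmds.createNode('pfxToon', name='dogOutline_pfxToonShape')
    cmds.connectAttr(mesh_shape + '.outMesh', toon + '.inputSurface[0].surface')
    cmds.connectAttr(mesh_shape + '.worldMatrix[0]',
                     toon + '.inputSurface[0].inputWorldMatrix')
    return fill, toon
```

From a starting point like this, the custom paintbrushes the students built would replace or augment the default toon stroke, which is where, as Berezina-Blackburn notes, the number of variables to control grows quickly.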
The group used After Effects for postprocessing, to give the imagery a hand-drawn feel and a lower frame rate than the traditional 30 frames per second.
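The article only says the After Effects pass lowered the frame rate below the traditional 30 fps to suggest hand-drawn animation. As a rough illustration of the same idea outside After Effects, the script below re-times a rendered image sequence by holding every Nth frame ("on twos" or "on threes"); the directory layout and file-name pattern are made up.

```python
# Re-time a rendered image sequence to "animation on twos/threes" by holding
# every Nth source frame. Paths and the frame-number pattern are hypothetical.
import shutil
from pathlib import Path

def hold_frames(src_dir, dst_dir, pattern='dog.%04d.png',
                frame_range=(1, 600), hold=2):
    """Copy frames so each kept frame is repeated `hold` times (2 = on twos)."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    first, last = frame_range
    for out_frame in range(first, last + 1):
        # Snap the output frame back to the most recent held source frame.
        held = first + ((out_frame - first) // hold) * hold
        shutil.copyfile(src / (pattern % held), dst / (pattern % out_frame))

# Example: a 30 fps render held on twos reads as 15 drawings per second.
# hold_frames('renders/v001', 'renders/v001_twos', hold=2)
```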
"Thurber's drawings are very conceptual rather than physically accurate or photorealistic," says Berezina-Blackburn. "He used lines to express a mood or a state." The team had looked at early hand-drawn animation material to see if they should imitate or replicate a particular style, or come up with their own way to stylize the movements of the dogs. In the end, they decided to reinterpret Thurber's drawings.


"They contain movement and are so rich with gesture that you wouldn't know how he would have approached the movement itself," Berezina-Blackburn says. This project offered students the chance to deal with a unique subject; it is not the typical photorealistic 3D animation process they are used to.

Setting the Tone


Another unique challenge the artists had to face was the music, and in particular animating the dogs to it. As Palazzi points out, each drawing did not necessarily have a full-length story written for it. So, the student artists interpreted a single drawing and came up with a story line for it. "The music allowed us to imagine a story," she adds. In fact, that is what the composer had done when writing the music, though he only used a single drawing for his inspiration.
"We listened to the music and imagined what was happening during the pieces," Palazzi says. "That inspired different types of designs for objects and movement. The animation fused together what Thurber and the composer were expressing."
Conceptually, the group was working with music that would be performed live, so there was no way to perfectly synchronize the animation to the music. "You would think that classical music is performed with great precision, but sometimes the music would fall behind during the performance, and then get ahead, but it always ended perfectly," says Berezina-Blackburn. "So, we were not trying to match a specific Bam! with a visual splash. We had to make a semi-loose structure for the animation."
Had the group been given more time, Berezina-Blackburn says she would have liked to have stylized the dog movements even more, and added more stylized deformation to the body in motion. "Looking at the drawings, when a dog is jumping forward, the limbs are sort of curling backward, doing something not anatomically possible. We didn't quite go there because we didn't have the time to figure it out," she says.


Extracurricular Activities
When Charles Csuri founded the Computer Graphics Research Group (CGRG) at Ohio State University (OSU) in the early 1970s, his goal was to realize the potential of computer animation across a number of disciplines, offering the opportunity for scientists and artists to collaborate on projects. "He believed that both sides (the arts and sciences) needed each other if CG was going to come into being," explains ACCAD director Maria Palazzi.
CGRG eventually evolved into ACCAD, but the spirit of uniting the disciplines remains today. And the cooperative nature of the program extends to the outside, too, as ACCAD offers students a chance to work in a real production environment. With the Thurber project, it became a perfect opportunity to link current OSU students with a famed past student: Thurber.
For former OSU student Beth Albright, working on an outside project such as Thurber's Dogs augmented lessons learned in the classroom. "We had several production-based classes where you are in a class environment working on a class assignment. But a class like this [independent studies class], where you are doing a production with others, really does emulate the work environment, where you work alongside others, have deadlines, restrictions, and specifications, must meet the outside client's needs, and have to find a way to get it all done," she says. "Sometimes you have to give up your personal aesthetic and goals to accomplish the aesthetic and goals of the group and the project as a whole. And that is not a lesson that is usually taught in the classroom."
Albright notes that for Thurber's Dogs, the team had to complete the work in 10 weeks, teaching them the value of a deadline. "You can't not get it done," she adds. "It was going on stage on a certain date."
The project also offered the chance to learn and grow by giving the team a chance to try different roles and new things.
Iuri Lioi, another student, also found the project experience invaluable. He was completing a thesis on the process of creating an animation, from design to final product. "I was studying that on my own and writing the thesis, but for the first time I was able to see and experience the whole process," he adds, "analyzing how big the project was, assigning tasks, coordinating, communicating, organizing; things we never had an opportunity to do in class. We talked about it and studied it, but there was no chance to put those things into practice."
Lioi says he will take the practical knowledge he gained on Thurber's Dogs with him when, this summer, he begins his job at DreamWorks, where he interned last summer.
Karen Moltenbrey

Nonetheless, Berezina-Blackburn found the project fun and different. "We wanted things to be simple, because that is the quality of his work. But with 3D, you have to make it more complicated than it appears," she adds. "One lesson we will take with us is how to streamline our own work. You don't always need to make something function like a real object."
Often, it's the more simple things that are the most treasured. And perhaps that is why Thurber's work was so captivating.
Karen Moltenbrey is the chief editor of Computer Graphics World.

For additional product news and information, visit CGW.com

SOFTWARE
Painting
Painter Update
Corel has unveiled Corel Painter 11, the latest version of its painting and illustration software, which boasts more than 40 new and enhanced features. Painter 11 enables artists to expand their digital tool set with such advanced painting and natural-media utilities as new pressure-sensitive brushes, enhanced drawing tools, and customizable media. Users can create and customize brushes and media variants, as well as tap new artistic media, hard media brushes, and selection tools. Improved color management, the ability to undo brushstrokes and other effects, enhanced brush performance, and increased responsiveness round out Version 11. Now available, Corel Painter 11 is priced at $399 for the full version and $199 for the upgrade.

Corel; www.corel.com, www.painterfactory.com
WIN MAC
[Image credit: artist Ed Steinerts. © 2009 Gardner Denver Nash.]

CAD
Certified for Inventor
Okino Computer Graphics is now shipping software products that have received Autodesk Inventor 2010 Certification. Okino's Autodesk Inventor solution enables users to transfer crack-free geometry, hierarchy, and materials from native disk-based Autodesk Inventor files, or from a running copy of Autodesk Inventor, directly into any Okino data conversion-compliant application. Users can import complete assemblies from a running copy of Autodesk Inventor, or from native .iam and .ipt files on disk, without requiring a local copy of Inventor. The Inventor CAD importer is a component of the Okino CAD/Pack add-on license. The Okino CAD/Pack is priced at $245.
Okino Computer Graphics; www.okino.com

WIN

Modeling
You to Go in 3D
Big Stage Entertainment has opened its proprietary 3D facial modeling system to third parties for integration into video games, virtual worlds, Web sites, mobile applications, and kiosk-based systems. PortableYou enables third parties to offer a consumer experience that features an animated, 3D likeness of a user to enhance everything from entertainment to communications and retail. Customers can create a sophisticated 3D model of their face in less than two minutes using one to three photos from a standard digital camera. The PortableYou system includes APIs, code samples, methods, and reference libraries.
Big Stage Entertainment; www.bigstage.com

Plug-in
Tentacles for TrueSpace
TurboSquid has released Tentacles, a 3D software plug-in for use with Caligari TrueSpace 3D modeling software. Tentacles is an Adobe Flash-based plug-in that infuses 3D applications with the TurboSquid online library and purchasing. Users can browse, compare, and purchase 3D content, such as models, textures, and mocap data, within their 3D programs. For the first time, users of the new version can publish their TrueSpace creations for sale on TurboSquid.com, also from within Tentacles or TrueSpace. Furthermore, users can save media, materials, and models with free online storage, create secure and shareable workspaces, and manage their content library without leaving the 3D application. Tentacles is free and available now as a stand-alone tool or a plug-in for Caligari's TrueSpace, Maxon's Cinema 4D, and Autodesk's 3ds Max, Maya, and Softimage XSI.
TurboSquid; www.turbosquid.com
Caligari; www.caligari.com

WIN

Shaders
Shady Software
Mental Images revealed that an updated beta version of the Mental Mill Artist Edition shader-creation tool is bundled with Nvidia's FX Composer 2.5. The Mental Mill Artist Edition's intuitive graphical interface enables artists, designers, and shader developers to create, test, and maintain shaders with real-time feedback and without any programming. This latest version boasts enhanced GUI navigation, improved preview rendering, new shaders, and user-controlled compiler and export options. Mental Mill and Nvidia FX Composer 2.5 can be used to create shaders for HLSL, Collada FX, and CgFX in DirectX and OpenGL. Nvidia FX Composer 2.5, which includes Mental Mill Artist Edition, is available for download from www.fxcomposer.com.
Mental Images; www.mentalimages.com


Nvidia; www.nvidia.com

WIN



HARDWARE
Virtual Camera
Gamecaster Tools
Craft Camera Tools for GCS3, a software and hardware bundle developed by Craft Animations AB and Gamecaster, enables users to direct real-time virtual camera controls from within Autodesk's 3ds Max and Maya. Craft Camera Tools for GCS3 enables virtual cinematography, eliminating the need for traditional keyframing of virtual cameras in 3D animated scenes. The solution combines Craft Animations' Craft Camera Tools and Gamecaster's patented GCS3 virtual camera control hardware to deliver a new way of directing animation. Users can direct through the GCS3 virtual camera controller's viewfinder, and gain hands-on control over the 3D animation, visual effects, and previsualization.
Craft Animations and Gamecaster also offer the GCS3 FreeCam and GCS3 TripodCam, providing physical control of the virtual camera's pan, tilt, zoom, crane and dolly, walk-cycle, explosion motion, and spline control motion. Craft Director Tools, which includes Craft Camera Tools, Craft Vehicle Tools, and Craft Accessories, is now available for Autodesk's 3ds Max and Maya on Macintosh and Windows platforms, and is priced between $129 and $1,199.

Craft Animations; www.craftanimations.com


Gamecaster; www.gamecaster.com
WIN MAC


Displays
SpectraView-Enabled Workspaces
NEC Display Solutions of America has released its 26-inch MultiSync LCD2690W2-BK-SV and 30-inch LCD3090W-BK-SV widescreen displays, complete with the SpectraViewII color-calibration sensor and software. The two MultiSync LCDs take advantage of SpectraViewII technology to deliver accurate, reliable, and repeatable display calibration and profiling. The new products are designed for users in the graphic design, digital animation, photography, print production, image analysis, and CAD/CAM industries, as well as soft-copy clinical viewing in the medical field. The combined NEC display and SpectraViewII solution features three internal 12-bit look-up tables, automated calibration, multiple calibration sets, calibrated display information and status


validation, monitor locking, and a colorimeter function. The MultiSync LCD2690W2-BK-SV and LCD3090W-BK-SV wide-gamut, widescreen LCDs use ColorComp technology to reduce LCD uniformity errors. The company's SVII-Pro-Kit, which includes a color measurement sensor and SpectraViewII calibration software, is available as an accessory for select NEC LCD displays for $329. SpectraViewII software can be purchased separately for $105. The LCD2690W2-BK-SV and LCD3090W-BK-SV, which are each backed by a four-year parts and labor warranty, are available for $1,449 and $2,449, respectively.

NEC Display Solutions of America; www.necdisplay.com

Graphics Boards
Low-Profile Graphics
Nvidia has unveiled its Quadro NVS 420, a low-profile professional graphics solution designed to maximize display real estate with a small form-factor computer. The new release boasts Nvidia nView display software and support for up to four 30-inch


displays at 2560x1600 resolution, with the goal of increasing users' productivity. In addition, the Quadro NVS 420 professional graphics solution features display gridlines, virtual desktops, and an extended Windows Taskbar. The Quadro NVS 420 GPU offers a large frame buffer, high memory bandwidth, and Nvidia Unified Driver Architecture, ensuring forward and backward compatibility with Nvidia software drivers. The Nvidia Quadro NVS 420 GPU is now available and carries a price tag of $499.

Nvidia; www.nvidia.com
