Author: Barbara Robertson

  • Animation Magazine’s Oscar Watch Part 3 of 4: Surveying the VFX Fireworks

    Alternate realities, science fiction, action adventure, thrillers, children’s stories, graphic novels, mysteries: this year, visual effects crews peppered films in all these genres with images that could not have been created in any other way, opened directors’ imaginations and gave us unforgettable artistry. As for trends? Destruction. Heavy metal. Blue humanoids. Performance capture. Stereo 3-D. Total Immersion. And mechanical suits of armor. Did Iron Man start the trend? This year, we saw humans climb inside and operate robotic metal suits in G.I. Joe: The Rise of Cobra, District 9 and Avatar.

    This month, seven films were singled out by the executive committee for the Academy of Motion Picture Arts and Sciences’ visual effects branch. On January 21, the vfx work of the teams working on these seven titles will be showcased at the annual “bake-off.” The three films with the most votes become Oscar nominees on February 2, then all Academy members can vote for the Oscar winner, announced March 7. Here’s a quick look at the accomplishments and odds of each of these envelope-pushing vfx-laden movies:

    2012

    Studio: Columbia Pictures

    VFX: Alex Lemke FX, CafeFX, Crazy Horse, Digital Domain, Double Negative, Evil Eye, Factory FX, Gradient FX, Hydraulx, Picture Mill, Pixomondo, The Post Office, Scanline, Sony Pictures Imageworks, Uncharted Territory, UPP.

    Fans loved this film’s non-stop action to the tune of $715 million at the box office. Led by Volker Engel, the film is a vfx demo reel of extremely difficult digital simulations. Cities crumble. Yellowstone erupts. Waves smash against the Himalayas. If the job of visual effects is to support the story, then this enormous effort deserves a seat at the bake-off. Director Roland Emmerich simply could not have destroyed the planet like this without visual effects.

    Avatar

    Studio: Twentieth Century Fox

    VFX: BUF, Framestore CFC, Gentle Giant Studios, Giant Studios, Halon Entertainment, Hybride Technologies, Hydraulx, Industrial Light & Magic, Lola Visual Effects, Pixel Liberation Front, Stan Winston Studio, The Third Floor, Weta Digital.

    Director James Cameron’s Lightstorm Entertainment set the stage. Led by multiple Oscar winner Joe Letteri, Weta Digital created it with a little help from their friends. Filmmakers have never before immersed audiences so effectively in an entirely alien and totally digital culture and environment. We believe the 10-foot tall Na’vi people with long tails and cat eyes are real. Plus: Beautiful alien dragons fight grungy military airships. What could be better? You can talk about the other vfx candidates until you’re blue in the face, but this one is a shoo-in for the bake-off and an Oscar nomination, and it’s the frontrunner for the Oscar itself.

    District 9

    Studio: TriStar Pictures

    VFX: The Embassy, Goldtooth Creative, Image Engine Design, MXFX Special Effects, Weta Digital, Weta Workshop, XYZ-RGB (scanning), Zoic Studios.

    The aliens look like a cross between an enormous grasshopper and a greenish-yellow lobster. They scrape out a hard life in a South African township, and somehow the visual effects crews made this fantasy seem real, even normal. Writer/director Neill Blomkamp’s newsreel documentary style helped send this low-budget sci-fi film to number one at the box office on opening weekend, generated plenty of buzz, and earned rare critical acclaim for a sci-fi film. It could certainly go all the way.

    Harry Potter and the Half-Blood Prince

    Studio: Warner Bros.

    VFX: Cinesite, Cube Effects, Double Negative, Foreign Office, Fugitive Studios, Gentle Giant Studios, Industrial Light & Magic, Kerner Optical, Luma Pictures, Moving Picture Company, Plowman Craven & Associates (Lidar), Rising Sun Pictures, Vee Eye, The Visual Effects Company.

    One of the most acclaimed films in the Potter series finds a teen-aged Harry back at Hogwarts despite Death Eater destruction. Harry zooms his broomstick through a Quidditch match, fights off thousands of zombie-like Inferi that drag him underwater, and manages his hormones. Dumbledore steals the vfx show, though, by conjuring an astonishing CG firestorm (thanks to ILM). Tim Burke supervised the magic for the fourth time. Will the fx wizards conjure up a second Oscar nom for the Hogwarts crew? You need a Pensieve to answer that one.

    Star Trek

    Studio: Paramount

    VFX: Digital Domain, Evil Eye Pictures, Industrial Light & Magic, Kerner Optical, Lola Visual Effects, OOOii, Persistence of Vision Entertainment, Quantum Creation FX, Svengali Visual Effects, The Third Floor, Tinsley Transfers, Vital Distraction.

    The visual effects led by ILM’s Roger Guyett helped director J. J. Abrams take the 11th film in the franchise to where the last few hadn’t gone before: to rave reviews and nearly $400 million at the box office. State-of-the-art sim systems destroyed a planet, but the CG ships stole the show. Will it make the bake-off? Beam ’em up, Scotty!

    Terminator Salvation

    Studio: Warner Bros.

    VFX: Industrial Light & Magic, Kerner Optical, Matte World Digital, Pixel Liberation Front, Plowman Craven & Associates (scanning), Proof, Realscan 3D, Rising Sun Pictures, Rodeo FX, Stan Winston Studios.

    The technical advances at ILM for this film mostly landed on the surface of metal robots, cars, trucks and airplanes that the studio blended seamlessly into live-action and photoreal digital backgrounds. How ’bout those mototerminators! And, did you catch the young, digital Arnold Schwarzenegger? Talk about alternate reality. Should make the bake-off.

    Transformers: Revenge of the Fallen

    Studio: DreamWorks SKG, Paramount Pictures

    VFX: Asylum VFX, Digital Domain, Industrial Light & Magic, Kerner Optical, Proof.

    Sixty CG robots, two-thirds created at ILM and the rest at Digital Domain (some brutal and ginormous, others small and funny, one old and cranky, another as agile as a cat), battle each other and some humans onboard an aircraft carrier, in a forest, in the desert, on a college campus, in a chop shop. They dive underwater. They visit another planet. And in one dramatic scene, the biggest, baddest ‘bot of them all rips apart a pyramid thanks to ILM’s artists and state-of-the-art technology. Look for Optimus Prime and the Devastator at the bake-off.

  • Out of the Blue

    How years of R&D, a revolutionary virtual camera system and long hours of detailed performance capture and keyframe animation gave birth to the 3-D, magical world of James Cameron’s awe-inspiring Avatar.

    Raise your hand if you want to become a Na’vi, live on Pandora, ride butterfly-colored dragons, suck juice from fluorescent flowers, and plug your braid directly into a rainforest energy field.

    By taking audiences to an alien planet created entirely out of his imagination, not filmed in Italy, Morocco, or anywhere else on this planet, director James Cameron’s personal rhapsody in blue called Avatar has transformed science fiction films. Never before have we spent hours living with the native people of another world, the 10-foot tall willowy Na’vi who have tails like lions, blue skin with little spots that light up, yellow eyes, and twitchy ears. Never before would we have thought this could ever be believable. And yet, here we are at the end of the first decade of the 21st century, visiting the planet Pandora and loving every minute. On opening weekend alone, the film earned $242 million worldwide.


    Actor Sam Worthington plays Jake Sully, the hero of Avatar, a disabled soldier who inhabits and puppets a body built like a Na’vi, that is, an avatar. The avatar allows him to, in effect, breathe the air on Pandora and assimilate with the Na’vi.

    In a brilliant piece of parallelism, Worthington also inhabits and puppets the CG avatar that we see in the film. He does so via a performance capture system for his body, designed by Cameron’s Lightstorm Entertainment and Giant Studios, and a facial capture system designed by Weta Digital. Zoe Saldana plays Neytiri, the beautiful daughter of a Na’vi leader, who becomes Jake’s tutor and our guide into the Na’vi ways. Sigourney Weaver as Dr. Grace Augustine and Joel Moore as Norm Spellman also inhabit Na’vi avatars.

    All the avatars and Na’vi are CG characters that live in a totally CG world created at Weta Digital in Wellington, New Zealand. Joe Letteri supervised that work, spreading the effort among six visual effects supervisors, Dan Lemmon, Stephen Rosenbaum, Eric Saindon, Wayne Stables, Chris White, and Guy Williams.

    In Los Angeles, on a motion capture stage, with props built to emulate the digital landscape that Weta would create later, Cameron could see, in realtime, actors’ performances applied to digital characters in a game-like version of Pandora’s environment. As the actors performed, Cameron could ‘film’ the action using a virtual camera system, a nine-inch LCD screen with a steering wheel around it.

    Data captured from tracking markers on the camera and from all the performers, even their facial expressions, moved through Autodesk’s MotionBuilder. Cameron was fearless. He filmed Jake and Neytiri giving emotional performances in tight close-ups of their faces.


    ‘Jim [Cameron] had done some facial motion capture tests for another film and was looking at a video head rig,’ Letteri says. ‘We knew that the more cameras, the more accurate the data, but they get in the way. Our goal was to work with a single camera system and software to track everything and re-project it back onto 3D characters.’

    Rather than using the typical retro-reflective markers on the actors’ faces, Weta applied green dots with makeup. Each actor wore a helmet with a tiny camera attached to a boom arm. The boom arm swung into place between their nose and upper lip, high enough to capture their eyes, yet close enough to capture mouth movement. ‘We could track the facial gestures and muscle movement in realtime and apply the facial capture to the characters in realtime as Jim [Cameron] captured the performances,’ says Rosenbaum.
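
    Conceptually, the first step of that tracking is simple: find the painted dots in each head-cam frame and hand their 2D positions to a solver that re-projects them onto the 3D face rig. The sketch below is a hypothetical illustration of that dot-finding step only, not Weta’s tracker; the OpenCV color range and file name are assumptions.

    ```python
    # Hypothetical illustration only -- not Weta's facial tracker. It finds
    # green makeup dots in a single head-cam frame by color thresholding and
    # returns their 2D centroids, the raw input a solver would re-project
    # onto a 3D face rig.
    import cv2

    def find_green_dots(frame_bgr, min_area=4.0):
        """Return (x, y) centroids of green dots in one video frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Threshold a band of green hues; the exact range would be tuned
        # to the makeup and the on-set lighting.
        mask = cv2.inRange(hsv, (40, 80, 80), (85, 255, 255))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        centroids = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] >= min_area:
                centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        return centroids

    # Usage: centroids = find_green_dots(cv2.imread("headcam_frame.png"))
    ```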

    A ‘solving’ team at Weta cleaned the data from the body and facial capture sessions and applied it to animation rigs in Autodesk’s Maya. ‘The motion data drove our systems as if an animator was doing key frames,’ says Andy Jones, animation director. ‘It gave us a lot of the life and frame by frame motion that animators don’t want to do. Our biggest challenge was probably facial animation.’

    Animators could look at footage taken with the facial camera and at HD reference and scrub the faces they animated to see if everything moved in the same way as the actors’ faces. ‘Jim would do 10 or 15 takes and pick the ones he wanted,’ Jones says. ‘He might like the way a lip moved, the inflections in the face. We had to make sure it was all in there and then some.’

    Although the motion data was especially helpful for lip synch and mouth movement, animators keyframed much of the brow and eye animation, and the ears, which weren’t captured. They also keyframed Na’vi hands, fingers, and tails.


    In addition to the Na’vi, the animation team had a planet-full of fantasy creatures to animate: flying dragon-like creatures, six-legged horse- and panther-like creatures, bugs, insects, and luminescent white jellyfish-like souls that rain from the skies. ‘We worked for four months just doing motion studies and tests,’ Jones says.

    A new volume-preserving muscle system that calculates fat layers more accurately than previous systems at Weta added secondary motion. ‘It took probably a year of R&D and another year to make sure it worked properly in the shots,’ Saindon says. ‘We had similar ideas before, but this system is completely written from the ground up to do more accurate simulations and more accurate volumes.’

    As astounding as the Na’vi and these creatures are, the environment they live in is astonishing as well. At night, the exotic plants in shades of blue, purple, orange, white, and green shimmer and glow with bioluminescence, a stunning design that reflects Cameron’s love of underwater photography.

    The landscape team at Weta started with simple representations of the environment from Lightstorm, FBX files that showed where Cameron wanted plants placed for particular camera angles. The digital landscapers then painted areas where they wanted plants to grow, and a landscape system placed pre-existing plant geometry at the correct angles. The average plant had 100,000 polygons, and each plant had dynamics so that something could move on its own or react to the characters in every frame.
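
    That paint-then-populate step can be approximated in a few lines. The sketch below is a hypothetical illustration, not Weta’s landscape system: it scatters instance points with probability proportional to a painted density map and records the terrain normal at each point, so pre-built plant geometry could be placed at the correct angle. All names and data shapes are assumptions.

    ```python
    # Hypothetical sketch of paint-driven plant scattering, not Weta's system.
    # density is a grayscale paint map (values 0..1); terrain_normals gives a
    # unit normal per texel. The output is a list of (x, y, normal) instance
    # records a renderer would use to place pre-built plant geometry.
    import numpy as np

    def scatter_plants(density, terrain_normals, count, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        h, w = density.shape
        # Sample texels with probability proportional to the painted density.
        p = density.ravel().astype(np.float64)
        p /= p.sum()
        picks = rng.choice(h * w, size=count, p=p)
        instances = []
        for idx in picks:
            y, x = divmod(idx, w)
            # Jitter within the texel so plants don't land on a grid.
            jx, jy = x + rng.random(), y + rng.random()
            instances.append((jx, jy, terrain_normals[y, x]))
        return instances

    # Usage (toy data):
    # density = np.random.rand(64, 64)
    # normals = np.tile(np.array([0.0, 0.0, 1.0]), (64, 64, 1))
    # plants = scatter_plants(density, normals, count=500)
    ```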

    ‘We also created a riverbed, an ocean-front scene, floating mountains, and there are waterfalls everywhere,’ Williams says. ‘It’s like an Amazon jungle on steroids. All those aerial shots in the third act take place in 3D space with 3D mountains and 3D trees because stereo is unforgiving. To have Jake running through a forest, the camera needs to move as fast as he does. We have 3D fluid solvers for water and clouds, hardly any 2D elements; this is a true stereo show with everything rendered in stereo.’


    Rendering happened through Pixar’s RenderMan 15. ‘Although we wrote a ray tracing engine to calculate spherical harmonics, we pushed everything through RenderMan at the end,’ Letteri says.

    To handle compositing for the stereo 3D film, Weta built a new compositing pipeline based on Apple’s Shake. ‘It does everything in parallel,’ Letteri says. ‘We had to do volumetric lighting, smoke and fire, and composite them volumetrically. It’s all depth based, so characters running through the jungle look good. Every piece is in the right space.’
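
    The depth-based merging Letteri describes boils down to a per-pixel comparison: whichever element is nearer composites over the one behind it. A minimal sketch of that idea follows, assuming simple RGBA-plus-Z image arrays; it illustrates the principle, not Weta’s Shake-based pipeline.

    ```python
    # Hypothetical sketch of a per-pixel depth merge, not Weta's Shake-based
    # pipeline. Each layer is a premultiplied RGBA image plus a Z (depth)
    # channel; at every pixel the nearer sample composites over the farther
    # one, which keeps characters correctly interleaved with jungle elements.
    import numpy as np

    def over(front_rgba, back_rgba):
        """Standard 'over' operator on premultiplied RGBA arrays."""
        a = front_rgba[..., 3:4]
        return front_rgba + back_rgba * (1.0 - a)

    def depth_merge(rgba_a, z_a, rgba_b, z_b):
        """Merge two RGBA+Z layers so the nearer layer wins per pixel."""
        near_is_a = (z_a <= z_b)[..., None]
        front = np.where(near_is_a, rgba_a, rgba_b)
        back = np.where(near_is_a, rgba_b, rgba_a)
        return over(front, back)
    ```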

    And that’s what makes this film so immersive and compelling. With a digital assist from Industrial Light & Magic, the mean aerial warships and the cruel cowboys at their helm provide stark contrast to the native Pandorean world, visually and emotionally. And their actions keep moving the story ahead. But, it’s the strangely beautiful and elegant Na’vi, the fascinating Pandorean creatures (even the nasty purple panther ones), and the extraordinary ecosystem on Pandora that make us want to return to this film and enter this world again.

    Fox’s Avatar is still playing in theaters nationwide. The feature surpassed Cameron’s own Titanic as the biggest-grossing picture in the U.S. in February.

  • Making Mayan Mayhem

    No question about it: Audiences will get their money’s worth of eye-popping, end-of-the-world vfx in Roland Emmerich’s disaster movie, 2012.

    This month’s disaster-zeitgeist movie 2012 isn’t only about destruction, but when a story centers on the Mayan calendar’s end of the world in 2012 and the director is Roland Emmerich, you know that planet-wide destruction plays a major role. The wreckage begins with a crack in the firmament, soon ratchets way up to a big earthquake sequence in (where else?) Los Angeles, and from there, all hell breaks loose.

    Sixteen studios, under the overall supervision of co-producers Volker Engel and Marc Weigert, created the breakage. Engel and Weigert’s own Uncharted Territory led the mayhem by ripping apart the landscape in Los Angeles and Las Vegas for 400 shots. Studios that contributed around 100 shots or more included Digital Domain, Double Negative, Sony Pictures Imageworks and Scanline. Pixomondo, which previz’d the show, contributed 93 shots. Hydraulx had 60. Gradient, Evil Eye, Factory FX, UPP, The Post Office, Crazy Horse, Alex Lemke FX, Cafe FX and the Picture Mill shared the rest.

    ‘It wasn’t like we started with three studios and ended up with 16 because we didn’t get the shots done,’ says Engel. ‘We planned this from the beginning. We figured out who would be our best partners.’

    Engel and Weigert chose Digital Domain, for example, to do the second half of the earthquake sequence in Los Angeles, which we see from the air. Double Negative bubbled up all the trouble in Yellowstone National Park. Imageworks and Scanline split the third act, with Imageworks creating huge arks inside a Himalayan mountain and Scanline providing the rush of water that floats the boats. Imageworks’ boat-building job extended sets on a massive scale, but for the most part, the studios’ work required rigid body simulation, fluid simulation and particle simulation to create fire, water, landslides, lava slides, earthquakes and general destructive chaos.

    ‘You can’t give these shots to small houses,’ Weigert says. ‘You have to give them to a sizeable house with the people, the pipeline, and the programmers who have done this before. So many mid-sized visual effects houses have closed down that there’s literally a shortage of houses that can handle this kind of business.’

    Splitting huge sequences into pieces helped manage the global destruction, as did finding smaller sequences for smaller studios. Hydraulx, for example, cracked a supermarket in half near the beginning of the film, and later destroyed Hawaii with volcanic eruptions and lava flow. ‘They are typical Hydraulx shots with complicated computer graphics and particle simulations,’ Engel says.

    For their part, Engel and Weigert set up shop for their production company Uncharted Territory on the Sony Pictures studio lot. As they had for earlier projects, Independence Day and Coronado, they staffed and equipped Uncharted Territory specifically for 2012, buying machines and software and hiring over 100 people to handle the effects and manage the project. ‘We wanted to be close to Roland [Emmerich] in editorial,’ Weigert says. And close to the 400-terabyte server that sent the digital files wherever needed during the post-production process, thanks to a proprietary project management system, and then, when shots were finally approved, moved them to editorial.

    The artists working at Uncharted Territory used Autodesk’s Maya for modeling, 3ds Max for effects, and The Foundry’s Nuke for compositing. They also used Cebas’s finalRender, a hardware-accelerated ray tracer for rendering, and that company’s Thinking Particles system, a 3ds max plug-in, for the destruction.

    ‘We learned that they were developing a volume breaker that made it possible to destroy buildings without having to cut them apart by hand,’ Engel says. ‘We made a deal with them to finance part of their development, which gave us exclusivity during production and a close collaboration.’

    To create the Los Angeles earthquake sequence, Engel and Weigert started with three lines from the script. ‘[The script] read kind of like, ‘They run out of the house and as they drive in the limo, buildings crumble around them,” Weigert says. ”And then they arrive at the airport.”

    At first, they considered filming the route and replacing only the buildings they wanted to destroy, but they soon realized they’d need to create an entirely virtual environment except for the limousine, which they shot on a blue-screen stage in Vancouver. ‘In a lot of the shots, we even had to replace the road because it breaks in the shots,’ Engel says.

    To expand the three lines into what became a three-minute sequence, the crew at Uncharted Territory created a previz using simple geometry. ‘It was like a little LEGO set,’ Weigert says. Once Emmerich approved the basic idea, Weigert and Engel moved the previz on to Pixomondo, which previz’d around 90 percent of the film.

    ‘We had really good previz,’ says Mohen Leo, who was visual effects supervisor at Digital Domain for 2012. Digital Domain handled the L.A. earthquake sequence from the point at which the limousine arrives at the airport. ‘Pixomondo worked a long time to get the layout and camera that Roland [Emmerich] was really happy with. We did previz ourselves or made changes on a handful, but the whole flow was established. We just had to flesh it out.’

    To do that, the Digital Domain crew shot reference for the parts in Los Angeles that Emmerich wanted to see destroyed in the fly-over, and then built break-apart models of buildings, fire hydrants, traffic lights and so forth to match. Proprietary technology split the geometry into pieces procedurally. To hold the objects together until they shattered, the studio developed Drop, which is custom code built around Bullet, an open source rigid body solver. Both proprietary systems worked inside Houdini, which gave artists the ability to manipulate the simulation.
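
    The ‘hold it together until it shatters’ idea can be illustrated directly on top of Bullet, the same open-source solver Drop wraps. The sketch below is a hypothetical, minimal example using the pybullet bindings, not Digital Domain’s Drop or its Houdini integration: two pre-fractured pieces are glued with a fixed constraint whose maximum force acts as the break threshold.

    ```python
    # Hypothetical sketch of "glued" rigid-body fracture on top of the
    # open-source Bullet solver (via pybullet). It is not Digital Domain's
    # Drop or their Houdini setup; it only illustrates holding pre-fractured
    # pieces together with a breakable constraint.
    import pybullet as p

    p.connect(p.DIRECT)
    p.setGravity(0, 0, -9.8)

    # Two pre-fractured "wall pieces" stacked on top of each other.
    box = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.5, 0.5, 0.5])
    lower = p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=box,
                              basePosition=[0, 0, 0.5])
    upper = p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=box,
                              basePosition=[0, 0, 1.5])

    # "Glue": a fixed constraint whose maxForce acts as the break threshold.
    glue = p.createConstraint(lower, -1, upper, -1, p.JOINT_FIXED,
                              [0, 0, 0], [0, 0, 0.5], [0, 0, -0.5])
    p.changeConstraint(glue, maxForce=200.0)

    for step in range(240):  # roughly one second at the default 240 Hz
        p.stepSimulation()
        # A fuller setup would monitor impulses and call p.removeConstraint
        # once the glue's threshold is exceeded, letting the pieces fly.
    ```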

    For Scanline, though, the previz wasn’t as useful. ‘There’s a big difference between a polygon wave,’ says Stephan Trojansky, visual effects supervisor, referring to the previsualization of moving water, ‘and a simulation that shows in detail how the water moves.’ Finding the right speed to move the huge amounts of water realistically, yet within the time allotted for particular scenes, was a particular challenge because changing the speed of the simulation also changed the look. For example, water shooting off a fast moving wave vaporizes, but at a slower speed, looks like droplets.

    At Double Negative, the challenges were literally groundbreaking. Using proprietary software implemented through Houdini, the studio caused a pool of lava to erupt through the Earth’s crust in Yellowstone Park and then created an ash cloud from which lava bombs and chunks of earth shoot after a fleeing RV.

    ‘This film pushed the limits of everything,’ Weigert says. ‘When you have to move and break the entire environment, it makes everything more than ten-fold as complex.’

    When work on 2012 ended, Weigert and Engel disbanded Uncharted Territory, but will start it up again for Emmerich’s next film, which takes place in 16th century London. ‘When we learned we needed to create 100 percent photoreal environments, we just laughed,’ Weigert says. ‘We said, ‘No problem. Nothing has to break.’’

    Sony’s 2012 is now in theaters nationwide.

  • The Giant Lobster-Heads of Summer

    The Toronto-based vfx team behind Syfy’s new hit series, Warehouse 13, discuss the fine art of creating a cornucopia of paranormal artifacts.

    Start with a little Indiana Jones action adventure, mix in some X-Files-style otherworldly elements, toss in a little Moonlighting-style comic banter, and you have Syfy Channel’s hot summer concoction, Warehouse 13. Last month, the series debuted as the top-rated cable show of the night with 3.5 million viewers, and most came back the second week, giving it a big timeslot win in the ratings war two weeks in a row. And the show’s vfx extravaganza had only begun to kick in.

    The series centers on a storage facility in South Dakota where the government has snagged, bagged and tagged all the paranormal artifacts ever discovered. The stars are a warehouse guard named ‘Artie’ (Saul Rubinek) and two bickering secret service agents assigned to find more artifacts, Peter (Eddie McClintock) and Myka (Joanne Kelly).

    Initially, Rocket Science VFX created the pilot, which aired as the first episode. Toronto-based Keyframe Digital has picked up the work going forward. Co-founders Clint Green and Darren Cranford lead the effort, with Green acting as the on-set supervisor and Cranford as the visual effects director. In addition to the supervisors, 12 of the other 15 people in the studio work on the series, using 3ds Max, Combustion, FumeFX, boujou, Photoshop, Illustrator, SynthEyes and a little Maya.

    ‘This show always has something different going on,’ Green says. ‘People walking through walls, giant lobsters on someone’s back.’

    Although the show has only a 10-day turnaround for effects, work on the project began in January. ‘It’s rare that we get in that early, but we were able to really start hashing out ideas on how the effects could be done,’ says Green.

    In meetings with executive producers Jack Kenny and David Simkins, Green and Cranford learned that anything goes. ‘They wanted anything from CG creatures to set extensions,’ Cranford says. ‘But the beauty of it was that we were part of the creative process. We could give them previz that showed a number of ways to create the effects.’

    Before preproduction began on the project, Keyframe received the CG model that Rocket Science had created for the pilot, and developed methods for fast renders of the huge object. They also created an fx ‘bible’ that contained shots they had created for other shows as well as new effects. ‘We showed them walking through walls without blue-screen, fire effects, water effects, ghostly effects and other cool stuff we came up with,’ Cranford says. ‘The second episode wasn’t a big effects episode, but the one after that has all kinds of goodies. Ghost characters, particle work, time travel.’ And, Claudia.

    Claudia (Allison Scagliotti) is the geek: a young techno-wizard who first appears in the third episode. ‘She tries to retrieve her brother from the netherworld,’ Green says. ‘He’s a ghost until he materializes as a human.’

    To create the ghost, Cranford created a human head and torso in CG and animated it through a virtual room. Then he pinned on its head a large piece of cloth with tendrils hanging off and ran a 3D simulation on the cloth. ‘I rendered it with transparency, morphed it onto an actor shot on green-screen as an element, and used displacement on the background and a smoke layer to help it sit into the plate,’ Cranford says.

    For the other ghost characters, Green suspended actors on wires against a green-screen so they were floating. ‘We had them wear dark pants, so we could have the pants disappear into nothing,’ Cranford says. ‘And, [in post] we used a lot of god rays and particle work.’

    According to the vfx execs, about 10 percent of the shots of the first two episodes were green-screen. In later episodes, the number of green-screen shots jumps to 60 percent. ‘The exterior of the warehouse is only a small façade on the lot in Toronto,’ Green says. ‘Any time an actor was outside the set façade, he was in blue-screen. And, in the interior, we used blue-screen for shots outside Artie’s office set.’ Many of the effects, however, relate to the paranormal artifacts.

    For example, in a later episode, Claudia finds a gadget in the warehouse that can project digital photographs onto 3D objects hanging in the air. ‘She always has gadgets,’ Green says. ‘And they’re always messing with her. In one shot she climbs a beam while wearing a magnetic jacket. Certain emotions enhance the magnetism of the jacket, objects start sticking to her, and she gets stuck.’

    The crew also attached a CG robotic lobster to people for another episode. ‘We enhanced a prosthetic piece,’ Cranford says. ‘We always try to use as many practical elements as we can, especially for fire. The combination is always better than just one or the other.’

    As for walking through walls without a blue-screen, they accomplished that with a camera technique. ‘We have the camera 45 degrees from a wall, not more severe, and have someone step sideways into the area, which makes it look like he’s stepping through,’ Cranford says. ‘We move the camera to the wall, lock it, have the character walk on, then unlock the camera.’ Later, they rotoscope the actors and give the contour line an ember glow, which helps sell the illusion.

    The early success of the series has Green and Cranford hoping for a Battlestar Galactica-type run. The two founded the company in 1997 after a games company laid them off. While they were creating an animatic to help them design the animation for a cinematic, special effects director Colin Chilvers, who was working on X-Men, walked in off the street.

    Cranford tells the story, ‘He said, ‘Do you know how good this would be for previz for movies? If you can figure out how to do the camera, I’ll introduce you to the visual effects supervisor.’’ They did, he introduced them to Mike Fink, the visual effects supervisor for X-Men, and Fink hired them. After X-Men, they created previz for a number of other films. Along the way, they landed the visual effects work for the series Mutant X, which ran from 2001 to 2004, as well as other TV shows, including The Dresden Files, which Simkins executive produced in 2007. All that led to Warehouse 13, and Green and Cranford are as happy as, well, anyone with a warehouse filled with magical stuff could be.

    ‘We’ve got a full staff,’ Cranford says. ‘The series is well written with good storylines coming up. We couldn’t ask for better producers. Everyone’s morale is high. It’s unreal.’

    Warehouse 13 airs Tuesdays at 9 p.m. on the Syfy Channel.

  • SIGGRAPH Quick Bytes: Final Day

    Balance

    It’s odd to think of a product that won a Technical Achievement Award in 2002 from the Academy as ‘new,’ but Science.D.Visions claims that its match-moving software 3DEqualizer4, which releases soon for Linux, Mac OS X, and Windows XP, features a rewrite for 90 percent of its core code. That rewrite includes automagical tracking and a new user interface. Among the biggest boasts are that the new software is up to 100 times faster and it can handle lens distortion over a zoom. We can’t speak to the speed improvements, but the lens distortion over a zoom function is cool. The company’s online tutorials, PLE versions, and leasing programs can help new users and those who need only an occasional match move.

    Power to the Renderer

    Pixar has announced unlimited threading with one license of RenderMan Pro Server 15.0. RenderMan 15 (fifteen!) also introduces a simpler and more efficient method for rendering volumetric effects using two new geometric primitives, a new shading pipeline stage that Pixar dubs ‘refinement,’ a new subdivision surface API, and support for Disney’s Ptex texture format. Performance enhancements include faster shadow rendering, better memory performance for subdivision surfaces and brick maps, faster ray tracing, faster baking for re-rendering, and more.

    NVIDIA’s Realtime Ray Tracing

    From hours to minutes to milliseconds. Soon, users will be able to play with lights, reflections, refractions and shadows using software running on PCs equipped with NVIDIA cards. The software applications need to take advantage of NVIDIA’s OptiX engine (API) and SceniX for the hardware company’s Quadro FX GPUs. But, why wouldn’t they?

    More Realtime Rendering

    MachStudio Pro from StudioGPU digs into the graphics processing units to offer realtime rendering on the desktop. According to the company, what you see is what you need: Images created in realtime are of sufficient quality for finals in broadcast applications, game interstitials, and architectural applications, and as previs for high-resolution films. MachStudio’s workflow allows artists to manipulate cameras, lighting, ambient occlusion, animation, and materials in realtime. The company has written exporters for most 3D software, including Maya, 3ds max, Rhino, SketchUp Pro, and ArchiCAD, and imports FBX files. The software is priced around $5,000, and that price includes an ATI card.

  • SIGGRAPH Quick Bytes: Day Three

    Visualization in the Pipeline

    I started my day by moderating a panel on pitchvis, previs, postvis, and it was great. Rick Sayre from Pixar, Rob Bredow from Sony Pictures Imageworks, Matt Aitken from Weta Digital, Justin Denton from Halon, and Steve Sullivan and Michael Saunders from ILM all shared their personal and their studios’ approach to visualization for live action films, animation, and commercials. The newsmaker during the session was Steve Sullivan demonstrating, for the first time publicly, ILM’s ZViz software. ILM has no plan to market the software, but if they did, they’d have people standing in line to get their hands on it.

    ILM’s ZViz demo (c) Lucasfilm

    Software Eruptions

    Prime Focus, formerly Frantic Films Software, offered a sneak peek at Krakatoa 1.5 and a prototype Deadline application for the iPhone. The latest release of Krakatoa, their particle rendering, manipulation and management toolkit, adds a voxel engine and a node-based channel editor. And, with the iPhone version of Deadline, artists can remotely control jobs running on a render farm. ‘By selling software, we can support an R&D department that doesn’t depend on projects,’ says Chris Bond, president and senior VFX supervisor. With Oscar-winning vfx supervisor Mike Fink now on board at Prime Focus, Bond sees the studio, which is known for its expertise in natural phenomena, moving into more character animation.

    Tweak

    Three major announcements from Tweak Software send its RV image and sequence viewer for VFX and animation artists rolling into new territory. First, the popular program is now cross-platform, running 2K uncompressed playback from RAM or disk on Linux, Mac OS X, and Windows systems. Second, Tweak now offers live sync so that two artists can work together remotely in RV. And third, a strategic alliance with Shotgun adds sophisticated web-based collaboration.

    Boujou 5

    Vicon has now absorbed Boujou and the company announced a major upgrade that puts users in a supervisory role for motion tracking. The automation is faster, but because users can guide that automation in a variety of ways, it’s more successful. ‘We can threshold the solver now,’ says Philip Elderfield, product manager.

    Tropic Thunder tracked with Boujou

    Moving House

    At Vicon’s House of Moves, a team of animators is now creating keyframe animation and facial animation as well as motion capture performances. HOM will also process motion capture data for customers now. ‘We can take care of the whole back end,’ says Tom Armbruster, vice president of sales and support.

    More later!

  • Quick Bytes from SIGGRAPH: Day Two

    SCAD Welcomes Scott Ross

    The Savannah College of Art and Design (SCAD) announced a new and powerful executive advisor to its School of Film, Digital Media and Performing Arts: Scott Ross, co-founder, former chairman and CEO of Digital Domain and, before that, general manager of Industrial Light & Magic and vice president of LucasArts Entertainment Group.

    One of Ross’s first jobs is to help plan SCAD’s new school in Hong Kong, which is scheduled to open in the fall of 2010. In addition, Ross is working with SCAD to launch a new digital media center in Atlanta that opens in 2009. The 60,000-square-foot facility will offer industry-standard learning resources such as multi-camera sound stages with high-definition broadcast capability, professional sound recording and mixing suites, editing rooms, a screening room, nearly two dozen classrooms, and set and prop fabrication studios. Programs offered will include animation, interactive design and game development, motion media, and television production.

    Keynote: Will Wright

    ‘How do we entertain the hive brain?’ asked Will Wright, creator of The Sims and Spore, in his immensely entertaining and stimulating keynote address. Noting that Spore users have now created more than 100 million unique creatures, and pointing to the action surrounding the television series Lost, Wright has noticed that casual members of the hive brain enjoy and appreciate depth provided by highly involved members. ‘Data coming from the community is the hub,’ Wright says. Spore creators, for example, can now create their own games and, as part of the latest patch, can export creatures they build in minutes into Maya.

    Motion4U

    (www.3Dmotion4u.com)

    I love finding the little booths on the edges of the exhibition area that have clever ideas, and this is very clever. Using little lollipop paddles or tiny sticks with little balls on top, animators can control characters or a virtual camera in Maya and other software programs. The little digital input devices, which range in price from $495 to $3,995, provide, in effect, motion capture on a desktop using a camera mounted on a tripod.

    Talent Sighting

    I also ran into Rob Coleman at the airport waiting for luggage. He’d just flown in from Sydney where he’s working at Dr. D Studios as animation director for George Miller’s Happy Feet 2, scheduled for 2011 release. He came with a big recruiting crew and is actively seeking animators. The production will feature motion-captured characters with key-framed faces, as before, but he says there are a number of new characters that will all be keyframed. (You can find out more at www.drdstudios.com)

    Artist Pete Crumrine, owner of capa2act, holds up the latest issue of his favorite magazine at SIGGRAPH!

  • SUPINFOCOM Finds Super Success at SIGGRAPH

    Each year, SIGGRAPH highlights some of the best animated films created with computer graphics at its annual festival, and each year, the students from one school are always among the top contenders. That school is SUPINFOCOM, a collection of three schools: the founding school in Valenciennes, France; a second in Arles, France; and a third in Pune, India.

    This year, Anima, one of four films by SUPINFOCOM students to win acceptance in the Computer Animation Festival, is a nominee, not for a student prize but for the prestigious jury award, competing against Dix by The Mill, a live-action film with CG effects, and Love Child, an animation visualization from National Taiwan University of Science and Technology, Taiwan. In Anima, we see an elephant escaping from a city that seems constructed from the shapes of animals. The jury described the film as verging on the experimental, with graphics and lighting making the film particularly compelling. Yankee Gal, a second film showing at SIGGRAPH, won the E-Magiciens competition last fall. The other two SIGGRAPH selections from SUPINFOCOM students are Facteur Mineur and Ed.

    Why does this one school breed so much success? The answer is two-fold, and unfolding. First, SUPINFOCOM traces its history to a decision by the Valenciennes area government to encourage digital media by establishing schools and by incubating fledgling digital media companies. SUPINFOCOM and a sister school for industrial design were early products of that decision. Their success spawned a third school in Valenciennes, SUPINFOGAME, which targets game development, and the branches in Arles and India.

    Although graduates of these schools can and do find work in France, throughout Europe, and in the United States, the chamber of commerce’s incubation programs, which attract companies to northern France and nurture homegrown efforts, help ensure jobs are available near Valenciennes and nearby Lille, a city that has emerged as a transportation hub with high-speed trains to and from Brussels, Paris and London. That, in turn, helps the school provide students with training from instructors working in the field.

    Second, SUPINFOCOM’s founder Marie-Anne Fontenier decided from the beginning to focus on 3-D computer graphics rather than broadcast television, even though 20 years ago, broadcast television might have seemed like a better choice to some. The SUPINFOCOM program evolved into a unique blend of creativity and practicality. To be accepted, students must pass an entrance exam that focuses on their talent as artists. Once accepted, they spend the first two years studying the fundamentals of art and animation. During the third year they begin working in 3-D. Then, for their final year, they work as part of a team to produce an animated film, much as they would in a studio; all the students submit ideas to the instructors, who choose a small number. The films at SIGGRAPH are a product of this last, graduation-year effort. SUPINFOGAME adopted a similar program.

    More recently, though, and this is part of the unfolding, the school has decided to change from a four-year to a five-year program. In the fourth year, rather than working as a team to create a three-to-five-minute (or so) film, the students now will create individual, one-minute films. Then in the fifth year, as before, they work on teams to create a longer production. Anne Brotot, deputy director at SUPINFOCOM Valenciennes, who is replacing the retiring Fontenier, notes that the extra year gives the students a personal film to show prospective employers, and more time to work with 3-D animation.

    SUPINFOGAME also changed its program. The school for prospective game developers still extends over four years, with fundamentals occupying the first two years, but students can now choose whether to concentrate on game design (and level design) during the final two years, or on visual creation.

    Meanwhile, the chamber of commerce has beefed up its support and its determination to increase the number of jobs in the area. Each year, for the past 10 years, the chamber and SUPINFOCOM have held E-Magiciens, a conference and film festival. Last year, the chamber sponsored a new, separate conference called E-Creators, a combination trade show and job fair for game developers, animation studios and others focusing on digital media creation.

    Through the chamber’s incubation program, fledgling studios continue to receive financial assistance, marketing support, consulting services, and low rent. But in addition, the French government designated this area of France as a ‘region of excellence.’ With the Valenciennes chamber leading the effort, serious games developers have exciting new possibilities: The designation opened the doors to large grants for game developers who want to prototype and develop applications in transportation, health, and civic law, especially.

    One result of the serious games effort might well be a third school concentrating specifically on serious games. Given SUPINFOCOM’s success, it will definitely be a school to watch.

    SIGGRAPH announces the Jury Award winners Monday night, but with four films from students accepted by the jury for the festival, SUPINFOCOM is already a winner.

  • Building an Island of Lost Souls

    How the sharp crew at ILM helped shape the breath-taking, fiery climax for Harry Potter and the Half-Blood Prince.

    Toward the end of Warner Bros.’ film Harry Potter and the Half-Blood Prince, Dumbledore and Harry travel to a secret cave to retrieve Slytherin’s locket, which contains a Horcrux, one-sixth of Voldemort’s soul. They arrive at a beach 30 feet wide and walk into a crystal cave. Inside the cave is an underground lake with a crystal island in the middle, which holds the locket. The dark, vast cave is protected by the Inferi, the trapped souls Voldemort has charged with defending his slice of soul. When they drag Harry into the water, Dumbledore conjures a firestorm 100 feet tall that scares the Inferi. As Harry swims to the surface, the firestorm swirls around Dumbledore and the Inferi quickly disappear.

    A crew of around 60 people at Industrial Light & Magic worked for about a year to help bring this difficult, climactic scene to cinematic life. ‘We started with a small crew to develop the fire,’ says Tim Alexander, visual effects supervisor. ‘We didn’t know how to do it.’

    While the fire developers worked on their magic trick, layout artists took the storyboards from the sequence and created visualizations of the parts of the cave that the camera would see. Director David Yates filmed two sections on set: the entry and the island. ILM created the rest. ‘We oriented the scene with our layout department, rendered those layouts, and did rudimentary lighting with a hardware render to give [visual effects supervisor] Tim Burke and the director a good sense of what the sequence would look like,’ Alexander says.

    To build the cave, the crew developed two techniques. The first was to instance crystals using the crowd pipeline. ‘It’s a heavy approach,’ Alexander says, ‘but it gave us randomness and a crystalline structure.’ For the ceiling, they stacked pieces of simple doughnut-shaped geometry and turned them into rectangular crystals with a shader. Matte paintings filled the far background.

    The Inferi are CG characters that look like skinny people. Some shots have 10 Inferi; others have thousands. Animators keyframed the creatures in shots with fewer than 100. For shots with more than 100, the crew used particle simulation with sprites instanced onto cards. The cards have images of the animated creatures doing any of eight cycles for around 800 frames. ‘We rendered cycles and populated the whole underwater areas of the cave with sprites,’ Alexander says. Although they’re sprites, ILM didn’t bake in their lighting. Instead, the technical directors rendered multiple passes with normals so they could light the cards with 3D lighting.
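
    Rendering a normal pass alongside each sprite is what makes that after-the-fact relighting possible: with a stored surface normal per pixel, any 3D light direction can shade the flat card. The sketch below is a hypothetical illustration of that idea using a simple Lambert term, not ILM’s lighting pipeline; the array names and light values are assumptions.

    ```python
    # Hypothetical sketch of relighting a pre-rendered sprite card with a
    # stored normal pass -- an illustration of the idea, not ILM's pipeline.
    # color_rgba is the sprite image; normals holds a unit surface normal
    # per pixel, so a 3D light direction can shade the 2D card after the fact.
    import numpy as np

    def relight_sprite(color_rgba, normals, light_dir, light_rgb):
        l = np.asarray(light_dir, dtype=np.float64)
        l /= np.linalg.norm(l)
        # Lambertian term per pixel, clamped to zero for back-facing normals.
        ndotl = np.clip(np.einsum("ijk,k->ij", normals, l), 0.0, 1.0)
        lit = color_rgba.copy()
        lit[..., :3] = color_rgba[..., :3] * ndotl[..., None] * light_rgb
        return lit

    # Usage (toy data): a flat card facing +Z lit by an orange "fire" light.
    # normals = np.tile(np.array([0.0, 0.0, 1.0]), (128, 128, 1))
    # sprite = np.ones((128, 128, 4))
    # lit = relight_sprite(sprite, normals, [0.3, -0.2, 1.0],
    #                      np.array([1.0, 0.6, 0.3]))
    ```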

    To create the shot with the Inferi pulling Harry into the water, stunt actors in scuba equipment grabbed at actor Daniel Radcliffe and pulled him down into a tank of water on a green-screen stage. Radcliffe held his breath until the end of the take and then one of the scuba divers sitting just out of frame gave him a mouthpiece so he could breathe.

    In the film, though, Harry/Radcliffe is in CG water. ‘When we’re underwater, it’s mostly about atmospherics,’ Alexander says. The crew rendered a water surface so when the camera points up and we see the surface turn orange from the fire, we also see water ripples. Otherwise, they achieved the illusion that Harry is underwater because his hair and clothes truly are floating. When seen from above, the water is a CG plane with a 2D simulation on top for displacement that’s rendered with refraction.

    Framing the Firestorm Finale

    For the firestorm, which Alexander describes as looking like a giant wildfire in the forest, the crew considered using a heavy-duty particle sim that they would render volumetrically. ‘That’s the standard approach,’ Alexander said. ‘But [CG supervisor] Chris Horvath had another idea.’

    One of the reasons fire simulations had not been as realistic as they wanted in the past was that it took too much compute power and time to simulate them in high resolution. Horvath’s light-bulb idea was that because edge detail is more important than depth when simulating fire, it might make sense to do the simulation in two dimensions rather than in a three-dimensional volume. That would allow them to concentrate all the compute power on a full 2K image, rather than reduce the resolution to simulate the fire in three dimensions.

    The trick was to create realistic motion. The answer was to start with a low-resolution 3D simulation to produce fluid particle motion, then to slice the volume like a loaf of bread and run high-resolution simulations on the slices in two dimensions. The number of particles in a slice affected the temperature, and therefore the color. ‘When it’s really hot, we get less smoke and more fire,’ Alexander says. ‘When it’s cool, we have more smoke and less fire.’ As the simulation runs, that is, as the fire burns, parameters control the type of fire, which can range from a bonfire to a lava flow.
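
    The slicing idea can be sketched in a few lines: bin the low-resolution 3D particles into slices along one axis, splat each slice onto its own high-resolution 2D grid, and let the per-cell particle count stand in for temperature, which then drives the fire-versus-smoke color. The code below is a hypothetical illustration of that mapping, not ILM’s simulator; the resolutions, colors, and thresholds are assumptions.

    ```python
    # Hypothetical sketch of the slice-the-volume idea, not ILM's fire tool.
    # Low-resolution 3D particles provide the motion; each slice along one
    # axis is splatted onto its own high-resolution 2D grid, and the particle
    # count per cell stands in for temperature: hot cells read as fire,
    # cool cells as smoke.
    import numpy as np

    def slice_and_splat(particles, n_slices=8, res=512):
        """particles: (N, 3) positions in the unit cube. Returns one 2D
        temperature grid per slice, each res x res."""
        slice_ids = np.clip((particles[:, 2] * n_slices).astype(int),
                            0, n_slices - 1)
        ix = np.clip((particles[:, 0] * res).astype(int), 0, res - 1)
        iy = np.clip((particles[:, 1] * res).astype(int), 0, res - 1)
        grids = np.zeros((n_slices, res, res), dtype=np.float32)
        np.add.at(grids, (slice_ids, iy, ix), 1.0)  # count particles per cell
        return grids

    def shade(temperature, hot=2.0):
        """Blend from smoke grey to fire orange as temperature rises."""
        t = np.clip(temperature / hot, 0.0, 1.0)[..., None]
        smoke = np.array([0.2, 0.2, 0.2])
        fire = np.array([1.0, 0.55, 0.1])
        return smoke * (1.0 - t) + fire * t

    # Usage: grids = slice_and_splat(np.random.rand(100000, 3))
    #        rgb = shade(grids[0])
    ```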

    Rendering the scene (the crystal cave, the water, and the flames) wasn’t as difficult as it might seem. ‘The question was how black is black,’ Alexander says. ‘It’s a dark cave with two light sources without the fire. When we see the fire, we have internal scattering and refraction on the crystals, but the cave is so huge, it gets dimmer behind the island.’

    This is the third Harry Potter film that Alexander has worked on at ILM, and although the studio’s work represented only 160 shots, the shots required new technology and painstaking work. ‘One thing that was really nice is that we got involved at a design level,’ Alexander says. ‘Second, having all our shots in one sequence meant we could really dig in and do the whole thing. It was a moderate-sized Potter for us, but focusing in on one sequence with water, fire and a giant environment was really nice.’

    Warner Bros.’ Harry Potter and the Half-Blood Prince opens in U.S. theaters on July 15.

  • To South America and Beyond

    Up, the 10th feature from John Lasseter’s unbeatable hit-making studio, is described as a ‘coming of old age’ movie and travels to some uncharted territories.

    Sometimes the most amazing animated projects begin with just a single image. ‘We were goofing around and someone drew a house with a bunch of balloons floating it,’ says Pete Docter, who shares directing honors with Bob Peterson for Disney/Pixar’s latest feature animation, Up. ‘As animators, we aren’t always the most social people, although we can be, and sometimes we like to get away,’ notes Peterson. ‘So, we do what anyone would do, which is tie balloons to our house and fly away.’

    But who lives in the floating house? As Docter (Monsters, Inc.) and Peterson (story writer on Finding Nemo, Ratatouille and story artist on Toy Story 2) searched their imaginations for an unusual character, they realized that a senior citizen, the kind of character Walter Matthau might play or The New Yorker cartoonist George Booth might draw, had never starred in an animated feature.

    ‘The thought of a cranky old guy with brightly colored balloons made us laugh,’ Peterson says. And thus began the animated life of grumpy old Carl Fredricksen, retired balloon salesman and widower, in 2004. For the next three years or so, Docter, Peterson and storyboard artist Ronnie del Carmen pulled Carl back through his childhood and propelled him forward into his future. In Up‘s opening scenes, we learn that Carl and his wife had wanted to have a great adventure like their hero, the pilot Charles Muntz, but life always got in the way. So, when Carl finally adventures off in his balloon-powered house, he pilots toward his long-lost dream. An enhanced version of Pixar’s dynamics program made it possible for 10,000 balloons to interact in the wind and crash-land at exactly the right place.

    ‘In our first draft we had him crash land on a tropical island,’ Docter says. ‘But, that’s been done so often. The table-top mountains in Venezuela gave us everything we needed. They could be stuck somewhere in a way we’d never seen before.’

    ‘They’ includes Russell, an excitable eight-year-old boy who accidentally stows away on Carl’s floating house. Russell isn’t the only member of Carl’s new family, though.

    ‘Most characters have empty parts that the other characters can help fill,’ Docter says. ‘Carl lost his wife and he’s not connected to life in any exciting way. Russell is all excited about everything, and he doesn’t have a father. But, Carl is a tough guy to turn. We needed more than just one character.’

    The other characters include a 13-foot tall exotic bird and a goofy dog named Dug who talks via a high-tech collar that translates his thoughts into words. ‘Pete and I are lifelong dog lovers and big fans of fantasy movies that keep you surprised,’ Peterson says. ‘So when the idea of a talking dog came along, we decided to try it. It’s more emotional to hear the thoughts of a dog, and the collar afforded us a way to have natural dog behavior while their thoughts were coming out.’

    Dug, a Golden Retriever/Lab mix voiced by Peterson, has a big nose and an extra floppy tongue. He’s a nerd and eager to please. Though when we meet the rest of the talking dog pack, led by Alpha, a Doberman also voiced by Peterson with an electronic assist; Beta, a Rottweiler; and Gamma, a Bulldog, we soon realize they would rather hunt than please.

    ‘We barely exaggerate anything with the dogs,’ says Scott Clark, supervising animator. ‘The way we animate them is dog-like. But, they have almost human expressions built into their rigs that add a little flavor. I loved cracking the code with the dogs.’

    Odd Bird

    Kevin the bird, on the other hand, a giant flightless creature with an orange beak, long blue neck and purple tail feathers, gave the animators a way to stretch their wings.

    ‘We gave it realistic motion, but it can do silly things,’ Clark says. ‘We had a lot of fun animating Kevin. It cradles Russell like a baby and throws him up in the air like a monkey. Pete came up with the idea of having the bird’s eyes always stuck in the middle to give it a dumb quality.’

    For the human characters, Pixar leaned toward caricatured versions of real people, and those caricatured designs affected the performances. ‘Ricky and I tend to be attracted to simplified characters,’ Docter says, referring to production designer Ricky Nierva.

    For example, Carl, voiced by Ed Asner, is only three heads tall. He has a square head, square liver spots, and stocky arms and legs. His default expression is a scowl. ‘He’s an old man who has shriveled into his suit,’ Clark says. ‘He’s so short, it looked like he didn’t have elbows and knees, so for certain poses we lengthened his arms and legs in the rig. It’s like the kind of cheat we’d do in drawn animation.’

    The caricatured design also had an impact on cloth simulation. ‘We have an amazing tailoring system, but we’ve never had a character quite this pushed,’ Docter says. ‘We wanted the cloth to look real, but we didn’t want a ton of wrinkles when he bent his knees. It took a lot of back and forth.’

    To create the 78-year-old’s performance, the animators concentrated on ‘less is more’ as well. ‘The key to his success was not moving him,’ Clark says. ‘I remember something Chuck Jones told me when I was an intern: animators most of the time think of over-pushing things. You can exaggerate. But sometimes, ‘exaggerated’ is how much you don’t move.’ Sometimes, for example, the animators would leave Carl’s face sculpted into one pose and just have him blink.

    Russell, by contrast, is in constant motion. Pixar found Jordan Nagai, who provides Russell’s voice, through a multi-city casting call that resulted in 400 voices to consider. ‘[Jordan] was talking about activities at home and he made me smile just by talking,’ Docter says. ‘But, he wasn’t an actor and wasn’t very interested in pretending. So, we did a lot of physical activities with him. I’d hold his arms and tell him to see if he could wriggle free while saying, ‘Let me go.’ Or, I’d have him run, run, run, up to a microphone before he’d say a line.’

    The animators loved the result. ‘We had the same quality as from an improv performance, or a Nick Park film,’ Clark says. ‘There was something about having a non-actor that makes Russell seem more human. I love the dogs, but I also love it when something quirky and human comes through in a character. That’s the hardest thing to get: true human acting and emotion. This film has it in spades.’

    The Simple Art of Blocking

    Working with Clark were three directing animators, Shawn Krause, Dave Mullins and Michael Venturini, who cast groups of animators by action and emotion: an animator who is particularly good with acting would work on subtext; another who excels with physics and weight would handle action shots.

    All the animators started by presenting a first blocking pass during dailies. ‘They show a basic series of poses,’ Clark says. ‘A palette of acting ideas. The best animators have an elegant sense of blocking. They use the fewest amount of controls and poses to tell the most.’

    Clark believes that constructing the scenes in this slow way through dailies builds the best performances. He uses a scene with Muntz, voiced by Christopher Plummer, as an example. ‘Something’s not quite right in Muntz’s head but we didn’t want to play him crazy, just suggest that in a subtle way. We did that through dailies, through animators seeing each other’s work. It’s like Brad Bird says, ‘The best animator is all of us together.”

    Pixar is known for creating animated films that tell new and interesting tales, that open audiences’ minds to an art of animation that extends beyond cartoons. Up is a masterful extension of that tradition.

    ‘This is some of the most fun I’ve had working on a Pixar film,’ Clark says. ‘I really enjoy the fact that we’re making new stories that will be around in 50 years, not just cranking out merchandise.’

    Disney/Pixar’s Up was the opening-night movie at this year’s Cannes Film Festival (May 13). The feature is currently playing in theaters nationwide.

  • Revenge of the Sci-Fi Victims

    Monsters vs. Aliens, DreamWorks’ latest epic, puts vaguely familiar characters from our favorite ’50s movies into a ginormous stereoscopic package.

    It all started with the monsters in Cannes. The monsters in a script, that is. Conrad Vernon, who had just finished directing the 2004 film Shrek 2 with Andrew Adamson and Kelly Asbury, was in Cannes promoting that film and was reading scripts during his down time. One of the scripts was based on the Rex Havoc comic-book series. ‘It had monsters in it,’ Vernon says. ‘But the concept, sci-fi horror, wasn’t what I wanted. I thought it would be great to do a film from the monsters’ point of view, not the people’s point of view.’ The greenlight team at DreamWorks liked that approach enough to assign an artist to the project.

    Meanwhile, Rob Letterman, who had directed the 2004 film Shark Tale with Bibo Bergeron and Vicky Jenson, had begun working on an idea for an animated feature with a Dirty Dozen type of theme. DreamWorks suggested marrying the two concepts: monsters from the 1950s with prisoners given a suicide mission in return for the promise of freedom. The merged result became Monsters vs. Aliens, directed by Vernon and Letterman.

    ‘Monsters versus aliens was always part of the concept,’ Vernon says. ‘We put the two concepts together and had the monsters locked up for the past 50 years in a secret military facility. When an alien menace attacks, none of the military branches can do anything about it. So, they create a team of monsters to throw at the alien menace as a Hail Mary pass.’

    First, Vernon and Letterman needed to pick some monsters. After starting with a dozen, they reduced the potential cast to five: the creature from the Black Lagoon, the 50-foot woman, the blob, the human fly and Godzilla. Then they decided to create their own versions of those monsters instead. ‘We decided, first, because of all the rights problems it would be easier to come up with our own,’ Vernon says. ‘And, second, we wanted to.’

    The creature from the Black Lagoon merged with King Kong to become The Missing Link (Will Arnett). The human fly morphed into Dr. Cockroach, Ph.D. (Hugh Laurie). Insectosaurus (Jimmy Kimmel) is a 350-foot grub that can shoot silk out its nose. Susan Murphy from Modesto, also known as Ginormica (Reese Witherspoon), is a 49-and-a-half-foot woman. And, B.O.B. (Seth Rogen), a clever abbreviation for Bicarbonate Ostylezene Benzoate, is a one-eyed gelatinous mass created by accidentally mixing a genetically altered tomato with ranch-flavored dessert topping. General W. R. Monger (Kiefer Sutherland) sends the monsters on their mission at President Hathaway’s (Stephen Colbert) request.

    Dave Burgess led the team of animators who brought the characters to life, a core group of 35 animators that swelled to 50 during the final weeks of production, with the help of five supervising animators. ‘Generally, we have the supervising animators each run a sequence with a team of six to nine animators,’ Burgess says. ‘When they finish one sequence, they move on to the next. But, we were getting the story in bits and pieces. We still had each supervisor run a sequence, but we cast their teams based on how much work they had and how many people they needed. People like to work with their gang, but this time, the animators benefited from working with each of our talented supervisors who have strengths in different directions.’

    Riggers based the controls for most of the characters roughly on PDI/DreamWorks’ generic ‘Man A’ rig. ‘Once you know how to use the rig for one, the controls are similar for the others,’ Burgess says. The tricky bits were the tails. Link and Insectosaurus both have long tails and Burgess wanted the tails to drag on the ground, so the crew devised a way to stick the tails to the ground plane.
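
    The article doesn't detail how that constraint was built; one simple way to picture ‘sticking a tail to the ground plane’ is to clamp the last few joints of the chain to the ground height, as in the hypothetical Python sketch below. It illustrates only the idea, not the PDI/DreamWorks rig.

    ```python
    # Hypothetical sketch: pin the trailing joints of a tail to a flat ground plane
    # (y = 0) so the tail reads as dragging rather than floating. Not the actual rig.
    def pin_tail_to_ground(tail_joints, ground_height=0.0, pinned_count=3):
        """tail_joints: list of (x, y, z) positions ordered root -> tip."""
        pinned = []
        for i, (x, y, z) in enumerate(tail_joints):
            # Only the last few joints drag; the rest stay where the animator put them.
            if i >= len(tail_joints) - pinned_count:
                y = ground_height
            pinned.append((x, y, z))
        return pinned

    # A five-joint tail whose tip joints snap down to the ground.
    tail = [(0, 2.0, 0), (1, 1.4, 0), (2, 0.8, 0), (3, 0.5, 0), (4, 0.3, 0)]
    print(pin_tail_to_ground(tail))
    ```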

    In addition, Burgess asked the riggers to do a full body simulation for the giant slug to add rippling that would help create believable weight and mass. Giving the right weight to Susan, the giant woman, was also a challenge. ‘We didn’t want her to lumber like a Godzilla,’ Burgess says. ‘But, she isn’t a normal-sized person, either.’ Because we see Susan in her normal size as well as her Ginormica expansion, the riggers and animators needed to make sure she stayed on model through the size shifts.

    Dr. Cockroach was Burgess’s favorite character to animate. ‘We could explore the contrast between a brilliant, elegant man in control and out of control,’ Burgess says. ‘He has a huge cockroach head, like a toothpick with a giant olive on top, so he was fun design-wise.’ For the alien Gallaxhar (Rainn Wilson), riggers devised an FK/IK system to help control his six tentacles and four eyes.
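
    An FK/IK system generally lets an animator dial each limb between hand-posed forward kinematics and goal-driven inverse kinematics. The Python sketch below shows only that generic blend with made-up joint angles; it is not DreamWorks' tentacle rig, and a production rig would blend quaternion rotations rather than raw Euler angles.

    ```python
    # Generic FK/IK blend sketch (hypothetical values); not DreamWorks' actual rig.
    def blend_fk_ik(fk_pose, ik_pose, blend):
        """blend = 0.0 -> pure FK (hand-posed), 1.0 -> pure IK (goal-driven).
        Poses are per-joint rotation angles in degrees for one tentacle."""
        return [fk + (ik - fk) * blend for fk, ik in zip(fk_pose, ik_pose)]

    fk = [10.0, 25.0, 40.0, 15.0]   # animator's posed angles, joint by joint
    ik = [ 5.0, 60.0, 20.0, 30.0]   # angles an IK solver chose to reach a target
    print(blend_fk_ik(fk, ik, 0.5)) # halfway between the two behaviors
    ```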

    B.O.B. was the most challenging because of his gelatinous blob-like qualities. ‘He’s a character and a fluid simulation,’ Burgess says. ‘I don’t think anyone has created a character that elaborate and flexible, with a head and base, but no body.’ Animators could work with a stand-in version of B.O.B. composed of blue rings with an eyeball in the middle; he turned into a blob once the crew rendered him.
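
    The stand-in workflow amounts to keeping two representations of the same character: a cheap proxy the animator poses interactively, and the heavy geometry that only appears at render time. The class below is a hypothetical Python sketch of that split, not DreamWorks' pipeline.

    ```python
    # Hypothetical sketch of an animation stand-in vs. render-time geometry swap.
    class Character:
        def __init__(self, name, proxy_geo, render_geo):
            self.name = name
            self.proxy_geo = proxy_geo      # e.g. stacked blue rings plus an eyeball
            self.render_geo = render_geo    # e.g. the simulated gelatinous blob

        def geometry(self, for_render=False):
            return self.render_geo if for_render else self.proxy_geo

    bob = Character("B.O.B.", proxy_geo="blue_rings_with_eye", render_geo="sim_blob")
    print(bob.geometry())                  # what the animator sees in the viewport
    print(bob.geometry(for_render=True))   # what the renderer gets
    ```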

    For the style, the team settled on a compromise between the broadness of animation in Madagascar and the naturalism of Shrek. ‘We kept going back to Chuck Jones,’ Burgess says. ‘We’d stay in a pose as long as we could and then explode out and end up in another great attitude. The more we could milk those beautiful poses, the more successful the acting.’

    Because DreamWorks authored the film in stereoscopic 3-D from the beginning, animators viewed all their dailies in stereo. ‘We had to be super careful,’ Burgess says. ‘Stereo is not forgiving at all for cheated eye-lines, and hands almost touching had to be right next to each other.’
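
    One way to see why stereo leaves no room for cheats: on-screen parallax is a direct function of depth, so two hands that read as touching in a flat frame can sit at visibly different depths once each eye gets its own camera. The sketch below uses a standard pinhole-camera approximation with made-up numbers; it is not DreamWorks' stereo toolset.

    ```python
    # Approximate horizontal screen parallax (as a fraction of frame width) for a
    # point at `depth`, with cameras `interaxial` apart converged at `convergence`.
    # All distances in cm; negative values read in front of the screen plane.
    def screen_parallax(depth, interaxial=6.5, convergence=400.0,
                        focal=35.0, filmback=36.0):
        return interaxial * (focal / filmback) * (1.0 / convergence - 1.0 / depth)

    hand_a = screen_parallax(depth=395.0)   # just closer than the convergence plane
    hand_b = screen_parallax(depth=410.0)   # just farther than it
    print(hand_a, hand_b)  # opposite signs: one hand pops forward, the other sits back
    ```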

    Stereo 3-D was also a learning process for the directors. ‘We’d gone three to four months into production when Jeffrey [Katzenberg] said that we would do the film in 3-D,’ Vernon says. ‘3-D didn’t change the story; everything is in support of the story. This is kind of a perfect movie for 3-D. Our main character is [almost] 50 feet tall and there’s no better way of showing that than by seeing that depth go away. You can really feel that General Monger is tiny, and she is gigantic.’

    Creating a film that merges two eyes (two cameras) to create one deep image is a good analogy for the working relationship between directors Vernon and Letterman, who had to merge two different ideas, monsters fighting aliens and the ‘dirty dozen,’ into one story.

    ‘If you’re leading a team with two visionaries in charge, you have to get that vision straight,’ Vernon says. Judging by the final results, they certainly did.

    DreamWorks Animation’s Monsters vs. Aliens is currently playing in theaters nationwide.

  • Eyetronics Lends EFX Insurance to Bulletproof Monk

    Over the last year or so, it has become a standard practice in action films to scan and create digital models of actors and objects for effects shots that cannot be filmed. An even newer development is the use of scans to anticipate the unexpected in post-production.

    Visual effects supervisors for MGM’s Bulletproof Monk used Eyetronics’ 3D scanning services (www.eyetronics.com) to ensure that all of their bases were covered when creating effects for the film. Eyetronics scanned four of the primary actors in the movie and provided three 3D models for visual effects sequences.

    A lighter side of martial arts

    Bulletproof Monk is being described as Crouching Tiger, Hidden Dragon meets The Matrix, but with a lighter side. The film tells the story of the Monk, played by Chow Yun-Fat, a Zen-calm martial arts master whose duty has been to protect a powerful ancient scroll that holds the key to unlimited power.

    Now faced with finding the scroll’s next guardian, the Monk’s quest brings him to America. According to an ancient prophecy, the Monk’s successor is a charming, street-tough young man named Kar, played by Seann William Scott. As the Monk instructs Kar in the ways of a protector, the unlikely duo become partners in shielding the scroll from Strucker, a relentless power-monger played by Karel Roden, who has been chasing it for 60 years. Amidst a flurry of high-flying acrobatics, martial arts action, and quick-witted humor, this comic odd couple has to work together to keep the scroll safe.

    An ounce of prevention

    John Sullivan, Bulletproof Monk’s visual effects supervisor, went into the film knowing first-hand the value of having digital replicas of actors. While he was coordinating visual effects for another movie last year, a studio he was working with decided to tone down the violence in a scene that showed close-ups of a main character’s face covered in bloody make-up. Sullivan’s team had to remove the make-up from the character’s face in post-production. But painting out the make-up left very little of the actor’s face intact.

    “We would have had to model the actor’s face from scratch and match-move and light the model over the actor’s bloody face,” says Sullivan. “We ended up leaving what looked like grease smears on the actor’s face, which wasn’t the best visual solution. Because of that, I decided to scan the faces of principal characters in future movies so we can have a bit of insurance and possibly avoid a similar situation.”

    A quick and painless process

    Sullivan had heard about the Eyetronics scanning service from other visual effects professionals, and he liked the portability of the company’s scanning technology.

    For Bulletproof Monk, he needed facial scans of Yun-Fat and Roden. In the past, that would have meant hiring service bureaus with large, stationary scanners. The actors and the entire supporting crew, from hair and make-up to wardrobe and lighting personnel, would have had to travel to the scanning company’s facilities, usually in New York or Los Angeles.

    Sullivan’s 3D artists evaluated numerous scanning options to determine which would best support their needs. They chose Eyetronics, in large part because the company’s technicians could bring their equipment to the film’s set and do the scans between takes.

    “Scanning four actors took about an hour,” says Sullivan. “The entire process was pretty quick and painless. Not only did Eyetronics’ models meet our technical needs, but since their system is portable, we were able to save time and money by not having to leave the set in Toronto.”

    The fountain of youth

    Eyetronics first used its ShapeCam system to scan Roden in old age make-up. ShapeCam is a hand-held scanning system that consists of a digital camera and specially designed flash devices mounted on a lightweight frame. It allows Eyetronics technicians to freely move around persons or objects, capturing dimensional and texture information by simply taking pictures.

    Eyetronics used ShapeCam to capture several angles of Roden. Sullivan also wanted a scan of Roden without the make-up, so while it was being removed, Eyetronics scanned Yun-Fat, Scott and Victoria Smurfit, who plays Nina in the film. MGM decided in post-production that only the models of Roden and Yun-Fat were needed.

    The scans of Roden and Yun-Fat were processed at Eyetronics headquarters, where they were made into 3D models with both wire frame and texture information. Animators at a subcontracted effects house imported the models of Roden into 3ds max and Maya to create the final scene in which Strucker transforms from an old man to his younger self.
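
    The generic idea behind such a transformation is a morph (blendshape) between two scans that share vertex order: each vertex slides linearly from its position in the old-age scan toward its position in the clean scan. The Python sketch below shows only that math with tiny stand-in meshes; it is not the subcontractor's actual 3ds max or Maya setup.

    ```python
    # Hypothetical morph between two scans with matching topology (same vertex order).
    def morph(old_verts, young_verts, t):
        """t = 0.0 -> old-age scan, t = 1.0 -> clean scan. Vertices are (x, y, z)."""
        return [
            tuple(o + (y - o) * t for o, y in zip(ov, yv))
            for ov, yv in zip(old_verts, young_verts)
        ]

    # Two tiny stand-in "scans" with identical vertex order.
    old_scan   = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]
    young_scan = [(0.0, 0.1, 0.0), (1.1, 1.8, 0.4)]
    print(morph(old_scan, young_scan, 0.5))  # halfway through the transformation
    ```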

    Reshoots without film

    The model of Yun-Fat was used for insurance throughout production, as well as for visual effects in a scene in which the Monk flips over and lands on his feet on top of a car. After the scene was shot, the director decided he wanted it filmed from another angle. Neither actors nor stunt doubles could be used to film the scene because of the possibility of aerial wires becoming entangled with the camera in its new position.

    “We could have done the shot with stunt doubles in front of a blue screen,” says Sullivan. “But even then, we would have had to map Chow’s face over the double. Since we were faced with creating effects for the scene no matter what, we chose to go full CG with the entire scene.”

    Eyetronics’ model of Yun-Fat’s face was imported into Maya at Boy Wonder Visual Effects, where the final scene was created and rendered.

    Sullivan sees Eyetronics’ technology and methods as great assets in the world of post-production. “Modeling the human face from scratch would not have been logical for this movie,” he says. “It was important that our scanning solution have mobility, ease-of-use on location and speed. Eyetronics was able to meet all of these expectations and provided great models for our visual effects animators to work with.”

    Erin Hatfield (erinh@cramco.com) is a freelance writer specializing in computer graphics and other technology topics. She works at Cramblitt & Company in Cary, N.C.