Digital Domain Pandas Coke Spot
By Ron Magid
When Coca-Cola gave the Edge Creative agency carte blanche to pull out all the stops and prove that "Coke brings things to life," the result was a cutting-edge 13-spot international ad campaign that broke the conventional mold and led to some major technical breakthroughs as well.
Less peripatetic and more character-based than its sister spots, Pandas challenged Digital Domain's (DD) artists, animators, and software designers to make audiences believe in the rich computer-generated fantasy.
Much of the spot's complex character animation was achieved using Softimage software running on Silicon Graphics Indigo2 IMPACT systems. "The animators were responsible for their own characters so that there was a consistency from shot to shot and so the animators could put their own personal acting skills to work on their respective characters," says Pandas' CG supervisor, Daniel Robichaud. "Stephane Couture was in charge of the three bear kids, who really act like children and whose behavior is very different from that of the one we called Uncle Dopey, who's like a beer drinker. That character was handled by Dave Hodgins, who also set up the controls that enabled the animators to give life to these characters. Dan Fowler was responsible for the facial animation setup, and animated the one walking on all fours as well. There were just enough controls on the 'dashboard' to create facial expressions conveying the pandas' various feelings and emotions. Although the animators dealt with only a few controls, they could combine the movements of the mouth, eyes, ears, and nose to achieve a nearly infinite range of expressions."
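The "dashboard" Robichaud describes is in the spirit of blend-shape (morph-target) facial rigs, where a handful of animator-facing sliders each drive per-vertex offsets from a neutral pose, and their weighted sum produces the final expression. The sketch below is purely illustrative; the control names, toy mesh, and blending math are assumptions, not Digital Domain's actual setup.

```python
# Illustrative blend-shape sketch: a few named controls combine into one
# expression. NEUTRAL is a toy 2-D "face"; each target holds per-vertex
# offsets from that neutral pose. (Hypothetical example, not DD's rig.)

NEUTRAL = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]

TARGETS = {
    "smile":    [(0.0, 0.1), (0.0, 0.1), (0.0, 0.0)],
    "ear_perk": [(0.0, 0.0), (0.0, 0.0), (0.0, 0.2)],
}

def pose(weights):
    """Blend the neutral mesh with weighted target deltas (weights 0..1)."""
    result = []
    for i, (x, y) in enumerate(NEUTRAL):
        dx = sum(w * TARGETS[name][i][0] for name, w in weights.items())
        dy = sum(w * TARGETS[name][i][1] for name, w in weights.items())
        result.append((x + dx, y + dy))
    return result

# Two sliders yield one combined expression; with more targets, a few
# controls cover a very large expression space.
print(pose({"smile": 1.0, "ear_perk": 0.5}))
```

Because each control contributes independently, animators can mix mouth, eye, ear, and nose movements freely, which is what makes a small control set so expressive.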
Once the bears were animated, DD's software developers, including software engineer Darin Grant, were under the gun to devise realistic CG bearskin rugs to cover their digital stars. Considering that convincing digital hair had been the Holy Grail of CG animation since the medium's inception, the order was a tall one. Somehow, DD came through with the most convincing CG hair yet seen, created on Silicon Graphics Indigo2 IMPACT workstations.
"Unlike previous hair programs, which used either a volumetric or a particle-based approach, DD's shader created hairs that looked individually modeled because they were individually modeled. "We basically treated the hair as geometry," Robichaud says. "Each hair was indeed made of its own individual polygons, which is why it can self-shadow and cast shadows on each other and can be lit like every other object in the scene without any artifacts."