Wednesday, October 3, 2012

Tutorial - Facial Animation and Lip Syncing 02

Once again, it took me longer than expected to get my stuff done before I could post the next step, but anyway, here we go...

Facial Animation and Lip Syncing 02

In the first part of this tutorial I talked about the three different ways of rigging for facial animation:

1. Absolute morphs
2. Bone based
3. Blended morphs

As I also explained, I went for the blended morph setup.
After my experience with it, I wouldn't recommend it. The blended morph setup works by setting up asymmetrical morph targets for all the animatable parts of the face: the eyebrows, the eyelids, the cheeks and the mouth shapes.
The only bone-driven part in this scenario is the jaw bone, controlling the opening and closing of the jaw.

These are the two problems I encountered and the reason why I wouldn't recommend it:

1. The crease in the middle of the blended morphs, especially for the mouth shapes.
I created the following morph targets for both the left and the right side of the mouth:

  • Narrow
  • Wide
  • Happy
  • Sad
First I simply created the symmetrical morph targets in ZBrush, exported them as OBJ and imported them back into 3ds Max.
To create asymmetrical shapes for the left and right side, I used this workflow:

A) Clone the basemesh.
B) Add a morpher modifier to the cloned mesh.
C) Choose the morph target.
D) Select the vertices on one side of the morph target.
E) In the morpher modifier of the cloned mesh, tick the "Use Vertex Selection" checkbox and set the morph target to 100%.
F) Collapse the cloned mesh and rename it to the morph target's name and side (e.g. "Left Smile").
G) In the original mesh add that morph target to the morpher modifier.
H) Repeat for the other side.

Here´s a video of the process:

There is one annoying problem with this workflow though: since you selected half of the vertices for the morph target in step D), and you select the other half when you repeat the process for the other side, the vertices in the middle add up if you animate both morph targets to create a symmetrical pose (a smile, for example).
You can just select all of the vertices except for the ones in the middle (on the symmetry line), but then those vertices are missing from the morph.
I tried to select the vertices with a soft selection to get around this problem, but that didn't work either.
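To put the double-adding problem in numbers, here is a minimal sketch in plain Python with NumPy (not 3ds Max code, and all vertex data is made up for illustration). It shows why a hard left/right vertex selection overshoots at the symmetry line, and how splitting the morph's per-vertex deltas with weights that always sum to 1 would avoid both the overshoot and the "missing vertices" problem:

```python
import numpy as np

# Made-up vertex data: x runs across the face, the symmetry line is at x == 0.
base = np.array([[-2.0, 0.0], [-1.0, 0.0], [0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
smile = base + np.array([[0.0, 0.4], [0.0, 0.3], [0.0, 0.2], [0.0, 0.3], [0.0, 0.4]])
delta = smile - base  # per-vertex offset of the full symmetrical morph

# Binary split (the workflow above): a vertex is either "left" or "right".
# A vertex exactly on the symmetry line ends up in BOTH selections, so
# Left 100% + Right 100% moves it twice as far as the original morph did.
left_mask = (base[:, 0] <= 0).astype(float)
right_mask = (base[:, 0] >= 0).astype(float)
binary_result = base + left_mask[:, None] * delta + right_mask[:, None] * delta
print(binary_result[2])  # middle vertex: y is 0.4 instead of 0.2

# Weighted split: per-vertex weights with a smooth falloff around the seam
# that always sum to 1. Left 100% + Right 100% reproduces the morph exactly.
falloff = 1.0  # width of the blend zone around x == 0 (an arbitrary choice)
w_left = np.clip(0.5 - base[:, 0] / (2.0 * falloff), 0.0, 1.0)
w_right = 1.0 - w_left
weighted_result = base + w_left[:, None] * delta + w_right[:, None] * delta
print(np.allclose(weighted_result, smile))  # True
```

This is presumably also why the soft selection attempt didn't help: the two soft selections overlap around the seam, and their falloffs don't sum to 1 there.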

The second problem I encountered was that when blending several mouth shapes together while also animating the jaw, some of the combinations just gave me weird results.
While the narrow mouth shape looked good with the jaw in its idle position, it looked really bad when I opened the jaw. I could only think of two solutions to this problem:

1. Leave the jaw bone out and instead create morph targets for all of the jaw positions (open, closed, left and right).
But then I would probably still get weird results, because the different morph targets for the jaw positions and the mouth shapes would also add up.
2. Create corrective morph targets for the problematic combinations. I haven't actually tried this solution, because I already had to create some corrective morph targets for the jaw positions, and they are a real pain to make. Basically, you clone the base mesh, pick the clone as a morph target, move the jaw bone into the position where the problems occur (e.g. teeth poking through the cheek when opening the jaw), set the morpher modifier to "Automatically reload targets" and then work on the cloned mesh until the problems disappear. But since you're not working on a clone of the mesh in the problematic position, only on a clone of the mesh in its idle position, it's really hard to see which vertices to push around manually and where to push them to... So if anybody knows a better workflow to deal with these problems, I would be glad to hear it... ;)
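The math behind such a corrective shape is at least simple; only the sculpting is painful. Here is a minimal sketch (plain Python with NumPy, a single made-up vertex, not 3ds Max code) of the usual idea: drive the corrective delta by the product of both weights, so it only kicks in when the morph and the jaw pose are active at the same time, and leaves either shape on its own untouched:

```python
import numpy as np

# Made-up single vertex: a cheek vertex that misbehaves when the "narrow"
# morph and the "jaw open" pose are combined.
base = np.array([1.0, 0.0, 0.0])
narrow_delta = np.array([-0.3, 0.0, 0.0])    # offset of the "narrow" morph
jaw_open_delta = np.array([0.0, -0.2, 0.0])  # skinning offset when the jaw opens

def pose(narrow_w, jaw_w, corrective_delta=np.zeros(3)):
    # The corrective is weighted by narrow_w * jaw_w, so it is zero
    # whenever either the morph or the jaw pose is inactive.
    return (base + narrow_w * narrow_delta + jaw_w * jaw_open_delta
            + narrow_w * jaw_w * corrective_delta)

# The broken combined pose, and a made-up position that "looks right":
broken = pose(1.0, 1.0)
sculpted_fix = np.array([0.75, -0.1, 0.0])

# The corrective delta is simply the difference at full weights:
corrective = sculpted_fix - broken
print(pose(1.0, 1.0, corrective))  # matches the sculpted fix
print(pose(1.0, 0.0, corrective))  # unchanged when the jaw is closed
```

The "Automatically reload targets" workflow described above is effectively computing this difference for you; the hard part remains finding the sculpted fix while looking at the mesh in its idle position.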

I recently stumbled upon a script from Clovis Gay called "Morph UI creator":

In the demo video it looks like he got around those problems, and from what I can tell, it looks more like he is using absolute morph targets for all the different shapes. But I'll have to take a closer look and maybe ask him about it directly, because it seems to be working pretty well.
Apart from that... the script looks really great and takes care of a couple of steps I wanted to cover in the next part of the tutorial...

There are also other arguments for using a bone-based setup.
The setup might be more time-consuming if you're not already really comfortable with rigging and skinning, but in the end it's just more flexible, and you don't have to deal with morph targets at all.
Another argument to forget about morphs completely and go for a bone-based setup: some game engines can only deal with bone-based rigs.

I also created shapes for the different visemes, but I wouldn't recommend that either, for a couple of reasons:

1. I set up a PEN attribute holder to hold sliders for the different visemes:

The problem with this is that once you start keying with these sliders, you can't really filter the keys by viseme, so if you try to work only in the timeline, correcting the timing becomes pretty tedious.
Of course you can just work in the dope sheet or the curve editor, where selecting the keys is easier.
2. You have more control if you just use the mouth shapes you rigged before: visemes can look quite different depending on how the character talks. Is he smiling while talking? Talking quietly or loudly? Speaking very clearly, or mumbling into his (or her) beard?
3. I thought using predefined shapes for the visemes would make the lip syncing process faster, but in the end I still had to spend quite some time fiddling with the other controls for the jaw and mouth shapes to get it to look right, so I could have dismissed the visemes completely.

You can see a pretty good setup for lip syncing in the video mentioned above.
And you can still put a PEN attribute holder on your GUI to add presets for visemes you use regularly, if you need to.
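The preset idea boils down to very little data: each viseme is just a table of weights for the mouth-shape controls you already rigged. A hypothetical sketch in Python (all control and viseme names here are made up, and this is not the actual PEN attribute holder, just the concept):

```python
# Each viseme stored as weights for existing mouth-shape controls
# (all names are made-up examples, not from an actual rig).
VISEME_PRESETS = {
    "AI":  {"jaw_open": 0.7, "wide_l": 0.2, "wide_r": 0.2, "narrow_l": 0.0, "narrow_r": 0.0},
    "O":   {"jaw_open": 0.5, "wide_l": 0.0, "wide_r": 0.0, "narrow_l": 0.8, "narrow_r": 0.8},
    "MBP": {"jaw_open": 0.0, "wide_l": 0.0, "wide_r": 0.0, "narrow_l": 0.1, "narrow_r": 0.1},
}

def apply_viseme(rig, name, strength=1.0):
    """Write scaled preset weights onto the rig's individual controls.

    Because the values land on the underlying mouth-shape controls rather
    than on a dedicated viseme slider, every control stays individually
    adjustable afterwards - you keep the flexibility from point 2 above.
    """
    for control, weight in VISEME_PRESETS[name].items():
        rig[control] = weight * strength

rig = {"jaw_open": 0.0, "wide_l": 0.0, "wide_r": 0.0, "narrow_l": 0.0, "narrow_r": 0.0}
apply_viseme(rig, "O", strength=0.5)  # a quiet, half-strength "O"
print(rig["narrow_l"])  # 0.4
```

A preset applied this way is just a starting point you can then tweak per control, which is exactly what dedicated viseme sliders made tedious.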

As I said in the overview for this tutorial: this is by no means meant as a step-by-step guide, but I hope I can shed some light on the questions and answers I found while researching and experimenting with facial rigging for lip sync.

In the next part I'll try to cover some more steps, like setting up a proper UI for animation.


  1. Hi Samuel.

    I have a solution for mirroring morphs without plugins or scripts. You need to create or duplicate the head. This second head (I usually call it Head_Ref) doesn't have any modifiers, only Edit Poly, obviously. Create a side morph in another copy of your base head. When you finish your side morph, copy this head, and with that copy selected (side morph head 02), go to the Graphite Modeling Tools, open the polygon modeling menu and choose the Symmetry Tools. In its window, pick the main model (Head_Ref) and flip the symmetry... voilà! You have a perfectly symmetrical morph. It's important to reset the XForm of the original head before making the morphs.

    Sorry for my English, it's not my native language, hehe.

  2. Thanks! Now I'll have to try this too. But I'm currently busy with a real estate data room. Recommended!