
WEEK 10 - Facial Capture Successfully Put on a Custom Rig

Updated: Apr 6, 2020

With my character now re-exported from Maya with the correct naming conventions and sent into Reallusion 3DXchange, I can continue with the joint naming.
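For reference, the renaming itself can be scripted in Maya instead of being done joint by joint. A minimal sketch using maya.cmds, where the prefix being stripped is purely illustrative and would need to match whatever your rig actually uses:

```python
# Strip an auto-rigger prefix from every joint so the skeleton arrives
# in 3DXchange with clean, consistent bone names.
import maya.cmds as cmds

PREFIX = 'mixamorig_'  # illustrative; substitute your rig's actual prefix

for joint in cmds.ls(type='joint'):
    short_name = joint.split('|')[-1]  # handle joints returned as DAG paths
    if short_name.startswith(PREFIX):
        cmds.rename(joint, short_name[len(PREFIX):])
```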

First, you convert the skeleton to ‘Non-Standard Human’; this enables the Characterization Profile to be entered.

Here you select the picture of the bone you want to label on the right, then select the corresponding joint on the skeleton, exactly as in MotionBuilder.


Once all of the bones in the joint hierarchy are mapped, you can make the characterization ‘Active’. This then allows the facial bones to be mapped. My model has no facial bones other than the eyes; instead I am trying to use blend shapes, which I believe can be mapped later on elsewhere.

Once you are done with the joint renaming, hit ‘Convert’ to finalize the mesh. Now the character can be sent back into iClone with the ‘Apply to iClone’ button in the top right of the menu screen.

Now, back in iClone, you can see that I am able to apply any of the pre-made motions to the model. Here are some examples:


However, my project is focused on facial capture data, so I believe I need to look more closely at the ‘Expression Editor’ tab.

For the eyes, you have to manually move them into the desired position for ‘Rightward’, ‘Leftward’, ‘Downward’ and ‘Upward’, as seen below.


The eyes are simple to set up in my model, as they are attached to a joint. The rest of my model instead relies on blend shapes (unfortunately, Fuse and Mixamo’s auto-rigger can only create facial blend shapes on export rather than a full facial skeleton), so I will need to approach this slightly differently.
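Since the eye poses are joint-driven, it can help to dial the four look directions numerically in Maya first rather than eyeballing them. A small sketch, with hypothetical joint names and angles:

```python
# Rotate the eye joints to one of the four look directions used by
# 3DXchange's eye setup. Joint names and angles are illustrative.
import maya.cmds as cmds

EYE_JOINTS = ['LeftEye', 'RightEye']          # hypothetical joint names
LOOK_ANGLES = {'Rightward': ('rotateY', 30),
               'Leftward':  ('rotateY', -30),
               'Upward':    ('rotateX', -20),
               'Downward':  ('rotateX', 20)}

axis, angle = LOOK_ANGLES['Rightward']        # pick the pose being set up
for joint in EYE_JOINTS:
    cmds.setAttr('{}.{}'.format(joint, axis), angle)
```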


Applying Facial Blend Shapes

In Maya, you can see the facial blend shapes that have been exported from Mixamo (only possible when using Fuse CC) and edit them via the _ncl1_2 tab in the Attribute Editor (which can be renamed accordingly). In the Outliner, you can see each individual blend shape as an individual mesh.
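The same information can be pulled out with a quick script, which is a handy way of checking exactly what survived the export. A minimal sketch using maya.cmds, assuming the blendShape node kept its default name from the Mixamo export:

```python
# Print every target on the exported blendShape node with its current weight.
import maya.cmds as cmds

BLENDSHAPE = '_ncl1_2'  # node name from this export; adjust if renamed

targets = cmds.listAttr(BLENDSHAPE + '.weight', multi=True) or []
for target in targets:
    weight = cmds.getAttr('{}.{}'.format(BLENDSHAPE, target))
    print('{:<32s} {:.2f}'.format(target, weight))
```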


Here you can see how Fuse has done a pretty great job of automatically generating blend shapes for this model.




So, according to the tutorial provided by Reallusion, when exporting this FBX from Maya back into 3DXchange the blend shapes should follow automatically. Twice I have imported my FBX model straight from iClone into 3DXchange and the blend shapes have not followed; hopefully this error was caused by following an incorrect pipeline.
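One thing worth double-checking on the Maya side is that the FBX exporter is actually set to include blend shapes, since that option being off would produce exactly this symptom. A minimal sketch of the export settings via the FBX plug-in’s MEL commands (the output path is illustrative):

```python
# Ensure the FBX plug-in is loaded and that blend shapes and skin
# weights are included, then export the selected character.
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin('fbxmaya', quiet=True)
mel.eval('FBXExportShapes -v true')  # include blend shape targets
mel.eval('FBXExportSkins -v true')   # include skin weights
mel.eval('FBXExport -f "C:/exports/viking.fbx" -s')  # -s exports selection
```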


That worked! New options came up when importing the mesh, which meant that the animation was preserved. Now, under the ‘Face Setup’ tab, you can see all the different facial blend shapes that were also available in Maya.




To get this character into iClone, I first need to repeat the steps above, converting the character into a Non-Standard Human, then characterize the bones by clicking and dragging (I didn’t have to do this myself this time, as the Maya HumanIK profile was available in the ‘Presets’ drop-down). Next, I tick ‘Active’ and map the eyes.

Once done, I can convert the skeleton, which takes me back to the Face Setup tab.

Once in the Face Setup tab, I can see all of my blend shapes down the side of the Expression Editor. This is how I will be able to map my expressions ready for live facial capture!



As can be seen on the left of the above image, the blend shape names were all lost during the transfer. Luckily, you are able to rename the blend shapes in the Face Setup tab. This took a while, but should be really useful for the next task.
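If this ever needs redoing, it might be quicker to fix the names on the Maya side before export by re-aliasing the blendShape weight attributes, in the hope that readable names travel with the FBX. A minimal sketch, with the node name and target names purely illustrative:

```python
# Rename blend shape targets by re-aliasing their weight attributes.
import maya.cmds as cmds

BLENDSHAPE = '_ncl1_2'           # adjust to your blendShape node's name
NEW_NAMES = {0: 'BrowsUp_Left',  # target index -> readable name (illustrative)
             1: 'BrowsUp_Right',
             2: 'MouthOpen'}

for index, name in NEW_NAMES.items():
    cmds.aliasAttr(name, '{}.weight[{}]'.format(BLENDSHAPE, index))
```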


Using the Expression Editor, I then mapped every blend shape to its corresponding pose. For the visemes, I do not have blend shapes specifically designed for each one, so I made rough representations of each using the blend shapes available. This can be seen below.
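As a concrete illustration, each viseme approximation is just a weighted mix of the generic shapes that do exist. A hypothetical sketch of one such mix in maya.cmds, with made-up target names and weights:

```python
# Approximate an 'Oh' viseme as a weighted blend of generic mouth shapes,
# since this rig has no dedicated viseme targets.
import maya.cmds as cmds

BLENDSHAPE = '_ncl1_2'          # adjust to your blendShape node's name
OH_VISEME = {'MouthOpen': 0.6,  # hypothetical target names and weights
             'LipsPucker': 0.8,
             'JawDown': 0.3}

for target, weight in OH_VISEME.items():
    cmds.setAttr('{}.{}'.format(BLENDSHAPE, target), weight)
```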











iClone and Facial Capture


Finally, I send the model back into iClone and open the ‘MotionLive’ plugin. My character can be seen in the drop-down, and when I check the Mapping panel my blend shapes can be seen, ready for motion capture data.


The system works! As can be seen in the video below, I was able to stream real-time facial capture onto my Viking model now that my expressions are mapped out using facial blend shapes. This is a really significant moment in my project, as it proves my super quick and efficient pipeline works!

I can now effectively create any character or model and put real-time facial capture data onto it using iClone. I may still be able to complete Fenrir!


Here you can see the equipment in action:


Eyebrow Problems

On loading up my save file from yesterday, some kind of animation is being applied to the eyebrow in iClone. When I send the model to 3DXchange, there is no such animation.

Even when re-importing the model (which looks fine in 3DXchange) into a clean file, the eyebrow is still lowered. This affects all of the facial capture data, no matter what I do. It is almost as if the lowered brow is being recognized as the base pose for the face. When I go into the Face Key editor, you can see that my ‘default key’ is set to this odd frown.



If I set a new key with the face in the correct orientation, it overwrites the bad default pose; however, the overall blend shape is still wrong. This is particularly noticeable in the eyelashes, which pull away from the eyelid mesh.



On a positive note, this tool is really useful, and I can see a lot of interesting potential uses later in the project. For example, if I like the way my actor sounds but the emotion wasn’t quite right in the performance, I could use a facial key to bring his appearance closer to my needs without having to edit the data frame by frame.


Still, this doesn’t fix the issue, and after trying to find solutions for a whole day, I’m going to have to go to my last resort: redoing this process from scratch. This way, I’m hoping that if I have accidentally baked something into the animation in iClone, it will be removed. Luckily, it shouldn’t take too long now that I know what I am doing, but it is still frustrating nonetheless.


Before re-mapping all of the data, I did a quick test to see if this fixed the issue, and it has. So I’m going to continue with this approach.




It is fixed! This sets me back half a day or so, but I can catch up this weekend by looking further into streaming from iClone into Unreal.

