WEEK 7 - iClone, Faceware, Headshot (and a little mesh work)
- Genevieve Myhan
- Mar 10, 2020
- 5 min read
Updated: Mar 14, 2020
Fixing errors on the final high-poly mesh:
Whilst continuing with the fine detailing on Fenrir, I noticed a few major errors with the mesh. Around the mouth in particular were polygons that behaved strangely when I tried to interact with them.
On the high-poly mesh below I wanted to change how the folds looked in the deep creases, removing them almost completely. However, when I began to build up the mesh in the folded areas, some poly tearing occurred that I was not able to smooth out.
[Images: poly tearing appearing in the deep creases around the mouth]
It was as if the mesh had a small hole in it that I was unable to edit. I figured that this issue was down to the topology of the model, in particular the way ZRemesher had created these three areas.
[Image: the odd topology ZRemesher created in these areas]
To fix this odd topology I re-ran ZRemesher, freezing the subdivision levels of my mesh so I would be able to re-project the high-frequency detailing onto the new topology.
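For anyone curious what that projection step is conceptually doing, here is a minimal Python sketch of the idea using the trimesh library. This is my own illustration of closest-point projection, not what ZBrush actually runs internally, and the file names are placeholders for my own exports:

```python
# Minimal sketch of re-projection: every vertex of the clean ZRemeshed mesh
# is snapped to the closest point on the old detailed surface, so the new
# topology inherits the old shape. File names are placeholders.
import trimesh

high_poly = trimesh.load("fenrir_highpoly_detail.obj")  # old mesh, full detail
remeshed  = trimesh.load("fenrir_zremeshed.obj")        # new clean topology

# For each vertex of the new mesh, find the nearest point on the old surface...
closest_points, distances, triangle_ids = trimesh.proximity.closest_point(
    high_poly, remeshed.vertices
)

# ...and move the vertex onto it, transferring the surface shape across.
remeshed.vertices = closest_points
remeshed.export("fenrir_reprojected.obj")
```

ZBrush does something like this per subdivision level when it re-projects the detail back on, which is why freezing the levels first matters.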
In the end the new topology looked like this:
[Images: the new topology after re-running ZRemesher]
The mesh errors that carried over during the projection could then be removed easily using the Smooth tool.
[Images: projection errors cleaned up with the Smooth tool]
Now that this issue has been fixed across the mesh, I can continue with the high-frequency detailing without any more unexpected issues.
Reallusion / iClone / Faceware
At this point in the project I think it’s a great time to finally write up some of my experiments with iClone (in particular Faceware and Live Link). I’ve been delving in and out of the product over the past few weeks, trying to understand the basics of the software before really putting it to use this week.
[Image: the Reallusion Hub, with the required add-ons circled in red]
After installing the Reallusion Hub there are a few add-ons that are needed in order to use Faceware in the iClone application. Luckily, for the free trial, all of these are available (circled in red above). The only downside of the free trial version is the set of limitations that can be seen here:
[Image: the free trial limitations]
I don’t think this should affect me too badly, though, as I will mainly be using iClone to import data into Unreal; it is work done and rendered in iClone itself that will be watermarked or scaled down in resolution.
Reallusion has a huge database of tutorials for setting up iClone (https://mocap.reallusion.com/iclone-motion-live-mocap/tutorial.html). Following this particular string of courses, I created a short video clip using some pre-made assets and motion capture data built into the software. Here I familiarised myself with the interface, the use of cameras, actors, lighting, scene creation and importing animation. The tool felt like a really intuitive mix of Maya and Unreal, and was easy to understand and pick up after using the tutorials.
(I didn't want to render out the clip, to save time, so instead see this short clip showing the interface, timeline and real-time render of the scene.)
Getting LIVE working (Faceware)
First I had to download and enable the Motion LIVE plug-ins for iClone (see above), allowing access to the Motion LIVE window in iClone. All super easy stuff.
[Image: the Motion LIVE window in iClone]
The harder part is now using facial capture data to drive a generic rig. There are two ways I can do this: using LIVE (real-time facial capture), or recording data and then importing it into Motion LIVE. The latter is something I am looking at alongside this project with the custom face rig and GoPro.
First, a character needs importing into my scene in iClone. It needs to be fully rigged and have its face mapped out (or named) so that the data has something to be transferred onto. This is very similar to how MotionBuilder needs you to characterise a skeleton in order to place motion data onto it, and it works in much the same way. (So far I have only looked into this briefly, as I have been using pre-made characters in iClone, which come with this mapping pre-built.)
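To make the "mapping" idea concrete, here is a minimal Python sketch of the concept: a lookup table from the tracker's expression channels to the rig's blendshape names, so that each frame of captured weights lands on the right targets. Every channel and blendshape name here is hypothetical, not Faceware's or iClone's actual identifiers:

```python
# Hypothetical mapping from tracker expression channels to rig blendshape
# names. Real Faceware/iClone profiles use their own identifiers; all names
# here are illustrative only.
FACE_MAP = {
    "browRaiseLeft":  "BrowL_Up",
    "browRaiseRight": "BrowR_Up",
    "jawOpen":        "Jaw_Open",
    "smileLeft":      "MouthL_Smile",
    "smileRight":     "MouthR_Smile",
}

class DummyRig:
    """Stand-in for a characterised rig; it just records the weights."""
    def __init__(self):
        self.weights = {}

    def set_blendshape(self, name, value):
        self.weights[name] = value

def apply_frame(frame_weights, rig):
    """Push one frame of tracked weights (0.0 to 1.0) onto the mapped rig."""
    for channel, weight in frame_weights.items():
        target = FACE_MAP.get(channel)
        if target is not None:          # channels with no mapping are skipped
            rig.set_blendshape(target, weight)

# One frame as a tracker might report it mid-performance:
rig = DummyRig()
apply_frame({"jawOpen": 0.7, "smileLeft": 0.3}, rig)
print(rig.weights)   # {'Jaw_Open': 0.7, 'MouthL_Smile': 0.3}
```

This is also why pre-made iClone characters "just work": the lookup table has already been filled in for you.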
Next you have to connect iClone to the Faceware Realtime application by pressing the green LIVE button. Once active, you can see how the data is being tracked in the data inspector, as well as edit the mesh in real time as your actor is performing.
[Image: Faceware Realtime tracking driving the character in iClone]
Now you are able to preview the data straight onto the mesh, as can be seen in these two tests.
The second test (below) is my actor, Max, running through the ‘quick brown fox’ test I like to use during facial animation. The results aren't amazing, but they are much better than my earlier tests, as I created a small (and very temporary) stand so the camera had a better angle and lighting.
The Makeshift Setup:
[Image: the makeshift camera stand]
Using Character Creator: Headshot
Another tool found in the Reallusion kit is Headshot. Within Character Creator, Headshot uses AI to create digital humans from photographs. As can be seen below, I have imported Max to use as a quick test, and the results are quite spooky.
[Image: the Headshot character generated from a photo of Max]
This mesh was created in only a few minutes, fully rigged and ready to be used in iClone for facial capture. The potential for this tool is amazing!
The next step is to refine the mesh. The photo I took was not the best: as can be seen, the harsh shadow along the right side of Max’s face has been baked into the texture map. After looking at the Headshot guidelines (https://manual.reallusion.com/Headshot_Plugin/ENU/Headshot_Plugin_for_CC/1.0/ID_PG_Auto.html), I need it to be more like a passport photo.
Now, Max’s face is a lot skinnier than the one generated; however, this is easily fixed. The face shape itself can be edited using the Sculpt Morph tool under Shape Adjustment, and the body can be edited in the same manner.
[Image: adjusting the face shape with the Sculpt Morph tool]
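Under the hood, sliders like this are typically simple linear morph targets: each slider scales a stored set of per-vertex offsets, and the scaled offsets are summed on top of the base shape. A tiny numpy sketch of the general technique, with made-up morph names; my own illustration, not Character Creator's actual code:

```python
import numpy as np

# Base head: four placeholder vertices (a real mesh has tens of thousands).
base = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])

# Each morph target stores per-vertex offsets from the base shape.
# The morph names are invented for the example.
morphs = {
    "cheeks_thin": np.array([[ 0.0, 0.0, 0.0],
                             [-0.2, 0.0, 0.0],
                             [ 0.0, 0.0, 0.0],
                             [ 0.0, 0.0, 0.0]]),
    "jaw_narrow":  np.array([[ 0.0,  0.0, 0.0],
                             [ 0.0, -0.1, 0.0],
                             [ 0.0,  0.0, 0.0],
                             [ 0.0, -0.1, 0.0]]),
}

def blend(base, morphs, sliders):
    """final = base + sum(slider_value * that morph's offsets)."""
    result = base.copy()
    for name, value in sliders.items():
        result = result + value * morphs[name]
    return result

# Dragging a "thinner cheeks" slider to 0.8 just scales its offsets:
skinnier = blend(base, morphs, {"cheeks_thin": 0.8})
```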
If you have ever played The Sims 4, this tool feels remarkably like that game's character creation tool. It’s intuitive, and it's unbelievably easy to accomplish great results in a really short amount of time.
The Pro version of Headshot (Headshot Morph 1000+) takes this idea even further, integrating sliders for more fidelity in altering facial features and allowing even more character customisation. (https://www.reallusion.com/ContentStore/Character-Creator/Pack/Headshot-Morph-1000/)
However, using these adjustments adds a watermark to the viewport and means I am no longer able to save my character or project file (which, unfortunately, I found out at the end of the process). I will have to stick to using the basic tool whilst experimenting.
So… it turns out that under the trial version you cannot export or save any characters you have made using Headshot (or in any of the software). This is really disappointing, as it means I won’t be able to use any of this to create an asset that can be imported into Unreal or iClone.
To prove the pipeline would work, however, here you can see the automatic mesh topology and bone structure that the software creates:
[Images: the automatically generated mesh topology and bone structure]
I could see this tool being unbelievably useful for production, letting a single person generate a really nice engine- and animation-ready character in a matter of hours (the one below, which I unfortunately couldn't save, took only 15 minutes). With the added use of iClone and motion capture technology the tool becomes even more powerful: super-quick pre-production models can be used in conjunction with Unreal to set up virtual actors (of the actual actor themselves!) in minutes, by one person. From my past motion capture modules, I know that would usually take a small team weeks.