
WEEK 11 - To Unreal and successful use of LiveLink

Updated: Apr 11, 2020

Setting up the Plugins


Firstly, I need to make sure that the LiveLink plugins are installed in both programs: one for iClone and one for Unreal.

To check the iClone install worked, a new tab called ‘Unreal Live Link’ should now appear in the Plugins tab.

In Unreal, a plugin called Live Link can now be activated in the plugins manager.



In Unreal I can now open the Live Link manager tab from Window → Live Link, as seen below.



This is where I can manage the link between iClone and Unreal. At the moment, when I try to add my iClone source there is no iClone option in the plugin. This is because there are some extra steps involved in setting up iClone so that it will recognize my Unreal project.


For this setup I followed these two really helpful guides from Reallusion:


A basic breakdown of the steps is as follows:

Firstly, navigate to the iClone LiveLink plug-in download for Unreal and select the version of Unreal your project is using; at the time of writing there are plug-in versions available for Unreal 4.20 to 4.24. Copy this ‘Plugins’ folder.



Now navigate to your Unreal Engine’s Plugins folder (making sure it is the engine version you are using for your project) and paste the contents of the downloaded Plugins folder into it.



Next paste the plugins folder again, but this time into your project folder.



As explained in the walkthrough, this makes sure that ‘...each project can use different but correct versions of iClone Live Link plug-in.’
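Purely as a note to self, here is that copy step written out as a small Python sketch. All the paths are placeholders for wherever the download, engine and project actually live on your machine, so treat it as an illustration rather than anything official:

```python
import shutil
from pathlib import Path

# Placeholder paths -- adjust to your own download location, engine version
# and project folder.
livelink_download = Path(r"C:\Downloads\iCloneUnrealLiveLink\Plugins")
engine_plugins = Path(r"C:\Program Files\Epic Games\UE_4.24\Engine\Plugins")
project_plugins = Path(r"D:\UnrealProjects\VikingProject\Plugins")

def copy_plugins(src: Path, dst: Path) -> None:
    """Copy every plug-in folder from src into dst, creating dst if needed."""
    dst.mkdir(parents=True, exist_ok=True)
    for plugin in src.iterdir():
        if plugin.is_dir():
            shutil.copytree(plugin, dst / plugin.name, dirs_exist_ok=True)

# Paste the downloaded Plugins content into both the engine and the project,
# so each project can keep its own matching plug-in version.
copy_plugins(livelink_download, engine_plugins)
copy_plugins(livelink_download, project_plugins)
```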




Now when I re-open my Unreal project, the iClone LiveLink plugin is available to activate, as well as to use as a source in the Live Link manager tab.


But in order to have the complete pipeline I also need to install the CC and iClone Auto Setup Plugin. You can find the plugin download here:

Unlike the LiveLink plugin, the documentation for this step is a little harder to find, but once you do find it the setup process is basically the same as above. However, this time both the Content and Plugins folders need to be pasted into your Unreal folders.


Here you can see the new plugin available on opening Unreal.

Now that I have the plugins set up, I can move on to seeing if I can set up my Viking.


First, I have re-exported my Viking from the iClone 7 trial (because it is a trial version I am limited in the size of my textures, which could be a quality issue later on in the pipeline). I have done this as I believe I need to re-import my Viking character into the scene with the new CC plugin. By exporting my character from iClone I should also already have my bone posing and blend-shapes set up correctly.

In the export options there is also an ‘Unreal’ option in the target dropdown, which means the FBX export will be optimized for use in Unreal.




Next I have dragged the FBX file (ONLY) into my Unreal project, which has brought up this new dialogue box. This shows that the CC plug-in is working. Because I have imported from iClone I can use these settings; otherwise they would not be available.

Now the import options need to be correctly set. For this I followed this setup tutorial, again provided by Reallusion. Having imported a lot of characters and animations into Unreal before, I felt this would usually be an area I could figure out for myself; however, there are a few options that differ from what I would usually use.






Note: Do not import Materials or Textures.

Once done, hit Import All.


I now have my Viking in Unreal, textured and rigged ready for animation. In the tutorial video you can really see the power of using Reallusion’s Character Creator, as extra details in the skin and hair can be transferred straight to Unreal. For example, you can see subsurface scattering in the nose and hyper-realistic detail in the skin and hair of the character.




It is a real shame I am unable to use the programme for this project.


LIVE LINK


With my character imported using the CC setup, I should be able to link everything.

Opening the Live Link setup, I can now see the iClone source. An important thing to note is that the two meshes need to have exactly the same name, otherwise the link will not work. (Mine do not, so I will have to fix that now.)
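If you would rather not rename things by hand in the Content Browser, a couple of lines of Editor Python will make the names match. This is only a rough sketch: it assumes the Python Editor Script plugin is enabled, and the asset path and subject name are made up for illustration.

```python
import unreal

# Made-up asset path and subject name -- replace with your own. Renaming the
# asset by hand in the Content Browser works just as well.
mesh_asset_path = "/Game/Viking/Viking_03"   # skeletal mesh as imported
iclone_subject = "Viking_03_Export"          # name of the iClone Live Link subject

# Rename the imported skeletal mesh so it matches the iClone subject exactly.
new_asset_path = mesh_asset_path.rsplit("/", 1)[0] + "/" + iclone_subject
if unreal.EditorAssetLibrary.does_asset_exist(mesh_asset_path):
    unreal.EditorAssetLibrary.rename_asset(mesh_asset_path, new_asset_path)
```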



For a while I was a little stuck, as my newly imported character just wasn't linking with Unreal. I now realise that I completely missed the most important step (found here: https://www.reallusion.com/iclone/live-link/unreal-engine/tutorial.html).


In order to correctly set up my Live Link character from iClone to Unreal, I need to transfer the character over via iClone. This can be done from the transfer section of the Unreal Live Link tab.



Once you hit ‘Transfer File’, a command prompt will open and run through the process of setting up the character link automatically in Unreal. This can take a couple of minutes, as mine did. By using this transfer method the character’s Blueprints and more are completed automatically, saving a tonne of time in the pipeline.

Next you need to re-do the CC setup by selecting the new skeletal mesh and clicking CC Setup. This edits the skin textures, making them look much nicer. It works best when using characters from the iClone Character Creator pipeline, so it is hard to show it working with my model (as mentioned earlier).


You can now also see where I was going wrong with manually importing my FBX file; even with the CC auto setup you would have to manually edit the LiveLink event graph after importing.


In this screen grab you can see all of the new options on the imported mesh highlighted in red.



Finally I am ready to use LiveLink with my character.

With iClone and my character linked, I can select Activate Link on the Live Link tab in iClone to start sending data over to Unreal.


In Unreal I can now see my linked scene in the Live Link management tab. The iClone subject Viking_03_Export is now driving the Viking_03_Export mesh in Unreal.


With everything linked up, I then opened the Motion Live options and managed to successfully stream facial capture data onto a custom model in Unreal.




Refining The Capture


When capturing facial data with iClone there are a number of ways to edit facial expressions to enhance the overall capture. This can be done at any point in production, as there are ways of editing expressions on both live and recorded data, as well as pre-production tools so that data can be better interpreted onto custom characters.


The first that I am going to explore is the expression mapping panel (which I briefly played with before and ended up with the dodgy eyebrow…). The expression mapping has a huge effect on how the data is translated during facial capture (explained in detail here: https://youtu.be/1m3QlrnqWXw?list=PLNV5zSFadPdlufCKxE-HmWHGGDsnneUn8), so you have to be careful what you edit as it could have unforeseen negative impacts on the data.


The expression mapping settings are stored as ‘.json’ files and can be accessed via the expression mapping panel.




By clicking the load and save options you can save configurations that you custom build for your characters.
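To get a feel for how these files work, here is a small hypothetical sketch of loading one, damping a single value and saving it back out as a custom configuration. I don't know the exact key layout Reallusion uses, so the 'BrowsUp' key and the flat name-to-weight structure below are assumptions purely for illustration.

```python
import json
from pathlib import Path

# Placeholder path -- point this at wherever the mapping files live in your
# iClone install folder. The key name and flat name-to-weight layout below
# are assumptions; the real schema may differ.
mapping_file = Path("Faceware.json")
custom_file = mapping_file.with_name("Faceware_VikingGrowl.json")

mapping = json.loads(mapping_file.read_text())

# Hypothetical tweak: halve a brow-raise weight so captured eyebrow movement
# comes through less aggressively on the custom character.
if isinstance(mapping.get("BrowsUp"), (int, float)):
    mapping["BrowsUp"] *= 0.5

# Save the edit as a new configuration, leaving the default file untouched.
custom_file.write_text(json.dumps(mapping, indent=2))
```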



Faceware has three default configurations that have been created for specific uses. (The location of these files seems to differ between the tutorial’s installation and my own, so I had to search the iClone install folder to find them.)

The mapping files differ hugely and can have some interesting effects if used in the wrong circumstances. For me it is best to use ‘Faceware.json’, as it acts as a default configuration for learning how expression editing works.

I think the profile I have been using until now is based on the expression mapping I made in 3DXchange. Here is the difference in capture data when using the two, compared side by side:



When I use my character there are a few main areas it has trouble with. The eyes tend to squint too much and the lips can clip into each other during speech.




For the squinted eyes, a good fix was to calibrate Faceware slightly differently. By over-exaggerating a certain pose, the system seems to compensate by doing the opposite on the model. For example, if I calibrate with my eyes squinted and my eyebrows raised, my model ends up with wider eyes and a frowning brow. This can likely be adjusted actor to actor to take into account how different faces may be read by the system.


For the expression mapping, in this example I tried to give my character a permanent ‘growling’ expression. For the performance piece he needs to seem violent or angry.

This took a while and was pretty hard to do as the different expressions all affect each other during capture. Pulling some odd faces for an hour whilst editing has left my face feeling a little sore….




Now I’m going to try to use the motion capture data of Max that I gathered before the lockdown came into effect. It is likely that this may be the only data I can use, so I may have to edit the footage a little to get the desired performance.


Using pre-recorded data in Iclone


For the next section of my workflow I am unfortunately separated from my brother, Max, who I had planned on using for the performance section of this piece. I have an idea for capturing some data via a Skype call, which I am going to experiment with at a later time; otherwise I may have to record myself for the facial motion and get the voice-over afterwards.

At the moment, however, I have some pre-recorded footage I took of Max whilst he was practising his lines with me back in week 3 or 4.



The caricature Max is portraying in the clip above is a little outdated relative to the current state of the project. The clip was designed specifically for Fenrir, whereas now I need a rougher, ‘Northern Viking’ sounding delivery. But for this test, the clip should do nicely.


In Faceware it is possible to change the camera input from a live source to an image sequence. Using Adobe Premiere, I exported the above clip into a usable sequence I can stream into Faceware. With the audio also exported separately, I can insert it afterwards for the automatic lip-syncing system.
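For anyone without Premiere, the same export can be done from the command line with ffmpeg (assuming it is installed and on the PATH); the file names and frame rate here are placeholders.

```python
import subprocess
from pathlib import Path

# Placeholder clip name and frame rate; assumes ffmpeg is installed.
clip = "max_lines_take01.mp4"
Path("frames").mkdir(exist_ok=True)

# Dump the clip as a numbered PNG sequence for Faceware to read as an image source...
subprocess.run(["ffmpeg", "-i", clip, "-r", "30", "frames/frame_%05d.png"], check=True)

# ...and pull the audio out separately so it can be dropped back in for the lip-sync pass.
subprocess.run(["ffmpeg", "-i", clip, "-vn", "-acodec", "pcm_s16le", "max_lines.wav"], check=True)
```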





For a first test this isn't too bad. The main issue appears to be frame rate; potentially my export is incorrect, or perhaps Faceware is struggling a little while OBS is running at the same time?
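To narrow down whether the export itself is the problem, one quick sanity check is to compare how many frames actually came out of the export with what the clip duration and frame rate predict. A rough sketch (the duration, frame rate and folder are made up):

```python
from pathlib import Path

# Rough sanity check on the exported sequence: if the frame count is well
# short of duration x fps, the export settings are the likely culprit rather
# than Faceware or OBS.
frames_dir = Path("frames")
clip_seconds = 42.0   # duration of the source clip (placeholder)
target_fps = 30       # frame rate the sequence was exported at (placeholder)

exported = len(list(frames_dir.glob("*.png")))
expected = round(clip_seconds * target_fps)
print(f"exported {exported} frames, expected about {expected}")
```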

Another issue is the lack of a proper pose for calibration. I tried to get a quick one in when he begins speaking but you can see how the eyes and brows begin to change and droop as the sequence progresses.

As a quick test I’m going to record my own brows over the top of Max’s data. This should fix the feel of the performance; however, it does pose a new question: how much of Max’s performance can I edit and record over before it’s no longer his?



 
 
 
