Week 13 - Creating The Final Scene and Performance in Unreal
- Genevieve Myhan
- Apr 22, 2020
Updated: Apr 26, 2020
This week I plan on completing the initial goals of this project by streaming my own performance piece into Unreal.
I am going to stream it onto three different characters to start with: an iClone preset, Lithariel, and the Viking. I will then compare the captures of all three to assess what has and hasn't worked in my pipeline. I'm also going to look again at the lip-syncing techniques in iClone; to achieve a better capture on my custom characters I need to assign viseme shapes in 3DXchange, something I haven't done yet due to time constraints.
Importing All Characters into Unreal
For the two female characters this was super simple: just a case of bringing them straight into Unreal from iClone using the Live Link plug-in, as shown in week 11.
The Viking, on the other hand, is in a separate iClone file and can't be imported into my current one due to limitations of the trial version of 3DXchange. Therefore I can run the two female meshes at the same time, but will have to apply the animation to the male mesh afterwards.

As you can see from the screenshot above, the iClone character definitely looks the best in terms of texturing. This is due to her built-in compatibility with iClone and Unreal; if I had access to the iClone marketplace there might be some more 'Viking-esque' pre-built characters I could use.
Live Link
First I'm testing some pre-made animations on the full rigs and streaming them into Unreal. Strangely, Lithariel is facing the wrong way in Unreal. This wasn't happening with the Viking, so I'm going to put it down to a bad import on my part; potentially the origin of one of my axes is incorrect. At the moment, however, this isn't a massive problem as I can just rotate her in iClone.
The test worked well and the motion looks good for both characters. Lithariel has some mesh issues, but that is to be expected as she doesn't have any cloth or hair physics set up. The reversed mesh is also strange, as her trajectory still follows the iClone animation. So long as she doesn't move from her spot, I should be able to work around this.
If I have time I will try and go back and fix this problem.
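(If I do end up fixing it on the Unreal side rather than in iClone, a simple yaw flip on the streamed actor should do it. A minimal sketch; `LithActor` is my own hypothetical name for a pointer to the Live Link character, not something from my project:)

```cpp
// Sketch of an Unreal-side workaround for the reversed mesh: rotate the
// streamed actor 180 degrees around the vertical (yaw) axis.
// 'LithActor' is a hypothetical pointer to the Live Link character.
#include "GameFramework/Actor.h"

void FixReversedFacing(AActor* LithActor)
{
    if (LithActor)
    {
        FRotator Rot = LithActor->GetActorRotation();
        Rot.Yaw += 180.f; // flip her to face the right way
        LithActor->SetActorRotation(Rot);
    }
}
```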
Scene Creation
For the scene itself I'm trying to create a stormy camp that looks deserted except for our main character, who will be telling the short story.
Using iClone I created this test scene, complete with environmental effects, which I should be able to link straight into Unreal. Below you can see the whole process sped up, as well as the end result.
The end lighting isn't great, but hopefully this is something I can improve in Unreal with the real-time renderer instead of iClone.
Scene Transfer to Unreal
Unfortunately (I should have checked earlier) I cannot export the scene I have created in iClone into Unreal due to my trial licence. In order to export FBX files out of iClone you have to be using the Pipeline version of 3DXchange.
This isn't an issue though, as there are plenty of free asset packs provided by Epic Games for Unreal. There are even some amazing maps from Paragon I could use sections of; they are game-ready, with lighting and atmospheric effects already taken care of.
Whilst creating my scene, the only thing I need to do is keep iClone linked up so I can correctly pose my character within both engines.
After a quick browse of the Unreal Marketplace, I found a number of packs currently available for free that will be perfect for my scene. The first one is 'Poplar Forest' by Tirido, which contains a forest path and gentle cliffside in different seasons.
Autumn -

Spring -

Summer -

And Winter -

For my scene the Winter one is perfect: a great moody atmosphere I can use for the story.
Importing the Actor

After re-installing the two plugins for Unreal version 4.24.3, I have successfully placed Lithariel into the Unreal scene. She looks great against the scenery!
The link itself is a little frustrating, as I need to correctly orient her in space within iClone, and with no reference point this is hard. I tried importing my Unreal scene as an FBX into iClone, but after two crashed attempts I may need to find an alternative.
By confining the export to just the rocks around the area I want to use, I can now see where my character sits in iClone and put her in the perfect spot for use in Unreal.

Again, she has imported into the scene backwards, but this shouldn't be too much of a problem for a static performance.


The link works well; the animation, however, could definitely be better. The reversed mesh in Unreal is frustrating and should be the first thing to look into, and I will also have to do some slight re-targeting on the body. But first I'm going to check that the facial capture works.
There were a few issues with importing the iClone Jade rig, so I had to re-create the project from scratch in an older version which I knew worked. This fixed my issue, and I made this short side-by-side comparison of both Jade's and Lithariel's facial data being streamed into the snow level.
Lithariel still has some graphical issues with her facial skinning, which is unfortunate. This is my own fault: I had to copy the skin weights from a low-poly model to a smoothed version and must have missed some parts of the face in clean-up.
If I still have time after completing at least one run-through of the scene, I'll go back and edit this in Maya. With this workflow it's nice to be able to fix things like that quickly and then bring them back into the final scene.

I felt that the scene was a little empty, so using some free Sketchfab models I have fleshed it out a tiny bit. The campfire can act as a secondary light source during the performance. I made it by creating a particle system Blueprint that can then be placed on the campfire FBX mesh.
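(I built this in Blueprint, but for reference a rough C++ sketch of the same idea is below. The function name, offset values, and asset references are my own hypothetical placeholders, not what's in my project:)

```cpp
// Rough C++ equivalent of the campfire setup: attach a fire particle
// system to the campfire's static mesh actor so the effect follows it.
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystem.h"
#include "Particles/ParticleSystemComponent.h"
#include "Engine/StaticMeshActor.h"

void AttachCampfireEffect(AStaticMeshActor* CampfireMesh, UParticleSystem* FireTemplate)
{
    if (!CampfireMesh || !FireTemplate)
    {
        return;
    }

    // Spawn the emitter attached to the mesh component, offset slightly
    // upwards (an assumed value) so the flames sit on top of the logs.
    UGameplayStatics::SpawnEmitterAttached(
        FireTemplate,
        CampfireMesh->GetStaticMeshComponent(),
        NAME_None,                          // no socket, attach to component root
        FVector(0.f, 0.f, 20.f),            // hypothetical vertical offset
        FRotator::ZeroRotator,
        EAttachLocation::KeepRelativeOffset);
}
```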

(Jade linked into the scene via iClone, playing in real time)
Unreal and Capturing the Performance
Another drawback of the trial version of iClone's Faceware is that you are unable to actually record the facial data, only preview it, as I have been doing so far.
This puts a bit of a dampener on my current workflow, as in order to test the viseme editing tool I need to record the data first. This shouldn't, however, halt the project altogether, as there may still be a way to record the performance data 'preview' from iClone via the Unreal Engine Sequence Recorder.
After creating a few cameras and setting them to my desired views, I now need to set the main one as my viewport for when I play the scene and record my data. I have done this using a simple Level Blueprint, which you can see below.

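(For reference, the same logic in C++ would look roughly like the sketch below; `MainCamera` is my own name for the camera actor reference, not something from the Blueprint:)

```cpp
// Rough C++ equivalent of the Level Blueprint: switch the player's view
// to a chosen camera actor, e.g. on BeginPlay, before recording starts.
#include "Kismet/GameplayStatics.h"
#include "Camera/CameraActor.h"
#include "GameFramework/PlayerController.h"

void SetMainViewCamera(UWorld* World, ACameraActor* MainCamera)
{
    APlayerController* PC = UGameplayStatics::GetPlayerController(World, 0);
    if (PC && MainCamera)
    {
        // Instant cut (blend time of 0) so the recording begins on the
        // correct view rather than blending from the default pawn camera.
        PC->SetViewTargetWithBlend(MainCamera, 0.f);
    }
}
```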
When in action it does this:
So now I should be able to use the Sequencer to record the data I am streaming into Unreal during the simulation.
Test001: unfortunately, Unreal crashed as soon as the Sequencer began recording. I'm going to troubleshoot by opening a completely new, empty level and importing my meshes.
Sequence Recorder
Here is my new blank scene with just Jade being streamed into it from iClone. I've also set up the camera to work as in the previous scene.

Next I open up the Sequence Recorder and add Jade to the sequence. The Unreal documentation is a little vague on what adding different things to the Sequencer actually does, so I'm just going to test this first.

With her added, I link myself up to Faceware and start the recording preview in iClone, then hit Play and Record in Unreal.
It didn't crash, and in fact worked even better than I thought it would! After ending my recording, a new 'Test002_BLANK' sequence was added to my Content Browser. This can then be brought into my scene or opened as a Level Sequence.

In the image above you can see that, although no longer linked to iClone, a duplicate mesh (of what I have just recorded with the Sequencer) has been added to the scene. The playback buttons can be used to scrub the timeline, and I am still able to edit the environment around the actor.
The Sequencer also allows audio to be recorded in the engine, so there shouldn't be any lip-sync issues. I am also able to re-record animation and potentially layer takes on top of one another, like you might with animation layers in Maya. Overall this tool is going to be great once I get it working with my character in the scene.
A potential idea (which I want to write down before I forget) in case it crashes again: record the sequence for the actor in a blank scene, then import, or layer, the animation into the sequence for the snowy scene.
Layering Tests
For this test I am going to record two separate tracks (one for the body's idle, one for the facial data) and see if I can layer the two on top of one another.
To start with, I have recorded a long idle via Live Link from iClone to act as my base animation.
Using the Sequence Recorder I can create an asset from this animation that I can either save or bring into my scene immediately, instead of having to export an FBX from iClone (or MotionBuilder) and import it separately into Unreal each time.

Looking back on some of my previous work at Teesside, this tool would have been so useful during Journeyman or Beta Arcade. When using the motion capture system we could only review data on a base mannequin (that sort of looks like a Power Ranger), one take at a time. This data then had to be cleaned in Vicon Shogun, which could take a huge amount of time and resources. With this tool you can easily compare takes side by side on your character mesh, with the actor present during the shoot, to better communicate how you need the motion to look and feel. Not only is there immediate feedback on the work, but you could also export these FBX files from Unreal and send them to game teams as assets they can test in the engine immediately. The whole pipeline becomes smoother, with more opportunities for immediate feedback from all areas of the team.
For the next step I have created a new sequence named 'Blend_Test'. Within it I originally added my Live Link Jade mesh (Jade001) as an actor. When I added animation, however, playback wasn't working. I figured out this was because the mesh's Animation Mode was set to 'Animation Blueprint', meaning it cannot be controlled by the Sequencer.

To fix this I dragged in a new Jade skeletal mesh, renamed her 'Jade_Sequence', and changed her Animation Mode to 'Use Custom Mode'. Once inserted into a track, you can see that she is being influenced by the Sequencer.

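(The same switch can be made from code; a minimal sketch, assuming the character is a standard skeletal mesh actor:)

```cpp
// Rough C++ equivalent of changing the Animation Mode so the Sequencer
// can drive the mesh instead of an Animation Blueprint.
#include "Animation/SkeletalMeshActor.h"
#include "Components/SkeletalMeshComponent.h"

void MakeSequencerControllable(ASkeletalMeshActor* JadeSequence)
{
    if (!JadeSequence)
    {
        return;
    }

    USkeletalMeshComponent* Mesh = JadeSequence->GetSkeletalMeshComponent();

    // 'Use Custom Mode' in the Details panel corresponds to
    // AnimationCustomMode here; it frees the mesh from its Anim Blueprint.
    Mesh->SetAnimationMode(EAnimationMode::AnimationCustomMode);
}
```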
Next I created three different animations using the link: Happy, Sad, and Chattering. My aim is to overlay these animations on top of each other to potentially replicate the re-recording feature seen in iClone.
The results can be seen below.
The results show that the Sequencer has the potential to work similarly to iClone's system, with animations able to be played on top of one another at varying override strengths. Combine this with editing animations so that only certain facial bones are recorded at a time, and I have a system very similar to the one I was planning to use.
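(The blending above is all done in the Sequencer editor, but there is a related runtime technique for layering an overlay animation over a base: playing it through an animation slot as a dynamic montage. A rough sketch of that alternative, assuming an Anim Blueprint with a slot I've hypothetically named 'FaceOverlay':)

```cpp
// Runtime analogue of layering an overlay on top of a base animation:
// the base idle keeps playing while the overlay blends in through a slot.
// This is not the Sequencer workflow above, just a related technique;
// 'FaceOverlay' is a hypothetical slot name in the Anim Blueprint.
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"

void PlayFacialOverlay(USkeletalMeshComponent* Mesh, UAnimSequenceBase* OverlayAnim)
{
    UAnimInstance* AnimInstance = Mesh ? Mesh->GetAnimInstance() : nullptr;
    if (AnimInstance && OverlayAnim)
    {
        // Blend the overlay in and out over 0.25 seconds.
        AnimInstance->PlaySlotAnimationAsDynamicMontage(
            OverlayAnim, TEXT("FaceOverlay"), 0.25f, 0.25f);
    }
}
```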
In the Animation Blueprint I am also able to turn off or remove certain tracks of animation in a recorded piece. This can be used to great effect when merging animations in the Sequencer, to fine-tune what I want replaced and what I don't.

Overall, I feel confident enough to start putting together my final short scene.