Live Link Face (Epic Games)

Whether or not you use the Unreal Engine in your project, you can use the Live Link Face app to record motions as CSV files and retarget these to your character in Blender via Faceit. This presents a quick option to clean up a recorded motion in Blender before using it in Unreal.

Read more on Live Link Face here: Recording Facial Animation from an iPhone X

Motion Types

Besides the shape key values, the app also records rotation values for the head and eyes. To load these values along with the shape key animation, you will need to activate the respective motion types in the general settings.
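For context, here is a minimal sketch of what keying such rotation values onto a rig looks like in Blender's Python API. This is an illustration only, not Faceit's implementation; the armature and bone names are assumptions about your scene, and the per-frame values are placeholders standing in for the recorded channels:

```python
import bpy
from math import radians

# Sketch only: key recorded head rotation (in degrees) onto a head bone.
# "Armature" and "Head" are assumed names; adjust to your rig.
arm = bpy.data.objects["Armature"]
head = arm.pose.bones["Head"]
head.rotation_mode = 'XYZ'

# Placeholder per-frame (pitch, yaw, roll) values standing in for the
# rotation channels recorded by the app.
for frame, (pitch, yaw, roll) in enumerate([(5.0, -2.0, 0.5), (6.0, -1.5, 0.4)], start=1):
    head.rotation_euler = (radians(pitch), radians(yaw), radians(roll))
    head.keyframe_insert(data_path="rotation_euler", frame=frame)
```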

Image: Live Link Face App - Unreal Engine

Load Recorded Motion (CSV)

Live Link Face exports a .csv file that contains the raw animation data captured by ARKit during the recording. It is stored on the capturing device (e.g. the iPhone) along with a video of the actor. Simply send the saved motion from your device to your computer via file sharing or email and load it into your scene via the Faceit UI.
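If you want to inspect a recording before importing it, the CSV can be read with standard tools. A minimal sketch, assuming a header row with a timecode column followed by one column per ARKit shape; the file name and exact column names here are assumptions, so check the header of your own export:

```python
import csv

# Sketch: read a Live Link Face CSV and print one shape channel.
# "MySlate_1_iPhone.csv" and the column names are assumptions.
with open("MySlate_1_iPhone.csv", newline="") as f:
    for row in csv.DictReader(f):
        timecode = row["Timecode"]
        jaw_open = float(row["JawOpen"])  # shape values are normalized 0..1
        print(timecode, jaw_open)
```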

Tip

The CSV import functionality is very similar to the TXT import from Face Cap. You can use both apps in combination without a problem!

Import motion data from CSV files directly exported by the Live Link Face app. You can retarget the motion to the Control Rig within the operator.
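Conceptually, the import boils down to keying the recorded values onto your character's shape keys frame by frame. A minimal sketch of that idea in Blender's Python API, not Faceit's actual code; the mesh name is an assumption, and the shape keys are expected to follow the ARKit naming:

```python
import bpy

# Sketch: key one parsed value per frame onto an ARKit-named shape key.
# "FaceMesh" is an assumed object name; the shape keys must already exist.
obj = bpy.data.objects["FaceMesh"]
key_blocks = obj.data.shape_keys.key_blocks

def key_shape(name, value, frame):
    kb = key_blocks.get(name)
    if kb is not None:
        kb.value = value
        kb.keyframe_insert(data_path="value", frame=frame)

key_shape("jawOpen", 0.42, frame=1)
```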

Options (CSV Import)

  1. Import to new Action
    • If this is enabled, Faceit will create a new Action to hold the new Keyframes.
    • Otherwise you can specify a Shape Key Action from the scene.
  2. Mix Method
    • Overwrite: overwrite the whole animation with the new keys.
    • Mix: attempt to mix in the new keyframes, overwriting overlapping frames for the same shapes while preserving everything else (illustrated in the sketch after this list).
    • Append: append the keyframes to the end of the specified action.
  3. Start Frame
    • Specify a Start Frame for the new animation values. If the Append mix method is active, the start frame is read as an offset from the last frame in the specified action.
  4. Region Filter
    • The new importers let you specify facial regions that should be skipped during import. This makes it possible, for instance, to animate the mouth expressions via Audio2Face while animating the rest of the face via ARKit.
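To make the mix methods and the start frame behavior concrete, here is a small illustration on plain frame-to-value mappings for a single shape channel. This is only a sketch of the behavior described above, not Faceit's code:

```python
# Sketch: the three mix methods on one shape channel,
# represented as {frame: value} dicts. New keys use relative
# frame indices starting at 0.

def merge(existing, new, method, start_frame=1):
    shifted = {start_frame + f: v for f, v in new.items()}
    if method == 'OVERWRITE':
        return shifted                        # discard everything that was there
    if method == 'MIX':
        merged = dict(existing)
        merged.update(shifted)                # overwrite overlapping frames only
        return merged
    if method == 'APPEND':
        offset = max(existing) + start_frame  # start frame read as an offset
        merged = dict(existing)
        merged.update({offset + f: v for f, v in new.items()})
        return merged
    raise ValueError(method)

existing = {1: 0.0, 2: 0.5, 3: 0.0}
new = {0: 1.0, 1: 0.8}
print(merge(existing, new, 'MIX'))  # {1: 1.0, 2: 0.8, 3: 0.0}
```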