ARKit Performance Capture

ARKit meets Faceit

Faceit ships with a set of dedicated ARKit utilities that ease the process of performance capture and animation with ARKit.

  • ARKit Expression Preset
    • Create the 52 required shape keys.
  • Retargeting
    • Retarget the ARKit expressions to similar FACS-based expressions (e.g. CC3).
  • Control Rig
    • Create a slider-based bone Control Rig, specifically built for controlling the ARKit expressions.
  • Import
  • Realtime

What's That?

ARKit is Apple's augmented reality (AR) development platform for iOS mobile devices. ARKit Face Tracking 1 is a powerful and easy-to-use tool that ships with ARKit and runs on all iOS devices with a TrueDepth camera. It captures 52 micro expressions in an actor's face in realtime. The captured animation data can drive 3D facial expressions in CGI productions and games, or be streamed to an audience in realtime for VTubing or theatre performances. Any 3D character equipped with a set of shape keys resembling the 52 captured micro expressions can be animated with ARKit. In addition to creating the 52 required shape keys, Faceit provides a collection of tools that ease the performance capture process right inside Blender. A variety of apps (both free and paid) can be used to import captured motions or stream them to the 3D viewport in realtime.

52 ARKit Expressions

Faceit is specifically built to generate the 52 facial shape keys required for ARKit performance capture. Each shape key stores a distinctive micro expression, such as mouthRollUpper, eyeBlinkLeft or mouthDimpleRight. Only the combination of all 52 micro expressions results in good-looking animation.
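Under the hood, shape keys combine by linear blending: each key stores per-vertex offsets from the basis shape, and the capture weights scale and sum those offsets. The following sketch illustrates that math with plain Python; the vertex data and weights are made up for illustration and are not Faceit's internals.

```python
def blend_vertices(basis, targets, weights):
    """Linear blend-shape mixing.
    basis: list of (x, y, z) vertex positions.
    targets: {expression name: list of (x, y, z)} shape key positions.
    weights: {expression name: weight in 0.0-1.0} capture values.
    Returns the blended vertex positions."""
    result = [list(v) for v in basis]
    for name, weight in weights.items():
        if weight == 0.0 or name not in targets:
            continue
        for i, (bv, tv) in enumerate(zip(basis, targets[name])):
            for axis in range(3):
                # Each shape key adds its delta from the basis,
                # scaled by the captured weight.
                result[i][axis] += weight * (tv[axis] - bv[axis])
    return [tuple(v) for v in result]

# One vertex, two micro expressions firing at half strength each:
basis = [(0.0, 0.0, 0.0)]
targets = {"jawOpen": [(0.0, -1.0, 0.0)], "mouthSmileLeft": [(0.5, 0.0, 0.0)]}
print(blend_vertices(basis, targets, {"jawOpen": 0.5, "mouthSmileLeft": 0.5}))
# → [(0.25, -0.5, 0.0)]
```

Blender evaluates exactly this kind of weighted sum when several shape keys are active at once, which is why the 52 expressions can layer into one facial pose.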

Expressions

A list of the expressions can be found in the ARKit Documentation. Click on the individual expressions in the linked document to see a reference image of each expression.

ARKit shape keys generated with Faceit

You can use any Apple device equipped with the TrueDepth camera to capture your facial motion and retarget it to a 3D character in realtime, provided that the character is equipped with the 52 ARKit shape keys.

Hardware

Realtime shape key-based motion capture is supported by all mobile iOS devices that are equipped with the TrueDepth camera:

  • iPhone X, XR, XS, XS Max, 11, 11 Pro, 11 Pro Max, 12, ...
  • iPad Pro 3rd or 4th generation.

Other hardware is not supported. Check out the next section to see how to use the capturing capabilities of your device.

Software

The capturing capabilities are built into Apple's ARKit (Augmented Reality Kit). A multitude of apps that support capturing and recording facial motions is available, including:

Note

For now, Faceit has an interface for the Face Cap app and Live Link Face, but if you use another capturing application that you enjoy working with, I would be happy to hear about it. I will look into it and include it in the utilities if possible!

Face Cap App

Faceit provides an interface for the iOS app Face Cap. See the Face Cap Utilities section.
You can receive live motion or import recorded motions at the click of a button.

Live Link Face

Faceit provides utilities to import recorded CSV motions from the app. See the Live Link Face Utilities section.
If you are an Unreal Engine user, you can also use the Live Link Face app by Epic Games to receive motion data live in the Unreal Engine.
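A recorded capture of this kind is essentially a table: one row per frame, one column per blendshape weight. The sketch below shows how such a CSV could be parsed into per-frame weight dictionaries; the column names and layout here are illustrative assumptions, since each app's export format differs slightly in detail.

```python
import csv
import io

def read_capture_csv(text, timecode_col="Timecode"):
    """Parse a recorded facial-capture CSV into per-frame weight dicts.
    Assumes one header row naming the blendshape columns plus a timecode
    column; a given app's real export may use different column names."""
    frames = []
    for row in csv.DictReader(io.StringIO(text)):
        timecode = row.pop(timecode_col, None)
        # Every remaining column is a blendshape weight in 0.0-1.0.
        weights = {name: float(value) for name, value in row.items()}
        frames.append((timecode, weights))
    return frames

# Two frames of hypothetical capture data:
sample = ("Timecode,jawOpen,eyeBlinkLeft\n"
          "00:00:00:01,0.25,0.0\n"
          "00:00:00:02,0.5,1.0\n")
frames = read_capture_csv(sample)
print(frames[1])  # → ('00:00:00:02', {'jawOpen': 0.5, 'eyeBlinkLeft': 1.0})
```

From here, each frame's weights would be keyframed onto the matching shape keys in Blender, one keyframe per expression per frame.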

Facemotion3D

With Facemotion3D, you can capture facial expressions with an iOS app and communicate in realtime with 3DCG software on your PC. The recorded animation data can be exported as FBX, and VRM format files can be imported into the iOS app.
There is a free Blender add-on available.

Get the app here.
Find more information here.

Get the Blender Add-on here.

Note

The Blender add-on will be integrated seamlessly with the current Faceit workflow.

iFacialMocap

iFacialMocap is an app that, paired with standalone software (Windows and Mac only), lets you capture motions on the iPhone and send them to your computer in realtime. In addition to the standalone software, interfaces for Blender and Maya allow streaming motions to character geometry in realtime. Check out the website for more information and download links!

BMC

The app BMC - Blender Motion Capture can be used to record facial motion and send it to your computer via email. An add-on is provided to load the motion into your Blender scene.

ARKit Expression Names

ARKit Expressions
 eyeBlinkLeft
 eyeLookDownLeft
 eyeLookInLeft
 eyeLookOutLeft
 eyeLookUpLeft
 eyeSquintLeft
 eyeWideLeft
 eyeBlinkRight
 eyeLookDownRight
 eyeLookInRight
 eyeLookOutRight
 eyeLookUpRight
 eyeSquintRight
 eyeWideRight
 jawForward
 jawLeft
 jawRight
 jawOpen
 mouthClose
 mouthFunnel
 mouthPucker
 mouthRight
 mouthLeft
 mouthSmileLeft
 mouthSmileRight
 mouthFrownRight
 mouthFrownLeft
 mouthDimpleLeft
 mouthDimpleRight
 mouthStretchLeft
 mouthStretchRight
 mouthRollLower
 mouthRollUpper
 mouthShrugLower
 mouthShrugUpper
 mouthPressLeft
 mouthPressRight
 mouthLowerDownLeft
 mouthLowerDownRight
 mouthUpperUpLeft
 mouthUpperUpRight
 browDownLeft
 browDownRight
 browInnerUp
 browOuterUpLeft
 browOuterUpRight
 cheekPuff
 cheekSquintLeft
 cheekSquintRight
 noseSneerLeft
 noseSneerRight
 tongueOut
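For scripting against this list, the names above can be kept as a Python constant. The snippet below transcribes the table and adds a small left/right mirroring helper, which is handy when building or checking symmetric expression pairs; the helper itself is a hypothetical convenience, not part of Faceit's API.

```python
# The 52 ARKit expression names, in the order listed above.
ARKIT_EXPRESSIONS = [
    "eyeBlinkLeft", "eyeLookDownLeft", "eyeLookInLeft", "eyeLookOutLeft",
    "eyeLookUpLeft", "eyeSquintLeft", "eyeWideLeft", "eyeBlinkRight",
    "eyeLookDownRight", "eyeLookInRight", "eyeLookOutRight", "eyeLookUpRight",
    "eyeSquintRight", "eyeWideRight", "jawForward", "jawLeft", "jawRight",
    "jawOpen", "mouthClose", "mouthFunnel", "mouthPucker", "mouthRight",
    "mouthLeft", "mouthSmileLeft", "mouthSmileRight", "mouthFrownRight",
    "mouthFrownLeft", "mouthDimpleLeft", "mouthDimpleRight",
    "mouthStretchLeft", "mouthStretchRight", "mouthRollLower",
    "mouthRollUpper", "mouthShrugLower", "mouthShrugUpper", "mouthPressLeft",
    "mouthPressRight", "mouthLowerDownLeft", "mouthLowerDownRight",
    "mouthUpperUpLeft", "mouthUpperUpRight", "browDownLeft", "browDownRight",
    "browInnerUp", "browOuterUpLeft", "browOuterUpRight", "cheekPuff",
    "cheekSquintLeft", "cheekSquintRight", "noseSneerLeft", "noseSneerRight",
    "tongueOut",
]
assert len(ARKIT_EXPRESSIONS) == 52

def mirror_name(name):
    """Swap the Left/Right suffix of an expression name, if it has one."""
    if name.endswith("Left"):
        return name[:-4] + "Right"
    if name.endswith("Right"):
        return name[:-5] + "Left"
    return name  # symmetric expressions like jawOpen or browInnerUp

print(mirror_name("mouthSmileLeft"))  # → mouthSmileRight
```

A lookup like this is useful, for example, to verify that every Left expression in a character's shape key list has its Right counterpart.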



  1. In 2015 Apple acquired the motion capture company FaceShift. Not long after that, Apple released the facial performance capture solution that is now widely used throughout the industry.