FAQ - Frequently Asked Questions


Your question is not answered? Ask me directly!

Will Faceit work on my character?

Faceit will be able to process any character that meets the prerequisites mentioned in Requirements to Geometry. If you're sure that your character fulfils the prerequisites and it still does not work, then please contact me. I am actively developing Faceit and will try my best to help you!

What about anthropomorphic models?

Yes, it's also possible to rig wolves, sheep or dragons. If you run into problems, reach out!

Do I have to know rigging and animation?

Yes (at least the basics). Faceit will try to do it for you, but for optimal results you might have to clean up some weights, as in any other rigging process. You should know how to pose an armature in order to tweak the generated expressions. See the tips and tricks if rigging and weighting in Blender is new to you.

What is the difference between the various versions?

You can read about the differences between Faceit 1.7 and 2.2 here.

My character doesn't have round eyeballs!

Flat eye geometry is very common in anime-style models. The trick to properly rigging these models is to place the pivot point of the eye bones further inside the head. If the model is already rigged, you can use the dedicated pivot setup (copying the existing bone pivots).
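
To see why the pivot placement matters, here is a small geometric sketch (plain Python, no Blender required; the coordinates are purely illustrative). Rotating a flat eye surface around a pivot placed further inside the head sweeps it along a wider arc, so the gaze actually travels instead of the surface just spinning in place:

```python
import math

def rotate_about_pivot(point, pivot, angle):
    # 2D rotation of `point` around `pivot` (top-down view of an eye)
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * px - s * py, pivot[1] + s * px + c * py)

surface = (0.0, 0.0)      # a vertex on the flat eye surface
shallow = (0.0, -0.02)    # pivot barely behind the surface
deep = (0.0, -0.30)       # pivot placed further inside the head

angle = math.radians(30)  # look 30 degrees to the side
print(rotate_about_pivot(surface, shallow, angle))
print(rotate_about_pivot(surface, deep, angle))
```

With the deep pivot the vertex travels noticeably sideways, mimicking a natural gaze sweep; with the shallow pivot it hardly moves at all, which is why flat eyes look static or clip when pivoted on the surface itself.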

A button is greyed out! Why?

Some operators can only be used in a specific mode. Often it's enough to change to Object mode and the button will be available again. In other cases the operator might be greyed out because something else is required first. For example, before you can go back to landmarks you might need to go back to rigging. Please reach out if you are having problems.

Help! The Bind results are bad! How can I improve deformation quality?

There are multiple ways to improve the quality of deformation. Make sure to read the list below and try to improve the results on your own. If the quality is still below your expectations, reach out to me! I am always curious to improve the results for every possible geometry.

  1. The landmark setup directly affects the bone placement, which is probably the most important factor for good results. Make sure to look closely at the examples and read the setup tips.
  2. The assigned Faceit vertex groups (Main, Eyes, Teeth, etc.) affect the rigging and binding process. Take another look at the section Assign Vertex Groups and make sure to assign them correctly.
  3. Find a range of reference .blend files here. Make sure to use the 'Back to Landmarks' operator if you want to see the landmarks and vertex groups setup.
  4. Sometimes the automatic weights operator returns bad results. Please take a look at the tips and tricks for weight painting and smoothing weights.

No matter what, I can't get this expression to look like I want...

Instead of posing and weight painting, it's often easiest to use the non-destructive sculpting feature to quickly improve the quality of individual expressions. If you need a reference, take a look at the ARKit documentation here.

Why does the mouthClose expression look so weird after baking!?

Don't worry, it's supposed to look like this. The mouthClose expression is the inverse of jawOpen. After baking, activate only the jawOpen and mouthClose shape keys (sliders to max value) and you will be able to see/edit the mouthClose expression as you saw it before baking.
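
The inverse relationship can be illustrated with a tiny numeric sketch (plain Python; the single-vertex model and delta values are hypothetical simplifications, not Faceit's actual data). Because the mouthClose deltas are stored relative to the open jaw, viewing mouthClose alone looks broken, but jawOpen and mouthClose together reproduce the intended pose:

```python
basis = 0.0                  # lower-lip vertex height at rest
jaw_open_delta = -1.0        # jawOpen moves the lip down with the jaw
mouth_close_delta = 1.0      # mouthClose stores the inverse: lip back up

def evaluate(jaw_open, mouth_close):
    # relative shape keys add their deltas linearly on top of the basis
    return basis + jaw_open * jaw_open_delta + mouth_close * mouth_close_delta

print(evaluate(0.0, 1.0))  # mouthClose alone: lip pushed up, looks wrong
print(evaluate(1.0, 1.0))  # jawOpen + mouthClose: deltas cancel, lips meet again
```

Evaluated alone, mouthClose pushes the lip past its rest position, which is exactly the "weird" look after baking; combined with jawOpen at full value, the deltas cancel and the expression reads correctly.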

After Baking, the eyelids clip through the eyeballs. What can I do?

While the bone mechanics of the face rig allow animating curved motions, shape keys can only represent linear motion. This natural limitation of shape keys is the reason why the eye-blink motion can look slightly different in between the minimum and maximum after baking. The usual approach to 'fix' this clipping in shape key animation is to use an additional corrective shape key. The new shape key pushes out the eyelid mid-way down, so that the clipping no longer occurs. Its value should be controlled by a driver, so that you don't have to worry about it after a quick setup.

The corrective shape key eyeBlink_push_out is driven by the value of the eyeBlink shape key.
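
As a minimal sketch of such a driver (the expression and the name eyeBlink_push_out are illustrative, not Faceit's exact setup), the corrective key should evaluate to zero at both ends of the blink and peak mid-way, where the clipping occurs:

```python
def push_out_driver(eye_blink):
    # corrective value: 0.0 at open (0.0) and closed (1.0), maximal mid-blink
    return 4.0 * eye_blink * (1.0 - eye_blink)

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"eyeBlink={v:.2f} -> eyeBlink_push_out={push_out_driver(v):.2f}")
```

In Blender, this could be entered as a scripted expression on the corrective shape key's value, e.g. `4 * var * (1 - var)`, with the driver variable `var` reading the eyeBlink shape key value.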

Can I create expressions for multiple characters in one .blend file?

No, not without clearing the properties of the other process and breaking the non-destructiveness of the workflow. Instead, I recommend setting up a separate .blend file for each character. If you want to animate them together, you can link them into a final animation scene after setting them up, while keeping all options to change the shape keys in the rigging file. Read more here!

How can I animate a dialogue with multiple characters?

By using control rigs for each character. Each control rig stores the objects and target shapes for a specific character. By switching the active control rig, you can quickly animate each character individually. It's highly recommended to follow the Best Practices mentioned here!

Why doesn't Project Landmarks work!?

The landmarks are projected onto the main facial surface. Make sure you assign the Main vertex group to the correct surface (geometry).

Can I use Faceit alongside Auto-Rig Pro?

Yes! It doesn't matter if your character is already bound to an ARP armature or if you want to rig it afterwards. Faceit preserves all Object data on your models (including modifiers, vertex groups, shape keys, drivers...). On a side note: You cannot use the ARP face rig for the automatic facial expressions.

Can I join the Faceit Rig to the Auto-Rig Pro body rig?

Yes, there is a dedicated operator that automatically parents the face rig below a specified bone and, most importantly, merges the weights. Read more here!

Can I use a Rigify rig?

Faceit uses a Rigify bone structure and thus, all expression poses are compatible with other Rigify rigs. If your character is already bound to a Rigify armature with a face rig, you can skip the rigging process and head to the expressions tab directly.

Does Motion Capture really only work with Apple devices!?

The ARKit face capturing (formerly the Faceshift software) is closed source and exclusive to Apple devices with TrueDepth cameras. As of recent versions, you can also use Hallway Tile for live recording. If you own an Nvidia RTX graphics card, you can use the Audio2Face recorder to animate your characters. If you are an Android user, you can use Meow Face along with MoRec for live animation and recording.

My model already has ARKit expressions! Can I skip the rigging process?

Yes! You can skip the rigging process. See the Mocap-specific setup here.

My model has FACS based expressions (not ARKit naming). Can I use the ARKit mocap operators?

All motion capture operators in Faceit utilize the Target Shapes list when importing or recording animations. You can manually assign the target shapes and even specify multiple target shapes for one ARKit expression. Read more here!

Help! My viewport is slow!

There are multiple factors that contribute to the quality and speed of realtime animation in the viewport! Please read here for some tips on how to improve performance.