Live Link Face for effortless facial animation in Unreal Engine — Capture performances for MetaHuman Animator to achieve the highest fidelity results or stream facial animation in real time from your iPhone or iPad for live performances.
Capture facial performances for MetaHuman Animator:
- MetaHuman Animator uses Live Link Face to capture performances on iPhone, then applies its own processing to create high-fidelity facial animation for MetaHumans.
- The Live Link Face iOS app captures raw video and depth data, which is ingested directly from your device into Unreal Engine for use with the MetaHuman plugin.
- Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
- This workflow requires an iPhone (12 or above) and a desktop PC running Windows 10/11, as well as the MetaHuman Plugin for Unreal Engine.
Real-time animation for live performances:
- Stream out ARKit animation data live to an Unreal Engine instance via Live Link over a network.
- Visualize facial expressions in real time with live rendering in Unreal Engine.
- Drive a 3D preview mesh, optionally overlaid on the video reference on the phone.
- Record the raw ARKit animation data and front-facing video reference footage.
- Tune the capture data to the individual performer and improve facial animation quality with rest pose calibration.
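Epic doesn't publish the exact formula the app uses for rest pose calibration, but the idea can be sketched with a common normalization: subtract the performer's rest-pose value from each raw ARKit blendshape curve, then rescale so the remaining range still reaches 1.0. The function below is a hypothetical illustration of that math, not the app's implementation.

```python
def calibrate(raw: float, rest: float) -> float:
    """Remap a raw ARKit blendshape value (0..1) given its rest-pose value."""
    if rest >= 1.0:                      # degenerate rest pose: curve is always "on"
        return 0.0
    value = (raw - rest) / (1.0 - rest)  # subtract rest pose, rescale to full range
    return min(max(value, 0.0), 1.0)     # clamp back into 0..1
```

For example, a raw value of 0.5 with a rest-pose value of 0.2 remaps to 0.375, and any value at or below the rest pose clamps to 0.0.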
Timecode support for multi-device synchronization:
- Select from the iPhone system clock, an NTP server, or a Tentacle Sync device to connect with a master clock on stage.
- Video reference is frame accurate with embedded timecode for editorial.
Control Live Link Face remotely with OSC or via the MetaHuman Plugin for Unreal Engine:
- Trigger recording externally so actors can focus on their performances.
- Capture slate names and take numbers consistently.
- Extract data for processing and storage.
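The OSC control above can be driven from any machine on the network. The sketch below hand-encodes an OSC 1.0 message using only the Python standard library; the `/RecordStart` address with (slate, take) arguments and the default listen port 8000 follow Epic's documented OSC API for the app, but treat the exact addresses, argument layout, and port as assumptions to verify against the current documentation and the app's settings screen.

```python
import socket
import struct

def _osc_string(s: str) -> bytes:
    b = s.encode("utf-8") + b"\x00"              # OSC strings are null-terminated...
    return b + b"\x00" * ((4 - len(b) % 4) % 4)  # ...and padded to a 4-byte boundary

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC 1.0 message with int32 and string arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)      # big-endian int32
        else:
            tags += "s"
            payload += _osc_string(str(a))
    return _osc_string(address) + _osc_string(tags) + payload

def send(msg: bytes, phone_ip: str, port: int = 8000) -> None:
    """Fire a control message at the phone over UDP (port 8000 is an assumption)."""
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (phone_ip, port))

# e.g. send(osc_message("/RecordStart", "SceneA", 1), "10.0.0.12")
```

Hand-encoding avoids a dependency on a third-party OSC library for what is a single fire-and-forget datagram.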
Browse and manage the captured library of takes:
- Delete takes within Live Link Face, or share them via AirDrop.
- Transfer directly over network when using MetaHuman Animator.
- Play back the captured video on the phone.
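Takes can also export their blendshape curves as CSV for offline processing. The header layout assumed below (a `Timecode` column, a `BlendShapeCount` column, then one column per ARKit curve such as `jawOpen`) matches what the app has historically written, but inspect a real export before relying on it.

```python
import csv
import io

def max_curve_value(csv_text: str, curve: str) -> float:
    """Return the peak value a blendshape curve reaches across the take."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return max(float(row[curve]) for row in reader)

# Synthetic example of the assumed layout (real exports carry ~61 curves):
sample = """Timecode,BlendShapeCount,jawOpen,eyeBlinkLeft
00:00:00:00,61,0.00,0.02
00:00:00:01,61,0.35,0.01
00:00:00:02,61,0.12,0.90
"""
# max_curve_value(sample, "jawOpen") -> 0.35
```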
User Rating
4.67 out of 5 (6 ratings, in Australia)
5 star: 4
4 star: 2
3 star: 0
2 star: 0
1 star: 0
Reviews
Great app but current release is broken
Only leaving a review so I can notify the developers: something broke in the latest release, and both CSV export and streaming seem to be affected (missing values). There's a thread on the Unreal Engine forum with more information.
When it's working, the app is great though.
Great tool
Tongue tracking would be great.
Being able to use the iPhone's internal gyro for head rotation instead of another sensor would be amazing!
With the way head mounts are made, head rotation currently has to be captured separately from the head.
Damn great tool. Is there an open API on GitHub for this?
Bug - Some Left Right values are locked together
This is fantastic, I’ve already started making some characters to puppet. Unfortunately, from what I can tell, the app seems to lock the following Left/Right values together:
- browOuterUp_L and browOuterUp_R
- browDown_L and browDown_R
They output the same values regardless of raising or lowering either side of my face.
This is a shame because we lose a lot of expression without them. Can this be fixed?