MetaHuman with Live Link Face

In this demonstration I am using the Unreal Engine Live Link Face app on my iPhone 11 Pro Max to send live facial tracking information to the two characters in the modified Unreal Engine project "MetaHumans."

Epic released a sample project in mid-February 2021 that contains two new, fully rigged characters created with their soon-to-be-released MetaHuman Creator, a tool for creating lifelike digital human models for use in Unreal Engine environments.

In the sample project, the sample level sequence MetaHumanSample_Sequence was renamed to TTBL_metaHuman_001 to stop the sequence from playing automatically. Next, the LLink Face Subj and LLink Face Head properties are enabled on each character's Blueprint and set to receive input from the Live Link subject named iPhone.
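The Live Link Face app streams ARKit blendshape curves, one value per facial feature, normalized to the 0..1 range, and the character Blueprint maps those curves onto the MetaHuman facial rig. As a rough illustration of the kind of per-frame data involved, here is a minimal Python sketch (the blendshape names such as jawOpen and eyeBlinkLeft are real ARKit names, but the frame structure below is simplified and hypothetical, not the actual Live Link wire format):

```python
# Illustrative sketch of the per-frame data Live Link Face streams:
# ARKit blendshape curves, each normalized to the 0..1 range.
# (Simplified; the real Live Link protocol carries more metadata.)

ARKIT_BLENDSHAPES = [
    "eyeBlinkLeft", "eyeBlinkRight", "jawOpen",
    "mouthSmileLeft", "mouthSmileRight",
]

def clamp_curve(value):
    """Clamp a raw curve value into the 0..1 range ARKit uses."""
    return max(0.0, min(1.0, value))

def make_frame(subject, raw_values):
    """Build a simplified frame: subject name plus clamped curves."""
    return {
        # The subject name must match what the Blueprint's
        # LLink Face Subj property expects, e.g. "iPhone".
        "subject": subject,
        "curves": {name: clamp_curve(v)
                   for name, v in zip(ARKIT_BLENDSHAPES, raw_values)},
    }

frame = make_frame("iPhone", [0.1, 0.12, 1.3, 0.5, -0.2])
print(frame["curves"]["jawOpen"])         # 1.3 clamped to 1.0
print(frame["curves"]["mouthSmileRight"]) # -0.2 clamped to 0.0
```

In the real project this mapping happens inside the MetaHuman face animation Blueprint; the sketch only shows why each curve arrives as a named, normalized value tied to a subject name.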

Screenshot from iPhone

On the iPhone, the Live Link Face app is started and set to send its facial tracking data stream to the IP address of the computer. Head tracking is also enabled, so head movements come through along with facial expressions.
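Finding the right IP address to type into the app is the one step that trips people up on multi-adapter PCs. A quick way to see which IPv4 address your machine will use for outgoing traffic is a small Python sketch (the "8.8.8.8" target is arbitrary; a UDP connect sends no packets, it only makes the OS pick the outgoing interface):

```python
import socket

def local_ipv4():
    """Best-effort lookup of this machine's outward-facing IPv4 address.

    Connecting a UDP socket toward a public address never transmits a
    packet, but it forces the OS to choose the outgoing interface; that
    interface's address is what you enter in Live Link Face's target
    settings on the phone.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))  # no traffic is sent for a UDP connect
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # fallback when no network route is available
    finally:
        s.close()

print(local_ipv4())
```

Both phone and PC must be on the same network for the stream to arrive; if the MetaHumans stop moving, this address (or a firewall rule blocking the incoming UDP stream) is the first thing to check.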

Screenshot from iPhone

Screenshot of "Live Link" on iPhone

With Live Link Face running on the iPhone, and the modified MetaHumans project running on the PC, the setup is complete. Looking into the iPhone's camera and speaking or moving my face has the MetaHumans mimicking my actions.

Screenshot from Unreal Engine MetaHumans project. 

This video shows a picture-in-picture composition of my iPhone screen (green) with the running MetaHumans project screen (blue). As I move and speak, so do the MetaHumans. Very cool!

Screenshot from video

Software used:

Live Link Face: Unreal Engine iPhone app, v1.0.1
Sample project "MetaHumans.uproject," Feb 2021
Unreal Engine 4.26.1, Epic Games Unreal Engine

Hardware used:

Computer: AMD Ryzen Threadripper 1950X based, Windows 10 computer
16 cores (32 threads) with 32 GB of RAM, NVMe local disks and an RTX 2070 GPU

Video camera: Apple iPhone 11 Pro Max

More information on MetaHumans at: