Unreal Engine Releases 'Live Link Face' iPhone App for Real-time Facial Capture


Unreal Engine has released a new 'Live Link Face' app for iPhone that offers real-time facial capture.

Live Link Face streams high-quality facial animation in real time from your iPhone directly onto characters in Unreal Engine. The app leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting the data directly to Unreal Engine via Live Link over a network. Designed to work both on professional capture stages with multiple actors in full motion capture suits and at a single artist’s desk, the app delivers expressive and emotive facial performances in any production situation.


Collaborative virtual production is a particular emphasis of the app: multicast networking streams Live Link data to all machines in a Multi-User Editor session simultaneously to minimize latency. Robust timecode support and precise frame accuracy enable seamless synchronization with other stage components such as cameras and body motion capture. Live Link Face also integrates with Tentacle Sync, allowing it to connect to the stage’s master clock over Bluetooth so that its recordings line up in editorial with every other device from the shoot. Sophisticated productions can also use the app’s OSC (Open Sound Control) support, which lets external applications control the app remotely, for example to initiate recording on multiple iPhones with a single click or tap.
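
To make the OSC idea concrete, here is a minimal sketch of how an external tool might trigger recording on several phones at once. The OSC wire format (null-padded strings, 4-byte alignment, big-endian arguments) is per the OSC 1.0 spec, but the `/RecordStart` address, its arguments, and port 8000 are assumptions for illustration; check the app’s OSC settings for its actual address space and listen port.

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def build_osc_message(address: str, *args) -> bytes:
    """Encode a basic OSC message (int32, float32, and string arguments)."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        else:
            tags += "s"
            payload += _pad(str(a).encode())
    return _pad(address.encode()) + _pad(tags.encode()) + payload

def trigger_record(phone_ips, slate: str, take: int, port: int = 8000) -> None:
    """Send one hypothetical /RecordStart message to every phone via UDP."""
    msg = build_osc_message("/RecordStart", slate, take)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for ip in phone_ips:
            sock.sendto(msg, (ip, port))
    finally:
        sock.close()
```

Because OSC rides on plain UDP, a single control machine can fan the same packet out to every phone on the stage network, which is how “one click starts all devices” workflows are typically wired up.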

Live Link Face’s feature set goes beyond the stage and provides additional flexibility for other key use cases. Streamers benefit when performing at a desk rather than in a head-mounted rig and mocap suit: Live Link Face can include head and neck rotation data as part of the facial tracking stream, giving their digital avatars more freedom of movement with just the iPhone. Animators can also take advantage of the option to record both the raw blendshape data (CSV) and a front-facing video (MOV), each striped with timecode, to use as reference material for the performance if further adjustments need to be made in-engine.
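
As a rough sketch of what can be done with the recorded CSV: ARKit exposes each blendshape (e.g. `jawOpen`, `eyeBlinkLeft`) as a weight in the 0–1 range, so a per-take summary is a simple pass over the rows. The exact column layout below (a `Timecode` column plus one column per blendshape) is an assumption for illustration, not the app’s documented file format.

```python
import csv
import io

# Hypothetical excerpt of a recorded take; real files may name columns differently.
SAMPLE = """Timecode,jawOpen,eyeBlinkLeft
16:04:12:07,0.12,0.03
16:04:12:08,0.35,0.02
"""

def peak_weights(csv_text: str) -> dict:
    """Return the maximum weight each blendshape reaches during a take."""
    reader = csv.DictReader(io.StringIO(csv_text))
    peaks: dict = {}
    for row in reader:
        for name, value in row.items():
            if name == "Timecode":
                continue  # keep only blendshape columns
            peaks[name] = max(peaks.get(name, 0.0), float(value))
    return peaks

print(peak_weights(SAMPLE))  # {'jawOpen': 0.35, 'eyeBlinkLeft': 0.03}
```

Because both the CSV and the MOV carry the same timecode, a row like `16:04:12:08` can be matched to the exact video frame when judging whether an in-engine tweak still matches the actor’s performance.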


Facial animation via front-camera and ARKit:
● Stream out the data live to an Unreal Engine instance via Live Link over a network.
● Drive a 3D preview mesh, optionally overlaid over the video reference on the phone.
● Record the raw facial animation data and front-facing video reference footage.

Timecode support for multi-device synchronization:
● Select the iPhone system clock, an NTP server, or Tentacle Sync to connect with a master clock on stage.
● Video reference is frame accurate with embedded timecode for editorial.
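
Frame accuracy means every `HH:MM:SS:FF` stamp maps to exactly one frame, so recordings from different devices slaved to the same clock can be aligned arithmetically. A minimal sketch of that conversion, assuming a non-drop-frame rate such as 60 fps (an assumption for illustration, not the app’s fixed rate):

```python
def timecode_to_frames(tc: str, fps: int = 60) -> int:
    """Convert a non-drop-frame HH:MM:SS:FF timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Two devices striped by the same master clock line up frame-for-frame:
offset = timecode_to_frames("01:00:00:30") - timecode_to_frames("01:00:00:00")
print(offset)  # 30 frames
```

Drop-frame rates like 29.97 fps need the more involved SMPTE drop-frame correction, which this sketch deliberately omits.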

Control Live Link Face remotely with OSC:
● Trigger recording externally so actors can focus on their performances.
● Capture slate names and take numbers consistently.
● Extract data automatically for archival.

Browse and manage the captured library of takes within Live Link Face:
● Delete takes or share them via AirDrop.
● Play back the reference video on the phone.

You can download Live Link Face from the App Store for free.

dgadirector - July 10, 2020 at 3:43pm
Trying to figure out the purpose of that last pic. Seems to be comparing the two faces, yet the animation has just a blank stare while the guy has a frown/scowl going on. They don’t look anything alike.
PaulieP - July 13, 2020 at 9:34pm
It’s because this is a gimmick and a waste of time. But what they don’t tell you is that you need separate software. Screw this crap