
Recent models of the Apple iPhone and iPad offer sophisticated facial recognition and motion tracking capabilities that distinguish the position, topology, and movements of over 50 specific muscles in a user's face. If your iOS device contains a depth camera and ARKit capabilities, you can use the free Live Link Face app from Epic Games to drive complex facial animations on 3D characters inside Unreal Engine, recording them live on your phone and in the engine.

This page explains how to use the Live Link Face app to apply live performances onto the face of a 3D character, and how to make the resulting facial capture system work in the context of a full-scale production shoot.

The material on this page refers to several different tools and functional areas of Unreal Engine. You'll have the best results if you're already familiar with the following material:

You'll need an iOS device that supports ARKit and the depth API.

You need to have a character set up with a set of blend shapes that match the facial blend shapes produced by ARKit's facial recognition. You'll typically need to do this in a third-party rigging and animation tool, such as Autodesk Maya, then import the character into Unreal Engine. For a list of the blend shapes your character will need to support, see the Apple ARKit documentation.

Follow the instructions in this section to set up your Unreal Engine Project, connect the Live Link Face app, and apply the data being recorded by the app to a 3D character. Enable the following Plugins for your Project:

If you need to broadcast your animations to multiple Unreal Editor instances, you can enter multiple IP addresses here. See also the Working with Multiple Users section below. For details on all the other settings available for the Live Link Face app, see the sections below.

In the Unreal Editor, open the Live Link panel by selecting Window > Live Link from the main menu. You should now see your device listed as a subject.

In your character's animation graph, find the Live Link Pose node and set its subject to the one that represents your device. Compile and save the Animation Blueprint.

In the Details panel, ensure that the Update Animation in Editor setting in the Skeletal Mesh category is enabled.

Back in Live Link Face, point your phone's camera at your face until the app recognizes your face and begins tracking your facial movements. At this point, you should see the character in the Unreal Editor begin to move its face to match yours in real time.

To apply head rotation to the Actor using data from the Live Link Face app, you first need to set up Blueprints in the Event Graph and Anim Graph to drive the joints in the head. This Blueprint is placed in the Event Graph of your character's Anim Blueprint.

When you're ready to record a performance, tap the red Record button in the Live Link Face app. This begins recording the performance on the device, and also launches Take Recorder in the Unreal Editor to begin recording the animation data on the character in the engine. Tap the Record button again to stop the take.
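As a quick sanity check on the blend-shape requirement described above, a small script can compare a character's exported morph-target names against the ARKit names before you import the character into Unreal Engine. This is an illustrative sketch, not part of the Live Link Face workflow itself: `REQUIRED_ARKIT_SHAPES` lists only a handful of the real ARKit blend-shape names (see Apple's ARKit documentation for the complete set), and `character_morph_targets` stands in for whatever name list your rigging tool exports.

```python
# Illustrative only: check that a character's morph targets cover a set of
# ARKit blend-shape names. The names below are a small sample of the real
# ARKit set; consult Apple's ARKit documentation for the complete list.
REQUIRED_ARKIT_SHAPES = {
    "eyeBlinkLeft", "eyeBlinkRight",
    "jawOpen",
    "browInnerUp",
    "mouthSmileLeft", "mouthSmileRight",
}

def missing_blend_shapes(character_morph_targets):
    """Return the required ARKit shapes the character does not implement."""
    return sorted(REQUIRED_ARKIT_SHAPES - set(character_morph_targets))

# Example: a character that is missing its jaw and brow shapes.
targets = ["eyeBlinkLeft", "eyeBlinkRight", "mouthSmileLeft", "mouthSmileRight"]
print(missing_blend_shapes(targets))  # ['browInnerUp', 'jawOpen']
```

Running a check like this before import saves a round trip: a character missing even one expected shape will simply not respond to that part of the tracked performance in the engine.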

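The multiple-IP-address option mentioned above amounts to the app sending the same tracking data to several listening Unreal Editor instances. Purely to illustrate that fan-out pattern, and not the actual Live Link protocol or its packet format, here is a minimal UDP sketch; the payload bytes and the `broadcast_frame` helper are invented for the example.

```python
import socket

def broadcast_frame(payload: bytes, targets):
    """Send one tracking-data payload to every (ip, port) target.

    Mirrors the idea behind entering multiple IP addresses in the app:
    each editor instance listens on its own address, and every frame is
    sent to all of them. The payload format here is arbitrary.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in targets:
            sock.sendto(payload, addr)
    finally:
        sock.close()

if __name__ == "__main__":
    # Two local listeners stand in for two Unreal Editor instances.
    listeners = []
    for _ in range(2):
        recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        recv.bind(("127.0.0.1", 0))  # let the OS pick a free port
        recv.settimeout(2.0)
        listeners.append(recv)

    broadcast_frame(b"frame-0001", [r.getsockname() for r in listeners])

    for recv in listeners:
        data, _ = recv.recvfrom(1024)
        print(data)  # each listener receives b'frame-0001'
        recv.close()
```

Because UDP is connectionless, adding another editor instance is just one more address in the target list; no listener has to acknowledge the sender.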