
The Face AR Sample project showcases Apple's ARKit facial tracking capabilities within Unreal Engine. You can download the Face AR Sample project from the Epic Games Launcher under the Learn tab.

New to Unreal Engine 4.20 is support for Apple's ARKit face tracking system. Using a front-facing TrueDepth camera, this API enables the user to track the movements of their face and to use that movement in Unreal Engine. The tracking data can be used to drive digital characters, or can be repurposed in any way the user sees fit. Optionally, the Unreal Engine ARKit implementation enables you to send facial tracking data directly into the Engine via the Live Link plugin, including current facial expression and head rotation. In this way, users can utilize their phones as motion capture devices to puppeteer an on-screen character.

The Face AR Sample project is a fully functional sample; however, some setup and configuration information is provided below to assist you in exploring the project. Keep in mind that as Apple's ARKit and Epic's support for it evolve, specific project implementation details may change.

For more information about face tracking with Apple's ARKit, please see Apple's official documentation: Creating Face-Based AR Experiences

The mobile facial animation capture system is only available on iOS devices with a front-facing TrueDepth camera, such as the iPhone X, iPhone XS, iPhone XS Max, iPhone XR, iPad Pro (11-inch), and iPad Pro (12.9-inch, 3rd generation).

Face AR Capture Overview

At a high level, the facial capture system with ARKit uses the Apple TrueDepth camera to track the motion of a user's face. In the process, it compares the pose of the face against 51 individual face poses. These poses are native to the Apple ARKit SDK, and each pose targets a specific portion of the face, such as the left eye, right eye, sides of the mouth, and so on. As a given part of the user's face approaches the shape of a pose, the value of that pose blends between 0.0 and 1.0. For example, if the user closes their left eye, the LeftEyeBlink pose blends from 0.0 to 1.0. As the user's face moves, all 51 poses are evaluated by the SDK and assigned a value.

The Unreal Engine ARKit integration captures the incoming values from the 51 blended face poses and feeds them into the Engine via the Live Link plugin. Those 51 pose values can then drive the motion of a real-time character's face. All you really need in order to animate a character's head with face capture is to ensure the character content is set up to use data from those 51 shapes. Because each shape feeds back an individual 0.0 to 1.0 value, they are perfect for driving the motion of a list of blend shapes on a character.
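
As a rough illustration of what consuming that data looks like in native code, the sketch below applies a set of named 0.0 to 1.0 pose values to matching morph targets on a skeletal mesh. This is a minimal sketch, not the sample's implementation: the ApplyFacePoseWeights function and the incoming map of values are assumptions standing in for the data Live Link delivers each frame.

    // Minimal, illustrative sketch: apply named 0.0-1.0 pose weights to
    // matching morph targets on a skeletal mesh component. The incoming map
    // stands in for the per-frame values the ARKit/Live Link integration
    // provides; the function name and setup are assumptions for illustration.
    #include "CoreMinimal.h"
    #include "Components/SkeletalMeshComponent.h"

    void ApplyFacePoseWeights(USkeletalMeshComponent* FaceMesh,
                              const TMap<FName, float>& PoseWeights)
    {
        if (!FaceMesh)
        {
            return;
        }

        for (const TPair<FName, float>& Pose : PoseWeights)
        {
            // e.g. Pose.Key = "eyeBlinkLeft", Pose.Value = 0.0 to 1.0
            FaceMesh->SetMorphTarget(Pose.Key, Pose.Value);
        }
    }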

If the blend shapes created on the Unreal character are named precisely the same as those in the official list of shapes from Apple, then the connection is automatic. However, if the shape names differ between the Apple mesh and the Unreal character, then a remapping Asset must be used. For more details on remapping blend shape names, see the Remapping Curve Names in a LiveLinkRemap Asset section.

For a complete list of the blend shapes brought in by Apple's ARKit, please refer to Apple's official documentation: ARFaceAnchor.BlendShapeLocation

Face AR Capture Setup

Setting up a face capture system for animating a character's face with ARKit requires a few steps:

  1. Set Up Character Blend Shapes and Import the Character into Unreal Engine.

    1. Create a character with blend shape based facial animation, accounting for the 51 blend shapes defined in Apple's ARKit guidelines. Ideally, the geometry for these blend shapes should be named the same as the functions listed by Apple (eyeBlinkLeft, eyeLookDownLeft, etc.). However, there is a little leeway here as the names can be remapped if necessary.

    2. Import this character into the Engine, making sure to import Blend Shapes in the import options.

  2. Enable face tracking for your Project by adding the following lines to its DefaultEngine.ini file (the DefaultEngine.ini can be found in your Project's Config folder):

    [/Script/AppleARKit.AppleARKitSettings]
    bEnableLiveLinkForFaceTracking=true

  3. Create and apply a Data Asset in your Project to enable face tracking.

    1. Right-click in the Content Browser and choose Miscellaneous > Data Asset.

    2. From the Pick Data Asset Class window that appears, choose ARSessionConfig and click Select.

    3. Double-click this new Asset to open it and set the following options:

      • World Alignment: Camera

      • Session Type: Face

      • Horizontal Plane Detection: Off

      • Vertical Plane Detection: Off

      • Enable Auto Focus: Off

      • Light Estimation Mode: Off

      • Enable Automatic Camera Overlay: Off

      • Enable Automatic Camera Tracking: Off

      • Candidate Images: Ignore

      • Max Num Simultaneous Images Tracked: 1

      • Environment Capture Probe Type: None

      • World Map Data: Ignore

      • Candidate Objects: Ignore

    4. In the Level Blueprint for your face tracking level, from Begin Play, call the Start AR Session function, and set the Session Config property to the ARSessionConfig Data Asset you just created. (A native-code equivalent of this step is sketched after this list.)

  4. Create an Animation Blueprint that uses a LiveLinkPose node, with Subject Name set to iPhoneXFaceAR. This will feed the ARKit face values into the Unreal Engine animation system, which will in turn drive the blend shapes on your character.
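
For reference, the Blueprint setup in steps 3 and 4 has a rough native-code counterpart. The following is a minimal sketch, not the sample project's code: AFaceARSetupActor and its FaceARSessionConfig property are assumed names, and the project would need the AugmentedReality module listed in its Build.cs for this to compile.

    // A rough sketch of starting the face tracking session from C++,
    // equivalent to calling Start AR Session from Begin Play in the Level
    // Blueprint. Requires the AugmentedReality module in the project's
    // Build.cs; names here are illustrative.
    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "ARBlueprintLibrary.h"
    #include "ARSessionConfig.h"
    #include "FaceARSetupActor.generated.h"

    UCLASS()
    class AFaceARSetupActor : public AActor
    {
        GENERATED_BODY()

    public:
        // Assign the ARSessionConfig Data Asset created in step 3.
        UPROPERTY(EditAnywhere, Category = "Face AR")
        UARSessionConfig* FaceARSessionConfig = nullptr;

    protected:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();

            if (FaceARSessionConfig)
            {
                // Equivalent to the Start AR Session node in the Level Blueprint.
                UARBlueprintLibrary::StartARSession(FaceARSessionConfig);
            }
        }
    };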

The AR Face Component

The ARKit face tracking system uses an internal face mesh that it wraps to the user's face and uses as a basis to mimic expressions. In Unreal Engine, this mesh is exposed by the AppleARKitFaceMesh component. This can be added to an existing Blueprint and set up to visualize what the ARKit SDK is seeing, and help you correlate that to how your character's face moves.

AppleARKitFaceMesh component properties:

  • Component with Tracked: Concatenates the transforms of the component and the tracked data together.

  • Tracking Only: Ignores the transforms of the component and only uses the tracked data.

For example, look at this image where the Kite Boy's jaw is opening only with joint rotation, and how it is improved by layering a corrective blend shape on top of it.

On the left is the boy's mouth opening with joint rotation only. Notice that the lower part of the jaw looks too wide. The middle shows the jaw opening with joint rotation, but now with a corrective blend shape layered on it. The jaw is stretching properly and looks more natural. On the right is the corrective blend shape by itself; it contracts the mouth and chin to aid in the stretching process. The idea is that these two systems, joint rotation and corrective blend shapes, always work together; never one without the other.

More Corrective Blend Shapes

In the Face AR Sample's Animation Blueprint, you will notice a section of the Animation Graph that just adds corrective blend shapes on top. This is for special correctives, such as when the eye is looking in a diagonal direction, for example both left and down. Such poses are generally handled with additional blend shapes that are not included in the original list provided with ARKit, blending them in based on the values of the standard shapes.

For example, if you have a special corrective blend shape for when the right eye is looking diagonally down and left, then you could use your Animation Blueprint to read the values of eyeLookDownRight and eyeLookInRight, and use that data to activate a completely separate blend shape. This can be seen in the Face AR Sample AnimBP.
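
As a sketch of that idea, the corrective weight could be derived from the two standard curve values, for example by taking the smaller of the two, and then applied as its own morph target. The exact blending used in the Face AR Sample AnimBP may differ, and the eyeLookDownInRight_Corrective shape name below is hypothetical.

    // Illustrative only: derive a corrective shape weight from two standard
    // ARKit curves. Taking the minimum is one simple choice; the Face AR
    // Sample AnimBP may combine the values differently.
    #include "CoreMinimal.h"
    #include "Components/SkeletalMeshComponent.h"

    void ApplyEyeDownInRightCorrective(USkeletalMeshComponent* FaceMesh,
                                       float EyeLookDownRight,
                                       float EyeLookInRight)
    {
        if (!FaceMesh)
        {
            return;
        }

        // Both source curves are in the 0.0 to 1.0 range.
        const float CorrectiveWeight = FMath::Min(EyeLookDownRight, EyeLookInRight);

        // "eyeLookDownInRight_Corrective" is a hypothetical morph target name.
        FaceMesh->SetMorphTarget(TEXT("eyeLookDownInRight_Corrective"), CorrectiveWeight);
    }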

Creating a Pose Asset for Facial Animation

To create the necessary Pose Asset to drive facial animation from ARKit data:

  1. Create an animation in your DCC app in which:

    1. The first frame is the rest pose, keyframed with no changes.

    2. For frames 2 and on, each frame should be a different keyframed skeletal pose that achieves the pose from Apple's ARKit list. For example, Frame 2 could be eyeBlinkLeft, Frame 3 could be eyeLookDownLeft, and so on.

    3. You do not need to create every single pose requested by the ARKit list, only those that would require joints to move for your rig. For instance, in the case of our Face AR Sample file, jawOpen is handled by way of joint rotation. However, there is also a blend shape that squishes the face in a bit for a more natural look while the jaw is opening.

    4. Note: you can see an example of what this animation will look like in the Face AR Sample project, with the animation Asset named KiteBoyHead_JointsAnim.

  2. You must keep a list of what poses are in the animation, and the order in which they appear. We recommend that you do this in a spreadsheet, so you can easily paste the names into Unreal Engine later.

  3. Import your animation into the Unreal Engine, making sure it is associated with your character's skeleton.

  4. Right-click on the animation in the Unreal Engine and choose Create > Create Pose Asset.

  5. The Asset will have a list of poses for each frame of the animation. You can copy and paste a list of names straight from a spreadsheet to rename them.

Special thanks goes to the team at 3Lateral, who were a great help in setting up the rig for the Kite Boy's face.

Remapping Curve Names in a LiveLinkRemap Asset

  1. In the My Blueprint panel's Function group, choose Override and select Get Remapped Curve Names.

  2. This opens up a function graph with inputs and outputs. The goal is to use this graph to change the incoming name from the expected list of names in Apple's SDK to a name that corresponds to a blend shape name on your character. For instance, if you had a character whose blend shapes were named appropriately, but had “Character_” prefixed to them, you would use a graph that takes the incoming name from the Apple SDK, prepends “Character_” to it, and outputs the result.
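
    Expressed purely as the name manipulation it performs, that graph roughly corresponds to the helper below. This is an illustrative sketch, not the remap Asset's actual override signature; RemapAppleCurveName is an assumed name.

        // Illustrative only: the name transformation the remap graph performs,
        // prepending "Character_" to the incoming Apple curve name.
        #include "CoreMinimal.h"

        FName RemapAppleCurveName(FName IncomingName)
        {
            // e.g. "eyeBlinkLeft" becomes "Character_eyeBlinkLeft"
            return FName(*(FString(TEXT("Character_")) + IncomingName.ToString()));
        }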

Handling Head Rotation

For some projects, you may need access to the rotation of the tracked face. In the Unreal Engine implementation of ARKit, the rotation data is passed in alongside the face shape values. Within the KiteBoyHead_JointsAndBlends_Anim Animation Blueprint, you will see a section where this data is broken out and applied to the joints of the neck and head via Modify Bone nodes.

The data is sent out by way of 3 curves: HeadYaw, HeadPitch, and HeadRoll.
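
One way to picture how those curves could be turned into a bone rotation is sketched below. It is an illustration only: the sample's Animation Blueprint applies the data with Modify Bone nodes, and the DegreesPerUnit scale factor is an assumption, since the scaling of the raw curve values is defined in that Blueprint.

    // Illustrative only: pack the HeadYaw, HeadPitch, and HeadRoll curve
    // values into a rotator that could drive a neck or head bone. The
    // DegreesPerUnit scale is an assumption; check the sample's Animation
    // Blueprint for the actual scaling of the incoming curve values.
    #include "CoreMinimal.h"

    FRotator MakeHeadRotation(float HeadYaw, float HeadPitch, float HeadRoll,
                              float DegreesPerUnit = 1.0f)
    {
        // FRotator takes Pitch, Yaw, Roll, in degrees.
        return FRotator(HeadPitch * DegreesPerUnit,
                        HeadYaw * DegreesPerUnit,
                        HeadRoll * DegreesPerUnit);
    }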

Deploying to iPhone X

The Face AR Sample project should be deployed to an iPhone X to fully explore its feature set. While general deployment documentation already exists (see iOS Game Development), you may find it easier to use the Project Launcher to deploy the Face AR Sample project to your device.

  1. Open Project Launcher (use the small arrow to the right of the Launch button on the main toolbar).

  2. At the bottom of the window click the + button across from Custom Launch Profiles to create a new profile.

  3. Set the following settings:

    • Build Configuration: Development

    • How would you like to Cook Content: By the Book (also check iOS in the build list)

    • Cooked Maps: FaceTrackingMap_Simplified (we do not recommend deploying FaceTrackingMap2, as it is not optimized for mobile rendering)

    • How would you like to package the build: Do not package

    • How would you like to deploy the build: Copy to Device: All_iOS_On_

Calibration

Because every face is different, your face at rest may not register as a perfectly neutral pose on your character. To counteract this problem, the app has a calibration system. In the app, the calibration system can be opened by way of the settings button in the lower left corner, then entering Calibration Mode. The app will guide you through the process from there.

In the Editor, the Face AR Sample project also has a calibration process:

  1. While Simulating in Editor, select the KiteBoy in the scene.

  2. You will see the In Editor Calibration event button in the Details Panel. Click the button to calibrate in the same manner as the app.

In both cases, the project is recording the current facial capture values received by the SDK and scaling those to the new zero. The function that gathers those values lives in a different place depending on whether you are on the device or in the Editor (in the Pawn within the app, and within the Kite Boy Blueprint in the Editor). Once gathered, the values are processed in the Animation Blueprint using a Modify Curve node with its Apply Mode set to Remap Curve.
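
One way to picture that remapping is sketched below: the value captured for each curve while the face is at rest becomes the new zero, and the remaining range is rescaled so a fully engaged pose still reaches 1.0. This is a simplified illustration, not the Modify Curve node's exact formula.

    // Simplified sketch of calibration: treat the value captured at rest as
    // the new zero and rescale the remaining range back to 0.0-1.0. This is
    // an illustration of the idea, not the Modify Curve node's exact math.
    #include "CoreMinimal.h"

    float CalibrateCurveValue(float RawValue, float RestValue)
    {
        // Guard against a rest value at or near 1.0.
        const float Range = FMath::Max(1.0f - RestValue, KINDA_SMALL_NUMBER);
        return FMath::Clamp((RawValue - RestValue) / Range, 0.0f, 1.0f);
    }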

Live Link Broadcasting

Aside from just being used for amusement, the Face AR Sample showcases how the iPhone X and ARKit can be used as a powerful digital puppeteering and motion capture device. This is done somewhat outside the standard Live Link workflow, but has been simplified in the app.

It is important that the device and the computer are on the same physical network; check the WiFi settings on your iPhone to make sure.

  1. Within the app, tap the Settings button.

  2. Tap the Live Link Connection button.

  3. Enter your computer's IP address into the provided line.

  4. Relax your face as shown in the image.

  5. Tap Connect.

You are given the option of saving your IP address, which keeps the address between sessions. However, we intentionally do not save the state of the Save IP Address checkbox, so you must confirm the setting each time you relaunch the app.

Show Flag Checkboxes

The Face AR Sample app includes a few checkboxes that can be turned on and off to display specific features:

  • Show Debug Mesh

  • Show Debug Values

  • Show Unit Stats

Show Debug Mesh

This checkbox shows and hides Apple's ARKit debug mesh. This is the mesh the SDK is using to track the motion of the user's face. Within the app, it is rendered with a very simple unlit wireframe material.

If you are using the Face AR Sample app as a facial motion capture puppeteering device, it is recommended that you show only the Debug Mesh. This is more performant and has less of an impact on device thermals, which is important because the performance of the device diminishes if it overheats.

Show Debug Values

Show Debug Values gives you a direct visualization of the numeric float data being passed from ARKit into the Unreal Engine. These values are separate from any calibration offsets that are in place. Use the debug values to help diagnose discrepancies between the incoming ARKit data and the expected result in your apps.

Show Unit Stats

Show Unit Stats is the same as typing STAT UNIT into the console within the app. It opens the standard unit stats in the Engine so you can see performance numbers on the device.

Help & About

The Help & About screen is an in-app overview of the Face AR Sample, similar to what you see on this page.

Connecting the App to Your Computer

One of the more exciting features of the Face AR Sample project is that it can be used as a motion capture device on your computer. The app has been streamlined to make this process as painless as possible, but before you begin, verify that the device and your computer are on the same physical network.

  1. Launch the Face AR Sample project on your computer.

  2. Open the FaceTrackingMap2 map in the Editor and navigate to a viewing position directly in front of the character.

  3. Press Simulate in the Editor (located under the arrow next to the Play in Editor button).

  4. On your device, launch the Face AR Sample app.

  5. After a few seconds, the settings button appears in the lower left corner. Tap it.

  6. Choose Live Link Connection from the Settings panel.

  7. Enter your computer's IP address into the provided line.

  8. Tap Connect.
