Face Tracking with RealityKit

Posted on Feb 23 2021 · Video Course (37 mins) · Advanced


  • Swift 5.3, iOS 14, Xcode 12.4

Find out what RealityKit has to offer and how to set up a RealityKit project with the required permissions.
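Face tracking uses the front-facing camera, so the app must declare camera access in its Info.plist. A minimal sketch of the required entry (the usage string is an example; write your own user-facing explanation):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the front camera to track your face.</string>
```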


Learn how to get around in Reality Composer, set up your first face anchor, and add props from Xcode’s built-in library.


Start an ARSession and configure it for face tracking with help from ARKit. Access objects from Reality Composer with Swift.
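As a rough sketch of this step, starting a face-tracking session comes down to checking device support and running an `ARFaceTrackingConfiguration` on the view's session (assuming `arView` is your `ARView`):

```swift
import ARKit
import RealityKit

func startFaceTracking(on arView: ARView) {
    // Face tracking needs a TrueDepth front camera; bail out on unsupported devices.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    arView.session.run(configuration)
}
```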


Set up the app to handle switching between different props by adding and removing anchors from the ARView.
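Swapping props amounts to removing the current anchor from the scene and adding the new one. A minimal sketch, assuming the props were loaded from a Reality Composer scene (the function and parameter names here are hypothetical):

```swift
import RealityKit

// Replace whatever prop is currently shown with a new one.
func show(prop newProp: HasAnchoring, in arView: ARView, replacing oldProp: HasAnchoring?) {
    if let oldProp = oldProp {
        arView.scene.removeAnchor(oldProp)
    }
    arView.scene.addAnchor(newProp)
}
```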


Set up an ARSessionDelegate to handle live updates to face anchors and drive animation based on where a user is looking.
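The delegate pattern described here can be sketched as follows — `session(_:didUpdate:)` fires with updated anchors, and `ARFaceAnchor.lookAtPoint` gives a gaze estimate in face coordinate space (the class name is an assumption):

```swift
import ARKit

final class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // Filter the updated anchors down to face anchors.
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user is looking,
            // in the face anchor's coordinate space.
            let lookAt = faceAnchor.lookAtPoint
            print("User is looking at \(lookAt)")
        }
    }
}

// Assign it with arView.session.delegate = yourDelegate,
// keeping a strong reference to the delegate elsewhere.
```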


Learn how to access a ton of information about a user’s facial movement via the face anchor’s blend shapes.
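Blend shapes are a dictionary keyed by `ARFaceAnchor.BlendShapeLocation`, with each value running from 0 (neutral) to 1 (fully expressed). A small sketch of reading a couple of them:

```swift
import ARKit

func readBlendShapes(from faceAnchor: ARFaceAnchor) {
    let blendShapes = faceAnchor.blendShapes

    // Values are NSNumbers in the range 0...1.
    if let jawOpen = blendShapes[.jawOpen]?.floatValue,
       let eyeBlinkLeft = blendShapes[.eyeBlinkLeft]?.floatValue {
        print("Jaw open: \(jawOpen), left eye blink: \(eyeBlinkLeft)")
    }
}
```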


Use the movement of a user’s jaw to drive the jaw animation of a 3D robot head! Learn a bit about quaternions.
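The idea can be sketched by scaling a fixed rotation by the `jawOpen` blend-shape value and assigning it as a quaternion. The entity name and the 45° maximum angle are assumed tuning values, not the course's exact numbers:

```swift
import RealityKit

func animateJaw(_ jawEntity: Entity, jawOpen: Float) {
    // Maximum jaw rotation, in radians (assumed value).
    let maxAngle: Float = -45 * (.pi / 180)

    // Scale the rotation by how open the user's jaw is (0...1),
    // rotating around the entity's x-axis.
    jawEntity.orientation = simd_quatf(angle: maxAngle * jawOpen, axis: [1, 0, 0])
}
```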


Use the movement of a user’s eyes and eyebrows to drive the eyelid animation of a 3D robot head! Apply multiple rotations to a single object.
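Combining rotations on one object means multiplying quaternions — note that with `q1 * q2`, the right-hand rotation is applied first. A sketch with assumed angles and entity names:

```swift
import RealityKit

func orientEyelid(_ eyelid: Entity, blink: Float, browRaise: Float) {
    // One rotation driven by the blink blend shape, one by the brow.
    let blinkRotation = simd_quatf(angle: -1.2 * blink, axis: [1, 0, 0])
    let browRotation  = simd_quatf(angle: 0.3 * browRaise, axis: [1, 0, 0])

    // Multiply to combine; browRotation is applied first, then blinkRotation.
    eyelid.orientation = blinkRotation * browRotation
}
```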


Try out Reality Composer’s behavior system to add animation and sound effects to your robot experience.


Learn how to trigger Reality Composer behaviors from your Swift code. Add some lights!
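For behaviors whose trigger is set to Notification, Reality Composer generates a notifications API on the loaded scene. The scene and trigger names below (`Experience`, `loadRobot`, `flashLights`) are hypothetical; yours depend on your .rcproject:

```swift
import RealityKit

// Load the generated scene, then post the notification
// that fires the matching behavior.
let robot = try Experience.loadRobot()
robot.notifications.flashLights.post()
```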


This course is for experienced iOS developers who are comfortable with Swift and have some familiarity with AR technology.

  • RealityKit
  • Reality Composer
  • ARSession & ARSessionDelegate
  • Face Anchors
  • Blend Shapes
  • Reality Composer Behaviors


Catie Catterwaul

Catie makes things for, with, and about Apple tech in collaboration with her husband, Jessy! She is inspired by everyone at…


Adriana Kutenko

Graphic Illustrator with a Bachelor’s Degree in Fine Arts. I am a perpetual perfection seeker with a big passion for History…