Face Tracking with RealityKit

Feb 23 2021 · Video Course (37 mins) · Advanced

Version

  • Swift 5.3, iOS 14, Xcode 12.4

1. Find out what RealityKit has to offer and how to set up a RealityKit project with the required permissions. (Sketch 1 below.)

2. Learn how to get around in Reality Composer, set up your first face anchor, and add props from Xcode’s built-in library.

3. Start an ARSession and configure it for face tracking with help from ARKit. Access objects from Reality Composer with Swift. (Sketch 2 below.)

4. Set up the app to handle switching between different props by adding and removing anchors from the ARView. (Sketch 3 below.)

5. Set up an ARSessionDelegate to handle live updates to face anchors and drive animation based on where a user is looking. (Sketch 4 below.)

6. Learn how to access a ton of information about a user’s facial movement via the face anchor’s blend shapes. (Sketch 5 below.)

7. Use the movement of a user’s jaw to drive the jaw animation of a 3D robot head! Learn a bit about quaternions. (Sketch 6 below.)

8. Use the movement of a user’s eyes and eyebrows to drive the eyelid animation of a 3D robot head! Apply multiple rotations to a single object. (Sketch 7 below.)

9. Try out Reality Composer’s behavior system to add animation and sound effects to your robot experience.

10. Learn how to trigger Reality Composer behaviors from your Swift code. Add some lights! (Sketch 8 below.)
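
Sketch 1: Before running a face-tracking session, it’s worth checking device support and camera permission. This is a minimal sketch, not the course’s exact code; it assumes you have already added an NSCameraUsageDescription entry to Info.plist.

    import ARKit
    import AVFoundation

    // Face tracking requires a TrueDepth front-facing camera.
    guard ARFaceTrackingConfiguration.isSupported else {
        fatalError("This device doesn't support face tracking.")
    }

    // The permission prompt only appears if Info.plist contains
    // an NSCameraUsageDescription entry explaining camera use.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        print(granted ? "Camera access granted." : "Camera access denied.")
    }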
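
Sketch 2: Running the session and loading a Reality Composer scene might look like this. The Experience.loadGlasses() loader and the arView outlet are assumptions for illustration: Reality Composer generates one loader per scene, and the names depend on your .rcproject.

    import ARKit
    import RealityKit
    import UIKit

    final class FaceViewController: UIViewController {
        @IBOutlet var arView: ARView!

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)

            // Run the session with a face-tracking configuration.
            arView.session.run(ARFaceTrackingConfiguration())

            // Assumes an Experience.rcproject with a scene named "Glasses".
            if let glasses = try? Experience.loadGlasses() {
                arView.scene.addAnchor(glasses)
            }
        }
    }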
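
Sketch 3: Switching props amounts to replacing anchors in the scene. A rough sketch, assuming the view controller from Sketch 2:

    // Remove whatever prop is currently showing, then anchor the new one.
    func show(prop newProp: HasAnchoring) {
        arView.scene.anchors.removeAll()
        arView.scene.addAnchor(newProp)
    }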
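
Sketch 4: Live face-anchor updates arrive through ARSessionDelegate. This sketch assumes arView.session.delegate has been set to the view controller.

    import ARKit

    extension FaceViewController: ARSessionDelegate {
        // Called whenever tracked anchors update, typically every frame.
        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            guard let faceAnchor = anchors.compactMap({ $0 as? ARFaceAnchor }).first
            else { return }

            // An estimate of the point the user is looking at,
            // in the face anchor's coordinate space.
            print("Looking toward:", faceAnchor.lookAtPoint)
        }
    }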
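
Sketch 5: Blend shapes report facial movement as coefficients from 0 (neutral) to 1 (fully expressed). Inside the delegate callback above, for example:

    // Each entry is optional; missing shapes fall back to neutral here.
    let blendShapes = faceAnchor.blendShapes
    let jawOpen = blendShapes[.jawOpen]?.floatValue ?? 0
    let blinkLeft = blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
    let browUp = blendShapes[.browInnerUp]?.floatValue ?? 0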
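
Sketch 6: One way to hinge a robot jaw with a quaternion, continuing with the coefficients from Sketch 5. The jawEntity name and the angle range are assumptions for illustration.

    import RealityKit

    // Scale the jawOpen coefficient into a rotation about the x-axis.
    let maxJawAngle: Float = .pi / 8
    jawEntity.orientation = simd_quatf(
        angle: jawOpen * maxJawAngle,
        axis: [1, 0, 0]
    )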
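
Sketch 7: Quaternion multiplication composes rotations, so a single eyelid entity can respond to brows and blinks at once. The eyelidEntity name and both angle scales are illustrative assumptions.

    // Order matters: the right-hand rotation is applied first.
    let browRotation = simd_quatf(angle: -browUp * 0.3, axis: [1, 0, 0])
    let blinkRotation = simd_quatf(angle: blinkLeft * 0.8, axis: [1, 0, 0])
    eyelidEntity.orientation = blinkRotation * browRotation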
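
Sketch 8: A behavior given a Notification trigger in Reality Composer can be fired from Swift. The flashLights name is an assumption; the generated property matches the trigger’s identifier in your scene.

    // Posting the notification runs the behavior's action sequence.
    glassesScene.notifications.flashLights.post()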




This course is for experienced iOS developers who are comfortable with Swift and have some familiarity with AR technology.

  • RealityKit
  • Reality Composer
  • ARSession & ARSessionDelegate
  • Face Anchors
  • Blend Shapes
  • Reality Composer Behaviors

Contributors

Catie Catterwaul, Instructor
Catie makes things for, with, and about Apple tech in collaboration with her husband, Jessy! She is inspired by everyone at…

Adriana Kutenko, Illustrator
Graphic Illustrator with a Bachelor’s Degree in Fine Arts. I am a perpetual perfection seeker with a big passion for History…