Diploma in Apple Development 🍎👩🏻‍💻: Coding Two: Lecture 8: Scene Reconstruction and World Tracking.
Back to slide index.
πŸ‘‹πŸ» Hi!
First, don't forget to confirm your attendance using the Seats Mobile app!
Second, I'm going to start every lecture with a meditation from the fantastic Sitting Still Like a Frog book, all read by Myla Kabat-Zinn. If you don't want to participate, that's completely fine.
Let's start by watching this session from WWDC24: "Create enhanced spatial computing experiences with ARKit".
Now let's go through the example code from that session: we'll get it building on all three Vision Pros, and then walk through the code.
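To set the scene before we dig into that sample, here's a minimal sketch (my own illustration, not the session's code) of the visionOS ARKit pattern everything in this lecture builds on: create an ARKitSession, run it with the data providers you need, and consume their async anchor updates.

import ARKit

// A minimal sketch of the ARKitSession pattern used throughout these samples.
// The session and providers are real ARKit (visionOS) types; the surrounding
// structure is illustrative only.
let session = ARKitSession()
let sceneReconstruction = SceneReconstructionProvider()
let worldTracking = WorldTrackingProvider()

func startTracking() async {
    // Check support before running — for example, the simulator doesn't
    // support scene reconstruction.
    guard SceneReconstructionProvider.isSupported,
          WorldTrackingProvider.isSupported else { return }
    do {
        try await session.run([sceneReconstruction, worldTracking])
    } catch {
        print("Failed to start ARKitSession: \(error)")
    }
}

func processMeshUpdates() async {
    // Each provider exposes an async sequence of anchor updates.
    for await update in sceneReconstruction.anchorUpdates {
        switch update.event {
        case .added, .updated:
            print("Mesh anchor \(update.anchor.id) changed")
        case .removed:
            print("Mesh anchor \(update.anchor.id) removed")
        }
    }
}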
Next, let's take a look at this Introductory visionOS sample: "Obscuring virtual items in a scene behind real-world items".
Next, let's take a look at this sample code: "Tracking specific points in world space".
Next, let's take a look at this Introductory visionOS sample: "Applying mesh to real-world surroundings".
Finally, let's look at these two examples of using world tracking and head movement tracking:
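As a companion to those examples, here's a small sketch of head-pose tracking: a WorldTrackingProvider can be queried for a DeviceAnchor, which gives the headset's current pose in world space. The polling loop is purely illustrative; the examples may hook into the render loop instead.

import ARKit
import QuartzCore
import simd

// A sketch of head-pose tracking on visionOS: query the WorldTrackingProvider
// for a DeviceAnchor, whose transform is the headset's pose in world space.
let headSession = ARKitSession()
let headTracking = WorldTrackingProvider()

func trackHeadPose() async {
    do {
        try await headSession.run([headTracking])
    } catch {
        print("Failed to start world tracking: \(error)")
        return
    }
    while !Task.isCancelled {
        if let device = headTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()),
           device.isTracked {
            // Column 3 of the transform is the head's position in world space.
            let transform = device.originFromAnchorTransform
            let position = SIMD3<Float>(transform.columns.3.x,
                                        transform.columns.3.y,
                                        transform.columns.3.z)
            print("Head position: \(position)")
        }
        // Poll roughly every 100 ms; a real app would sync with rendering.
        try? await Task.sleep(nanoseconds: 100_000_000)
    }
}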
Thanks! As always, please review all of today's content as homework.
Back to slide index.