Diploma in Apple Development 🍎👩🏻‍💻: Coding Two: Lecture 7: Tracking objects.
Back to slide index.
👋🏻 Hi!
First, don't forget to confirm your attendance using the Seats Mobile app!
Second, I'm going to start every lecture with a meditation from the fantastic Sitting Still Like a Frog book, all read by Myla Kabat-Zinn. If you don't want to participate, that's completely fine.
Let's start by watching this video from WWDC24: "Explore object tracking for visionOS". The source code for the app mentioned in the presentation is here, but let's save that for later.
First, let's take a look at this article from Apple: "Implementing object tracking in your visionOS app". TLDR: we need a USDZ file of the object we want to track. I have a poly.cam account, so let's use that to make three different scans. I'll AirDrop the files to one of three groups:
  1. This group is going to follow this article to build an app with Reality Composer Pro.
  2. This group is going to follow this article to build an app with RealityKit.
  3. This group is going to follow this article to build an app with ARKit.
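For the ARKit group, here's a minimal sketch of what the tracking loop can look like. This assumes the USDZ scan has already been converted into a `.referenceobject` file (the article covers training one with Create ML); the file name `MyScan` is a placeholder, and error handling is kept deliberately simple.

```swift
import ARKit

// Sketch only: assumes "MyScan.referenceobject" is bundled with the app,
// trained from the USDZ scan via the Create ML object-tracking template.
func startObjectTracking() async {
    let session = ARKitSession()
    do {
        guard let url = Bundle.main.url(forResource: "MyScan",
                                        withExtension: "referenceobject") else {
            print("Reference object not found in bundle")
            return
        }
        // Load the trained reference object and start a tracking provider.
        let referenceObject = try await ReferenceObject(from: url)
        let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
        try await session.run([provider])

        // React to anchor updates as the object is detected and tracked.
        for await update in provider.anchorUpdates {
            let anchor = update.anchor
            print("Tracked object at: \(anchor.originFromAnchorTransform)")
        }
    } catch {
        print("Object tracking failed: \(error)")
    }
}
```

In a real app you'd attach a RealityKit entity to each anchor's transform rather than printing it, and check `ObjectTrackingProvider.isSupported` before running the session.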
What's the maximum number of objects you can track at once? Remember you can use GitHub code search and Claude for help!
Finally, let's look through the source code from the presentation we watched at the beginning.
Thanks! As always, please review all of today's content as homework.
Back to slide index.