Diploma in Apple Development 🍎👩🏻‍💻: Coding Two: Lecture 4: Happy Beam and recognising hand shapes.
Back to slide index.
👋🏻 Hi!
First, don't forget to confirm your attendance using the Seats Mobile app!
Second, I'm going to start every lecture with a meditation from the fantastic Sitting Still Like a Frog book, all read by Myla Kabat-Zinn. If you don't want to participate that's completely fine.
Today's lecture is going to be a deep dive into the visionOS sample code from Apple: Happy Beam. Let's start by trying the app out amongst ourselves on one of our three Vision Pro headsets.
Great, now that we've all experienced the app, let's take a look at this article, which details how to set up access to ARKit data on device. TLDR: you can declare the required usage description in your app's information property list, or request authorization directly in code.
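As a rough sketch of the in-code route, this is what requesting hand-tracking authorization looks like with visionOS's ARKitSession and HandTrackingProvider (you'll still want an NSHandsTrackingUsageDescription string in your Info.plist so the system prompt can explain why):

```swift
import ARKit

// A minimal sketch, not production code: one session, one provider.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func startHandTracking() async {
    // Ask for hand-tracking authorization up front, rather than relying
    // on the implicit prompt when the provider starts running.
    let results = await session.requestAuthorization(for: [.handTracking])
    guard results[.handTracking] == .allowed else { return }
    do {
        try await session.run([handTracking])
    } catch {
        print("ARKitSession error: \(error)")
    }
}
```

This only runs on device (or in the visionOS simulator with simulated hands), so treat it as a reading aid for the article rather than something to paste in blind.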
Now that we've understood how to enable access, let's go through the Happy Beam article and source code in detail. The way this app manages the state of the game would be a great pattern to duplicate in your Spatial 2 solo app - as would its fantastic accessibility features. As you read, count how many different ways it lets you play the game.
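The state-management pattern is worth internalising: a single observable model owns the game's phase, and every view reacts to it. A hedged sketch of that shape (the names here are illustrative, not Happy Beam's exact ones):

```swift
import Observation

// Illustrative phases for a small spatial game; Happy Beam's own
// model is richer, but the shape is the same.
enum GamePhase {
    case waitingToStart, playing, paused, finished
}

// One @Observable model drives every view, so SwiftUI windows and
// RealityKit scenes all stay in sync with a single source of truth.
@Observable
final class GameModel {
    var phase: GamePhase = .waitingToStart
    var score = 0

    func start() { phase = .playing; score = 0 }
    func pause() { phase = .paused }
    func end()   { phase = .finished }
}
```

Because the model is a single source of truth, adding an alternative input method (one of those accessibility wins) only means adding another way to call `start()`, not another copy of the game logic.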
Stop press! There are lots of hand-tracking demos on GitHub - shall we give some of them a try?
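Most of these demos (and Happy Beam's heart gesture) boil down to the same trick: read joint transforms from a HandAnchor and compare positions. A hedged sketch of a simple "pinch" check, assuming the visionOS HandSkeleton API - the 2 cm threshold is a guess to tune, not a magic number:

```swift
import ARKit
import simd

// Returns true when the thumb tip and index finger tip are nearly
// touching. Joint transforms are relative to the hand anchor, so the
// two positions can be compared directly without world-space conversion.
func isPinching(_ hand: HandAnchor) -> Bool {
    guard let skeleton = hand.handSkeleton else { return false }
    let thumb = skeleton.joint(.thumbTip)
    let index = skeleton.joint(.indexFingerTip)
    guard thumb.isTracked, index.isTracked else { return false }

    // The translation lives in the fourth column of the 4x4 transform.
    let thumbPosition = thumb.anchorFromJointTransform.columns.3
    let indexPosition = index.anchorFromJointTransform.columns.3
    return simd_distance(thumbPosition, indexPosition) < 0.02 // ~2 cm
}
```

Recognising a two-handed shape like Happy Beam's heart is the same idea with more joints, comparing positions across both hands in world space.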
Thanks! As always, please review all of today's content as homework.
Back to slide index.