XR Developer News - WWDC 2024
visionOS 2 features at WWDC 2024

Apple's WWDC contained a ton of visionOS 2 updates for XR developers, spread across no fewer than 29 separate prerecorded sessions, which became available throughout the week. I've gone through all of them to sort out the most interesting news and figure out which ones are worth watching.

In this special edition of XR Developer News, I'm going into full detail on visionOS 2. If you're interested in the broader XR developer ecosystem, be sure to check out the regular monthly roundup and subscribe to get it by email.

Having gone through everything, I think visionOS 2 is actually a pretty solid release, with many, many smaller improvements which together move the OS and the developer platform a real step forward. There weren't that many headline features, especially since Apple already released spatial Personas back in April, but after diving deeper into the details, I'm definitely not disappointed.

WWDC24 visionOS guide - Discover - Apple Developer
The infinite canvas is waiting for you.

To get the basics out of the way, it's good to check out the keynote, which includes an 8-minute section on visionOS 2 (starting at 05:42). I'm not going to rehash all the general, non-developer updates to visionOS, because those have been covered more than sufficiently by other publications already (and it's only 8 minutes, so just go and watch it).

WWDC 2024 Keynote

The Platforms State of the Union went a bit more in depth, with two interesting sections about RealityKit (starting at 42:00, for 2 minutes) and visionOS 2 (starting at 56:15, for 6 minutes). It offers a good but very high-level overview of what's new, so definitely have a look there as well.

WWDC 2024 Platforms State of the Union

It's not until you get to the 29 deep dive sessions that it really gets interesting though. Before I go into each one, a few general things that stood out:

  • The most interesting new capabilities came as part of the new Enterprise API. Be sure to check out the session dedicated to that topic. It adds really powerful developer capabilities, but because those pose a serious privacy risk, they are only available for enterprise use cases and to enterprise developers. Still, it's good to have a look at what a device like this can do when it's opened up a bit further to developers.
  • Even more than during the original Vision Pro announcement last year, Apple is all in on 'native' development for visionOS, which means RealityKit, Reality Composer Pro, SwiftUI, ARKit, etc. While Unity got a bit of attention during the original announcement of the Vision Pro, including two dedicated sessions, it was almost nowhere to be seen this time. Unity will of course add support for visionOS 2. Not surprisingly, Epic's Unreal Engine was completely absent as well.
  • The web got some love this time, with the most important change being that WebXR support is now enabled by default in Safari on visionOS, instead of hidden behind an experimental toggle as before. However - and this is a major however - it's still VR-mode only, so no camera passthrough. That still rules out the web as an option for any AR-type functionality, which happens to be the core use case for visionOS.

Then on to the session videos, which contain a wealth of details. As stated, there are a whopping 29 of them! All of them are also on YouTube, where you can easily watch them at double speed (the presenters often speak very slowly, so that's definitely feasible), but in the list below I've linked to the Apple site for each session, because that contains links to additional resources such as code samples.

Spatial Computing - Videos - Apple Developer

Below I've sorted them into a few groups:

  • Designing for visionOS
  • 3D assets
  • Reality Composer Pro
  • RealityKit
  • SwiftUI
  • Enterprise API and ARKit
  • visionOS and the Web
  • Other frameworks, functionalities and topics

I've also added some comments on what the contents of each session are and how relevant or interesting they are, based on three 'Must watch?' scores:

  • 3/3 means a must watch. There are only 3 of those, about 3D assets, RealityKit and the Enterprise API, so definitely make the time to watch those.
  • 2/3 means a pretty interesting session, relevant to many, but not everyone. There are 11 of those, so pick which ones interest you and check those out.
  • 1/3 means they are really in depth about a very specific topic, so they're usually only relevant if you're using the specific feature it covers. But if you are, they can be very good. There are 15 of those sessions, so be very selective.

Here and there I've added some additional details and links to related news, to offer more context.

Ready? Here goes!


Designing for visionOS

Let's start with the sessions about design. The first one deals mainly with apps, and the second is more on the gaming/interactive side. While you're at it, check out the winners and finalists of the Apple Design Awards.

2024 winners and finalists - Apple Design Awards - Apple Developer
Meet the winners and finalists for the 2024 Apple Design Awards
  • (Must watch? 2/3) Design great visionOS apps (link) won't offer a lot of new insights to those already familiar with designing visionOS apps, but it does offer a good recap of best practices. The nice thing about this session is that it uses a lot of recent examples, so it offers some really nice practical inspiration.
    • An interesting thing about this session and the related resources is that Apple offers Figma and Sketch design templates, which are 2D design tools, and doesn't offer anything for 3D-native design tools like ShapesXR or Gravity Sketch. I'd hope that will change as the practice evolves!
Design great visionOS apps - WWDC24 - Videos - Apple Developer
Find out how to create compelling spatial computing apps by embracing immersion, designing for eyes and hands, and taking advantage of…
  • (Must watch? 2/3) Design interactive experiences for visionOS (link) is a case study about the really good Encounter Dinosaurs experience, which is basically the default app to show people who first try a Vision Pro. It deals more with interactive narrative apps than the UI-heavy focus of the previous session. It offers some pretty good pointers.
Design interactive experiences for visionOS - WWDC24 - Videos - Apple Developer
Learn how you can design a compelling interactive narrative experience for Apple Vision Pro from the designers of Encounter Dinosaurs…

3D assets

Next up are 3D assets and the sessions directly or indirectly related to it. The first session I would consider mandatory viewing for anyone designing or developing a visionOS app or game, but the other sessions deal with very specific subtopics, so be more picky with those.

  • (Must watch? 3/3) Optimize your 3D assets for spatial computing (link) deals with a lot of fundamentals of how far you can push the Vision Pro. It goes into the nitty-gritty of things like polygon budgets, USD files, importing from Blender, textures + materials + shaders, colour spaces, etc. The type of knowledge designers, artists and developers need to know, and technical artists most of all.
Optimize your 3D assets for spatial computing - WWDC24 - Videos - Apple Developer
Dive into an end-to-end workflow for optimized 3D asset creation. Discover best practices for optimizing meshes, materials, and textures…
  • (Must watch? 1/3) Create custom environments for your immersive apps in visionOS (link) offers some guidance on creating fully immersive (VR) environments. It covers some basics of the workflow (e.g. assets from Blender), polygons, textures, lighting, texture baking, importing into Reality Composer Pro, etc., but it stays very high level on all those topics. If you're already familiar with these topics from XR development on other platforms, it will not offer many new insights, because it's more introductory level.
Create custom environments for your immersive apps in visionOS - WWDC24 - Videos - Apple Developer
Discover how to create visually rich and performant customized app environments for Apple Vision Pro. Learn design guidelines, get expert…
  • (Must watch? 1/3) What’s new in USD and MaterialX (link) runs through a bunch of updates on RealityKit, MaterialX, ShaderGraph, USD, Preview and Storm. It's a bit of a strange session, because those are very technical topics, but it stays on a very high level on all of them. It serves more as a quick overview, so people know what updates to look for in the visionOS 2 documentation.
    • I found it interesting to see that the handling of and tooling for 3D files in macOS is improving steadily, really moving them towards first-class-citizen status.
What’s new in USD and MaterialX - WWDC24 - Videos - Apple Developer
Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a…
  • (Must watch? 1/3) What’s new in Quick Look for visionOS (link) explains how to integrate Quick Look functionality into any visionOS app, to preview 3D assets and other spatial media. It also covers improvements to how Quick Look works. Mainly interesting to watch if you use Quick Look functionality in your visionOS app; see the sketch below.
What’s new in Quick Look for visionOS - WWDC24 - Videos - Apple Developer
Explore how Quick Look in visionOS can elevate file preview and editing experiences in your app. We’ll cover the integration of in-app…
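To give an idea of how little code a basic in-app preview takes, here's a minimal SwiftUI sketch using the cross-platform quickLookPreview modifier (the asset name is a placeholder, and the session covers visionOS-specific behaviour well beyond this):

```swift
import SwiftUI

struct AssetPreviewView: View {
    // URL of the item to preview; setting it presents the system Quick Look UI
    @State private var previewURL: URL? = nil

    var body: some View {
        Button("Preview rocket") {
            // Hypothetical USDZ asset bundled with the app
            previewURL = Bundle.main.url(forResource: "rocket", withExtension: "usdz")
        }
        .quickLookPreview($previewURL)
    }
}
```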
  • (Must watch? 1/3) Discover area mode for Object Capture (link) goes over improvements to the photogrammetry technology offered by Apple. In addition to capturing objects, it now supports area capture. It's interesting to see Apple pushing its built-in capabilities forward, offering a free alternative to the third-party solutions in the market. Good to watch if you're using this type of technology; a sketch of the basic reconstruction workflow follows below.
Discover area mode for Object Capture - WWDC24 - Videos - Apple Developer
Discover how area mode for Object Capture enables new 3D capture possibilities on iOS by extending the functionality of Object Capture to…
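For context, the reconstruction side of Object Capture is built around PhotogrammetrySession, which takes a folder of captured images and produces a USDZ model. A minimal sketch of that existing object workflow (paths are placeholders; the new area mode options are covered in the session itself):

```swift
import RealityKit

// Reconstruct a USDZ model from a folder of captured images (sketch)
func reconstructModel() async throws {
    // Placeholder locations for the captured images and the output model
    let imagesFolder = URL(fileURLWithPath: "/tmp/captures", isDirectory: true)
    let outputModel = URL(fileURLWithPath: "/tmp/model.usdz")

    let session = try PhotogrammetrySession(input: imagesFolder)
    try session.process(requests: [.modelFile(url: outputModel)])

    // The session reports progress and results as an async stream
    for try await output in session.outputs {
        switch output {
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}
```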

Reality Composer Pro

In a sense, Reality Composer Pro is Apple's version of the main Unity UI, where you compose your scenes. Like many other similar XR development tools, it's a bit like a Unity-lite, gradually adding features in an attempt to catch up with Unity and Unreal. There were two interesting sessions about it, one more focused on new improvements, and the other more an overview session covering multiple aspects based on a sample project.

  • (Must watch? 2/3) Compose interactive 3D content in Reality Composer Pro (link) covers the addition of timelines to Reality Composer Pro. It's a really good fundamental feature, but it also shows how much Apple still has to build compared to full-fledged real-time engines like Unity and Unreal. This session also covers a lot of other improvements in the area of animation in Reality Composer Pro, so it's pretty vital to watch if you're doing anything in that area; see the triggering sketch below.
Compose interactive 3D content in Reality Composer Pro - WWDC24 - Videos - Apple Developer
Discover how the Timeline view in Reality Composer Pro can bring your 3D content to life. Learn how to create an animated story in which…
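For what it's worth, behaviours authored in Reality Composer Pro can be triggered from code. Here's a minimal sketch using the notification trigger pattern from Apple's samples; I'm recalling the exact keys from memory, so verify against the session's sample project ("StartStory" is a hypothetical trigger identifier set in the editor):

```swift
import Foundation
import RealityKit

// Fire a Reality Composer Pro notification trigger by name (sketch)
func startStoryTimeline(on sceneRoot: Entity) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": sceneRoot.scene as Any,
            "RealityKit.NotificationTrigger.Identifier": "StartStory"
        ]
    )
}
```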
  • (Must watch? 2/3) Enhance the immersion of media viewing in custom environments (link) covers a lot of topics based on a sample app, but most of it is about Reality Composer Pro. It's interesting because it pulls together a number of topics into one coherent whole.
Enhance the immersion of media viewing in custom environments - WWDC24 - Videos - Apple Developer
Extend your media viewing experience using Reality Composer Pro components like Docking Region, Reverb, and Virtual Environment Probe…
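Both sessions assume the content authored in Reality Composer Pro ends up inside a RealityKit app. For completeness, the standard way to load a package scene into a RealityView looks like this minimal sketch (bundle and scene names are the Xcode visionOS template defaults; adjust to your project):

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // the template's Reality Composer Pro package (assumption)

struct MediaEnvironmentView: View {
    var body: some View {
        RealityView { content in
            // "Scene" is the default scene name in a new Reality Composer Pro package
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```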

RealityKit

RealityKit is the framework that forms the foundation of the logic of your visionOS app or game. Because it's so fundamental, the first session below is mandatory viewing, as it gives an overview of all the improvements to RealityKit.

  • (Must watch? 3/3) Discover RealityKit APIs for iOS, macOS and visionOS (link) gives an overview of many of the new things in RealityKit (e.g. custom hover effects, custom hand tracking, real-time physics, lights and shadows, and improved cross-platform support), based on an example spaceship game. It's a useful session because it's the entry point to several of the other deep dive sessions, so it offers a very good starting point; a small hover effect sketch follows below.
Discover RealityKit APIs for iOS, macOS and visionOS - WWDC24 - Videos - Apple Developer
Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover…
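To give a flavour of the new APIs, here's a minimal sketch of one of them: giving an entity a gaze-driven hover effect. The spotlight style and its parameters are my recollection from the session, so treat the exact names as assumptions:

```swift
import RealityKit
import UIKit

// Make an entity respond to gaze with a spotlight-style hover effect (sketch).
// An input target and collision shape are required for the system to hit-test it.
func makeHoverable(_ entity: ModelEntity) {
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    entity.components.set(HoverEffectComponent(
        .spotlight(.init(color: .green, strength: 2.0))
    ))
}
```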
  • (Must watch? 1/3) Enhance your spatial computing app with RealityKit audio (link) builds directly on the previous session by adding spatial audio to the sample spaceship game. A great session if you're doing something with audio, but fine to skip if you're not; the sketch below shows the basic shape of the API.
Enhance your spatial computing app with RealityKit audio - WWDC24 - Videos - Apple Developer
Elevate your spatial computing experience using RealityKit audio. Discover how spatial audio can make your 3D immersive experiences come…
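As a taste of that API shape, here's a minimal sketch of attaching positional sound to an entity (the file name is a placeholder and the gain value is arbitrary):

```swift
import RealityKit

// Attach and play positional audio on an entity (sketch)
func playEngineHum(on entity: Entity) async throws {
    // "engineHum.wav" is a placeholder asset in the app bundle
    let resource = try await AudioFileResource(named: "engineHum.wav")
    entity.components.set(SpatialAudioComponent(gain: -10))  // relative gain in dB
    entity.playAudio(resource)
}
```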
  • (Must watch? 2/3) Build a spatial drawing app with RealityKit (link) uses an example project to dive a bit further into some of the topics from the basic RealityKit session. The second half, however, dives into extremely deep technical topics around meshes and memory, so that part is definitely not for everyone.
Build a spatial drawing app with RealityKit - WWDC24 - Videos - Apple Developer
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience…
  • (Must watch? 2/3) Break into the RealityKit debugger (link) is a crucial session once you get into building with RealityKit at any serious level. Eventually something will not work as expected and you'll need to debug why. That's the moment to pull up this session, because it covers the debugging tooling, as well as a few common problems and their causes and solutions. Crucial to have in your developer toolbox.
Break into the RealityKit debugger - WWDC24 - Videos - Apple Developer
Meet the RealityKit debugger and discover how this new tool lets you inspect the entity hierarchy of spatial apps, debug rogue…

SwiftUI

While RealityKit covers the logic, SwiftUI is the foundation for most user interfaces in visionOS. Whether these sessions are useful depends a lot on how much you use SwiftUI in your visionOS app or game.

  • (Must watch? 2/3) Work with windows in SwiftUI (link) is an introductory level session on SwiftUI, basically at tutorial level. It does, however, cover some of the new updates to the framework. It's a useful session if you need to be up to date on the latest in SwiftUI; the sketch below shows the basic window plumbing.
Work with windows in SwiftUI - WWDC24 - Videos - Apple Developer
Learn how to create great single and multi-window apps in visionOS, macOS, and iPadOS. Discover tools that let you programmatically open…
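For reference, the programmatic window plumbing the session builds on is small: declare WindowGroups with identifiers, then open them through the openWindow environment action. A minimal sketch (ids and views are hypothetical):

```swift
import SwiftUI

@main
struct GalleryApp: App {
    var body: some Scene {
        WindowGroup(id: "main") {
            MainView()
        }
        // A second window that can be opened programmatically
        WindowGroup(id: "inspector") {
            Text("Inspector")
        }
    }
}

struct MainView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open inspector") {
            openWindow(id: "inspector")
        }
    }
}
```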
  • (Must watch? 2/3) Dive deep into volumes and immersive spaces (link) mixes both design and technical topics, with a lot of SwiftUI mixed in. Useful on both fronts if you're considering building something new for visionOS; see the scene declaration sketch below.
Dive deep into volumes and immersive spaces - WWDC24 - Videos - Apple Developer
Discover powerful new ways to customize volumes and immersive spaces in visionOS. Learn to fine-tune how volumes resize and respond to…
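As a quick reminder of the two scene types the session revolves around, here's a minimal sketch declaring a volume and an immersive space side by side (ids, size and content are hypothetical):

```swift
import SwiftUI
import RealityKit

@main
struct PlanetApp: App {
    var body: some Scene {
        // A volume: a bounded 3D window the user can place in the room
        WindowGroup(id: "planet-volume") {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.2)))
            }
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)

        // An immersive space: unbounded content placed around the user
        ImmersiveSpace(id: "planet-space") {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateSphere(radius: 1.0)))
            }
        }
    }
}
```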
  • (Must watch? 1/3) Create custom hover effects in visionOS (link) offers a deep dive into the new custom hover effects functionality. Mostly SwiftUI, and only relevant if you're making use of this specific feature.
Create custom hover effects in visionOS - WWDC24 - Videos - Apple Developer
Learn how to develop custom hover effects that update views when people look at them. Find out how to build an expanding button effect…

Enterprise API and ARKit

  • (Must watch? 3/3) Introducing enterprise APIs for visionOS (link) covers the new Enterprise API, which offers some really powerful new capabilities to developers. Because those capabilities have serious privacy implications, they are only available to enterprise developers who meet certain criteria. Still, access to the raw main camera feed for running custom computer vision algorithms, screen capture for remote support scenarios, QR code / barcode scanning, and the ability to push the hardware harder (getting access to the Neural Engine part of the chip and trading battery life and heat for performance) are really interesting. Make sure you at the very least check out the 2-minute demo around the 9-minute mark in the video; a rough camera-access sketch follows below.
    • These capabilities are the most groundbreaking changes in visionOS 2, and offer a great look into the future of what these devices can do as they integrate further into our daily lives. My guess is that as we get used to them and figure out the etiquette and security measures around user privacy, more of these capabilities will gradually become available in consumer applications, behind user permission protections. Once that happens, it should bring a whole bunch of interesting new use cases to life.
Introducing enterprise APIs for visionOS - WWDC24 - Videos - Apple Developer
Find out how you can use new enterprise APIs for visionOS to create spatial experiences that enhance employee and customer productivity…
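Below is a rough sketch of what main camera access looks like with the enterprise ARKit APIs. I'm reconstructing the names and signatures from the session, so treat the whole thing as an assumption and check the sample code linked from the session page (it also requires the corresponding entitlement and enterprise license file):

```swift
import ARKit

// Enterprise-only: stream frames from the main camera (sketch, API shape assumed)
func streamMainCamera() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()
    try await session.run([provider])

    // Pick a video format from the left main camera
    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer is the raw frame for your computer vision pipeline
            _ = sample.pixelBuffer
        }
    }
}
```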
  • (Must watch? 2/3) Create enhanced spatial computing experiences with ARKit (link) goes into the improved capabilities of ARKit on visionOS 2. Normally I would highly recommend this, but ARKit is only available in the immersive space type of app on visionOS, so these capabilities are out of reach for many applications. Still, improvements to room tracking, tracking of slanted surfaces, object tracking, world tracking and hand tracking are all very welcome and solid all around; the hand tracking sketch below shows the general API pattern.
Create enhanced spatial computing experiences with ARKit - WWDC24 - Videos - Apple Developer
Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking…
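The general ARKit-on-visionOS pattern is the same across providers: run them on an ARKitSession inside an immersive space and consume async anchor updates. A minimal hand tracking sketch (error handling and authorization checks elided):

```swift
import ARKit

// Track hands and read the index fingertip pose (sketch; immersive space only)
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard let joint = anchor.handSkeleton?.joint(.indexFingerTip),
              joint.isTracked else { continue }
        // Compose the joint's world-space transform
        let worldTransform = anchor.originFromAnchorTransform * joint.anchorFromJointTransform
        _ = worldTransform
    }
}
```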
  • (Must watch? 2/3) Explore object tracking for visionOS (link) deep dives into one of the ARKit capabilities from the previous session, namely object tracking. It deals with the full workflow: scanning a 3D object (see the relevant earlier session), turning that into a machine learning model for tracking, and then integrating that into your app. It's not necessarily a groundbreaking feature, because object tracking has been available in third-party solutions (e.g. Vuforia) for years, but it's still interesting to see Apple's approach, and it's a fun capability to play with; the sketch below shows roughly how the runtime pieces fit.
Explore object tracking for visionOS - WWDC24 - Videos - Apple Developer
Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build…
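At runtime, the workflow boils down to loading the trained reference object and running it on a tracking provider. A sketch, with the file name as a placeholder and the API shape as my best recollection from the session:

```swift
import ARKit

// Track a real-world object from a trained .referenceobject file (sketch)
func trackReferenceObject() async throws {
    guard let url = Bundle.main.url(forResource: "globe", withExtension: "referenceobject")
    else { return }
    let referenceObject = try await ReferenceObject(from: url)

    let session = ARKitSession()
    let tracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([tracking])

    for await update in tracking.anchorUpdates {
        // The anchor's transform follows the physical object as it moves
        _ = update.anchor.originFromAnchorTransform
    }
}
```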

visionOS and the Web

I'm always fascinated by the odd relationship Apple has with the spatial web. On the one hand, there is definite progress each year, but on the other it always lags behind its competitors, certainly Meta with Quest OS, and Apple's web implementations always seem fundamentally limited.

The major change this year is that WebXR is no longer experimental, so it doesn't require diving into Safari settings to enable experimental toggles, making it much more accessible. The HUGE caveat is that only the VR mode of WebXR is supported, because the camera passthrough video feed is not available to the browser, eliminating all web-based augmented reality use cases. As visionOS is primarily about those use cases, this unfortunately hamstrings the web on visionOS to the point where you can fairly ask what the point of supporting it at all is.

I'm assuming this is because it's hard to lock down websites' access to the camera feed once it's available, which would make it a sensible privacy protection, similar to the limitations on the Enterprise API. Still, given Apple's history and its current struggles with EU legislation, I think it's not unfair to assume business considerations play a role as well: transaction fees can't be enforced on web-based transactions, which may well factor into how Apple prioritises supporting WebXR versus the app-based approach to building visionOS experiences.

Apple released two blog posts with full details on WebKit in Safari 18 and how to test with the simulator and a real device, which are good to read through if you work with the tech. Those posts overlap a lot with the content of the two sessions below.

  • (Must watch? 1/3) Optimize for the spatial web (link) focuses on how to adapt traditional websites to work well when viewed in the flat Safari browser on visionOS (including through speech interaction) and make use of the capabilities of the device to show spatial content in a device-specific way. It also covers how to inspect and debug your website.
    • A fair question is whether it makes sense at all to make these modifications to a generic flat website, because there is pretty much no user base for now. If that user base grows, it might start making sense at some point, and if you're building web experiences tailored to visionOS, it's a useful session.
Optimize for the spatial web - WWDC24 - Videos - Apple Developer
Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting,…
  • (Must watch? 1/3) Build immersive web experiences with WebXR (link) goes into how to build fully immersive web-based virtual reality experiences on visionOS based on the WebXR standard. The session is at an introductory level, so easily accessible.
    • I have to give Apple credit for the privacy aware way in which they deal with permissions around data access, even in VR WebXR experiences which don't get camera access. Very good.
    • It's interesting to see how in this context suddenly the terminology 'virtual reality' is being used by Apple employees, while everywhere else it is avoided completely.
    • The session names a number of third-party frameworks and tools that can be used for web development: three.js, babylon.js, PlayCanvas, Wonderland Engine and A-Frame. It goes a bit more into depth with the last of those, as it is a relatively simple framework to build on.
Build immersive web experiences with WebXR - WWDC24 - Videos - Apple Developer
Discover how WebXR empowers you to add fully immersive experiences to your website in visionOS. Find out how to build WebXR experiences…

Other frameworks, functionalities and topics

Then there's a grab bag of various other topics, often about specific SDKs and functionalities. The first one, about spatial Personas, is the most interesting, not so much because of the new custom templates it covers, but because I think spatial Personas are underappreciated in general and worth paying more attention to.

  • (Must watch? 2/3) Customize spatial Persona templates in SharePlay (link) covers everything about spatial Personas and SharePlay, a pretty nice visionOS feature in which Apple is clearly ahead of Meta. The session describes custom placement of groups of people relative to each other and to the content in a remote multiplayer session, and goes through both design considerations and technical implementation, based on an example app. A good topic to know a bit about; the template sketch below gives a feel for the API.
Customize spatial Persona templates in SharePlay - WWDC24 - Videos - Apple Developer
Learn how to use custom spatial Persona templates in your visionOS SharePlay experience to fine-tune the placement of Personas relative…
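To give a feel for the API, here's a sketch of a simple two-seat template; I'm reconstructing the SpatialTemplate names from memory of the session, so treat them as assumptions:

```swift
import GroupActivities

// A custom seating arrangement for spatial Personas in SharePlay (sketch)
struct SideBySideTemplate: SpatialTemplate {
    var elements: [any SpatialTemplateElement] {
        [
            // Two seats half a meter apart, set back from the shared app content
            .seat(position: .app.offsetBy(x: -0.5, z: 1.5)),
            .seat(position: .app.offsetBy(x: 0.5, z: 1.5)),
        ]
    }
}

// Applying it (assumed usage): when configuring the group session's
// system coordinator, set:
//   systemCoordinator.configuration.spatialTemplatePreference = .custom(SideBySideTemplate())
```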
  • (Must watch? 1/3) Meet TabletopKit for visionOS (link) is about a completely new SDK which offers a lot of basic building blocks for building tabletop games, so teams can focus on the actual game mechanics without having to spend time on fundamentals like placement, the board, seats, tiles, rules, effects, multiplayer, SharePlay, etc. Definitely a good session if you're working on a tabletop game or something similar, but not that interesting otherwise.
    • Interesting that Apple expects this to be such a core use case that it made a dedicated framework for it.
Meet TabletopKit for visionOS - WWDC24 - Videos - Apple Developer
Build a board game for visionOS from scratch using TabletopKit. We’ll show you how to set up your game, add powerful rendering using…
  • (Must watch? 1/3) Build compelling spatial photo and video experiences (link) covers the basics of the various media types and the options for displaying them. It also covers some of the API changes related to recording (on suitable iPhones) and displaying the media. It's not strictly a design video, as it also covers technical topics, but the focus is on what to do with photo and video, and it doesn't go extremely in depth. Interesting if you're working with these specific types of media.
Build compelling spatial photo and video experiences - WWDC24 - Videos - Apple Developer
Learn how to adopt spatial photos and videos in your apps. Explore the different types of stereoscopic media and find out how to capture…
  • (Must watch? 1/3) Explore multiview video playback in visionOS (link) is even more specific, as it dives into showing multiple video feeds at once, which is an extremely narrow use case.
Explore multiview video playback in visionOS - WWDC24 - Videos - Apple Developer
Learn how AVExperienceController can enable playback of multiple videos on Apple Vision Pro. Review best practices for adoption and…
  • (Must watch? 1/3) Render Metal with passthrough in visionOS (link) is potentially the most hardcore session of them all. It's a pretty cool deep dive into building highly custom graphics combined with ARKit, but this is really only for the most extreme developers and artists.
Render Metal with passthrough in visionOS - WWDC24 - Videos - Apple Developer
Get ready to extend your Metal experiences for visionOS. Learn best practices for integrating your rendered content with people’s…
  • (Must watch? 1/3) Get started with HealthKit in visionOS (link) is much more about accessing HealthKit data gathered on other devices and displaying it on visionOS than about using visionOS as a sensor or health tracking device (not surprising, with the Meta Quest being much more of a fitness device). The sketch below shows the standard read pattern.
Get started with HealthKit in visionOS - WWDC24 - Videos - Apple Developer
Discover how to use HealthKit to create experiences that take full advantage of the spatial canvas. Learn the capabilities of HealthKit…
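Reading health data on visionOS follows the familiar HealthKit pattern from iOS. A minimal sketch that requests read access to step counts and sums today's steps (the identifiers are standard HealthKit; the rest is illustrative):

```swift
import HealthKit

// Sum today's step count after requesting read authorization (sketch)
func todaysSteps() async throws -> Double {
    let store = HKHealthStore()
    let stepType = HKQuantityType(.stepCount)
    try await store.requestAuthorization(toShare: [], read: [stepType])

    let startOfDay = Calendar.current.startOfDay(for: .now)
    let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: .now)

    return try await withCheckedThrowingContinuation { continuation in
        let query = HKStatisticsQuery(quantityType: stepType,
                                      quantitySamplePredicate: predicate,
                                      options: .cumulativeSum) { _, stats, error in
            if let error {
                continuation.resume(throwing: error)
                return
            }
            continuation.resume(returning: stats?.sumQuantity()?.doubleValue(for: .count()) ?? 0)
        }
        store.execute(query)
    }
}
```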
  • (Must watch? 1/3) Explore game input in visionOS (link) is, I think, the only session in which Unity is mentioned at all, even if only for 10 seconds or so. It goes into the various options for working with gestures and game controllers, when those are connected. Interesting if you're working on a use case which depends on either; the sketch below shows the basic controller hookup.
Explore game input in visionOS - WWDC24 - Videos - Apple Developer
Discover how to design and implement great input for your game in visionOS. Learn how system gestures let you provide frictionless ways…
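Controller input comes through the standard GameController framework. A minimal sketch that listens for a controller connecting and reads the left thumbstick (the handler logic is illustrative):

```swift
import GameController

// Observe controller connections and stream thumbstick values (sketch)
func observeControllers() {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { notification in
        guard let controller = notification.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }

        // Called for every input element change on the controller
        gamepad.valueChangedHandler = { gamepad, _ in
            let stick = gamepad.leftThumbstick
            print("Left stick: \(stick.xAxis.value), \(stick.yAxis.value)")
        }
    }
}
```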
  • (Must watch? 1/3) Bring your iOS or iPadOS game to visionOS (link) goes through considerations when bringing an iOS game to visionOS, but only for iOS native games, not those built in Unity, Unreal, etc. So it's only relevant to those very specific cases.
Bring your iOS or iPadOS game to visionOS - WWDC24 - Videos - Apple Developer
Discover how to transform your iOS or iPadOS game into a uniquely visionOS experience. Increase the immersion (and fun factor!) with a 3D…

And with that, we're through the list!

As I said, I'm actually pretty impressed by how much is in there compared to visionOS 1. It's not all flashy new features, but many, many small, gradual improvements, additions and upgrades all over the developer side of the OS, which is very welcome. People tend to underestimate how important this type of progress is to making a platform a success, so it gives me high hopes for what's to come.


A bit about this newsletter

Each month I try to round up all the interesting developments in the XR developer landscape. New hardware and software releases, hackathons, interesting tooling, etc. Feel free to reach out to me on LinkedIn, for instance if I missed anything which definitely should be in this monthly round up next time. And if you like the newsletter, subscribe to receive it by email as soon as it comes out.

Hope it is useful!