Apple’s next big product looks like it’ll cost $3,000, rest on your face and need to be tethered to a battery pack. Whatever this expected VR headset ends up being, it isn’t immediately clear what it’ll do or who it’s for. The Reality Pro headset, as it’s expected to be called when it’s likely unveiled at Apple’s WWDC developer conference on June 5, is Apple’s biggest new product in nearly a decade. It’s also totally different from anything Apple has made before.
VR headsets have been a standard consumer tech thing for years, and your family, or families you know, may already have one lying in a corner. They’re used for games, fitness, creative collaboration, even theater. Still, VR and AR have been outlier technologies, not deeply connected enough to the phones, tablets and laptops most of us use every day.
Apple could change that. And of course, don’t expect the word “metaverse” to be uttered even once. The metaverse became Meta’s buzzword to envision its future of AR and VR. Apple will have its own parallel, possibly unique, pitch.
A connection to everything?
I pair my Quest 2, from Meta, to my phone, and it gets my texts and notifications. I connect it to my Mac to cast extra monitors around my desk using an app called Immersed. But VR and AR don’t often feel deeply intertwined with the devices I use. They aren’t seamless in the way my watch feels when used with an iPhone, or AirPods feel when used with an iPad or Mac.
Apple needs this headset to bridge all of its devices, or at least make a good starting effort. Reports say the headset will run iPad apps on its built-in 4K displays, suggesting a common app ecosystem. It’s also possible that the Apple Watch could be a key peripheral, tracking fitness and also acting as a vibrating motion-control accessory.
VR is a self-contained experience, but mixed reality – which Apple’s headset should lean on heavily – uses pass-through cameras to blend virtual things with video of the real world. In Apple’s case, its own devices could act as spatially linked accessories, lending their keyboards and touchscreens, and showing virtual screens springing from real ones.
Apple’s expected headset is supposed to be self-contained, a standalone device like the Quest 2 and Quest Pro. But that interconnectivity, and its position in Apple’s continuity-handoff connected ecosystem, is a big opportunity and a big question mark.
However, Apple does have a big AR head start: Its iOS ecosystem has supported AR for years, and the iPhone and iPad Pro already have depth-sensing lidar scanners that can map out rooms in ways that Apple’s headset should replicate. Apple could emphasize making its existing AR tools on other devices more usable and visible through a new interface.
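The room mapping mentioned here is already exposed to iPhone and iPad Pro developers through ARKit’s scene reconstruction. A minimal sketch of today’s API (assuming a lidar-equipped iOS device; nothing here is confirmed headset API):

```swift
import ARKit

// Sketch: start lidar-based scene reconstruction with ARKit, as iPhone
// and iPad Pro apps can today. Nothing here is headset-specific.
class RoomMapper: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Mesh reconstruction requires a lidar-equipped device.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        config.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(config)
    }

    // ARKit delivers the scanned room incrementally as ARMeshAnchors.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("Mapped \(mesh.geometry.faces.count) faces near \(mesh.transform.columns.3)")
        }
    }
}
```

If the headset replicates this pipeline, apps built on these existing ARKit calls would be a natural porting path.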
Apple’s head of AR, Mike Rockwell – the person expected to be leading this new headset’s development – told me in a conversation about AR in 2020 that “AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful. For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort.”
How do we control it?
I’m less curious about the Apple headset display – which sounds extremely promising with a possible 4K resolution per eye and a Micro OLED display – and more focused on how Apple solves what we do with our hands.
Interfaces in VR and AR are very much a work in progress. VR has tended to lean on split game controllers for most inputs, with optional (and steadily improving) hand tracking that still isn’t perfected.
Apple isn’t expected to have any controller at all with its Reality Pro headset. Instead, it’ll likely use both eye tracking and hand tracking to create a more accurate and possibly streamlined style of interface that could make targeting intended actions feel faster. Eye tracking already works this way, sometimes, in headsets that use it: The PlayStation VR 2 has some games that use eye tracking for controlling menus.
Accessibility is a big question here. Apple’s design choices are often very accessibility-conscious, and VR and AR headsets often rely on eye movement or physical hand movements that aren’t always easy for everyone. Voice control is a possible option here, or maybe some Apple Watch-connected functions that improve gesture accuracy and offer some touch controls could be in the cards, too. I don’t know. Apple already added some gesture controls for accessibility purposes on the Apple Watch, so the door’s open.
A lot of hand gestures in VR feel complicated to me, and involve lots of movement. Can Apple make a gesture language that feels as intuitive and as easy as multitouch on iPhones and iPads? It’s a big hurdle.
Could it be a fitness machine?
VR has already been a surprisingly effective fitness tool for years, and Apple could open that landscape a lot further.
I’ve used Beat Saber and Supernatural on the Quest 2 for years as home exercise options, but the Quest 2 (and most VR headsets) aren’t designed with fitness in mind. Foam and silicone face pieces get sweaty, hardware can feel weirdly balanced, and no company has really spent targeted effort yet on making headgear that’s aimed at breathability and comfort like a piece of athletic equipment. There are plenty of third-party Quest accessories that help, but it still feels like an imperfect situation.
That’s Apple’s wheelhouse. After designing the Apple Watch, AirPods and, most recently, the Watch Ultra’s new straps, conceiving of materials and design that could feel better during workouts seems like an achievable goal. If the Reality Pro feels like a better piece of workout gear, it could inspire others to invest in better designs, too.
Apple should, and could, integrate the Apple Watch and its fitness and health tracking into the headset’s functions. The Quest 2 can do this to some degree, but most smartwatches and fitness trackers, like Fitbit’s, don’t have deep connections with VR headsets yet. They should, and introducing a clear wearable relationship between watch and headset feels like an overdue bridge.
Of all the things I’m trying to imagine Apple positioning an expensive headset to be in people’s lives, a fitness device keeps coming to mind as a much more likely proposition than a gaming gadget. Not that many people own gym equipment, or have space for it. Could headsets fill that role? I think they could. For me, they already do, sometimes.
Will Apple just focus on making it a great wearable display?
I’m starting to wonder if maybe Apple’s first goal with Reality Pro is just to nail a great audio/video experience. I’ve thought of VR/AR glasses as eventually needing to be “earbuds for your eyes,” as easy to use and as good as headphones are now. The VR and AR headsets I’ve used all fall short of being perfect displays, with the exception of the highly expensive Varjo XR-3. Could Apple make the Reality Pro a headset that looks and sounds good enough to truly want to watch movies in?
Reports that the Apple headset runs iPad apps, and that the iPad Pro, with its lidar/camera array, may in fact be the “developer kit” for the headset, make me wonder if it will feel like a wearable extension of iOS rather than a whole new experience.
What about my glasses?
VR and AR headsets aren’t making it easy for me to live with my own eyewear. Some hardware fits right over my own chunky glasses, and some doesn’t. As headsets get smaller, a lot of them are trying to add vision-adjustment diopters right into the hardware – like the Vive XR Elite – or add optional prescription inserts.
Maybe someday we’ll have AR glasses that double as our everyday glasses, and Apple can morph its stores into Warby Parker-style optical shops for retail glasses fittings. In the meantime, headsets we wear only sometimes still need to work without being annoying. Am I going to have to order prescription lenses? How? Will they fit my needs? It’s a big responsibility for VR/AR manufacturers, and I’ve found that some of the insert options don’t meet my heavily nearsighted needs.
What are the killer apps?
Finally, of course, I’m curious about how this headset is defined. The Quest 2 is a game console with benefits. The Quest Pro was aimed at work. The PlayStation VR 2 is a PS5 extension.
The iPhone was a browser, an iPod, and an email device at first. The iPad wanted to be an easy way to read and browse the web. The Apple Watch was a fitness device, iPod, and wrist-communicator. What will Version One of the Apple Mixed Reality Headset be positioned as?
Apple did pepper a ton of extras into the Apple Watch at first, almost to test the waters with possibilities: a camera remote, a virtual way to send love taps and scribbles, voice memos. Reports of an avatar-based FaceTime, multiscreen immersive sports, and maybe 3D immersive versions of Apple’s already 3D-enabled Maps are clear starts. Apple’s collaborative Freeform app could be pitched as a mixed reality workplace, and movies could be watched in a virtual theater, in a way that VR headsets have enabled for years (but maybe here with an even better display and audio). AR-enabled iPhone and iPad home improvement apps, 3D scanning apps, and games could be ported over, leaning on similar lidar-scanning AR functions in-headset. Apple fitness workouts, clearly, could be big. Gaming? With Arcade, or some early partners, sure.
Will any of these be enough? Will Apple stake out a territory that, right now, has had a hard time defining itself beyond gaming? This first headset may not be the one most people buy, but it could be the one that tries to map out some clear directions for development. With Samsung and Google’s headset on the horizon, and possibly a lot more after that, these devices will start to reinvent themselves as they become more phone-connected and portable. Apple has an early chance at shaping that narrative; if it doesn’t, others will. We’ll likely know more, or at least get an early glimpse, at WWDC.