Apple offices in Northern California.
According to a new report in The Information, Apple’s long-rumored mixed reality headset will require an iPhone in wireless range to work for at least some apps and experiences.
The Information's sources say that Apple completed work on the system-on-a-chip (SoC) for the headset "last year" and that the physical designs for this and two other chips for the device are finished. Apple has also completed development of the device's display driver and image sensor.
The SoC will be based on TSMC's five-nanometer manufacturing process, which is current now but may no longer be cutting-edge when the headset hits the market in 2022 or later.
(Note that the headset in question is the expensive, high-resolution, likely developer-centric mixed reality headset that Apple is expected to launch in the relatively near future, not the slimmer mass-market AR glasses, which are planned for later.)
Crucially, the headset lacks a Neural Engine, the machine learning processor found in iPhones, iPads, and post-Intel Macs. The Neural Engine already complements Apple's existing AR technologies and will be essential for future AR apps; without one of its own, the headset will have to rely on a nearby device with that chip to handle those workloads.
However, the headset's SoC has both a CPU and a GPU, which suggests it can do some things without having to communicate with the phone. That said, the hardware in the headset is reported to be less powerful than that in Apple's phones or tablets.
On the other hand (and we're just speculating), the SoC is more likely intended to handle tasks that would be inefficient to offload over Wi-Fi than to make the device nominally functional without the phone present at all.
The SoC for the headset is designed to do some things that other products cannot. Examples cited by The Information's source include power management to maximize battery life, "compressing and decompressing video," and "transferring wireless data between the headset and the host."
These details give us a lot of insight into Apple's exact approach to the underlying technologies for the headset. But the revelations may not come as a surprise to those who have followed Apple's work, and AR headsets in general, lately.
Other AR devices, like the Magic Leap, rely on external processing units, and the heavy batteries that would be required to power a headset that does all of its processing locally are a hindrance to user comfort and adoption.
Apple took this approach with an important earlier device: the Apple Watch. The first iterations of the device required an iPhone nearby in order to function, but Apple eventually made a version of the wearable that could work completely independently.
Apple has spent the past few years building tools and APIs for AR developers, such as ARKit and RealityKit, that make it easier to create augmented reality apps. These have so far been used to build AR apps that run on smartphone screens, not AR glasses, but much of that work should ultimately apply to mixed reality headsets.
Apple has worked much less publicly on VR, which the new headset is also expected to support.