Apple’s AR headset leaves much of the hard work to the iPhone | GeekComparison

Apple offices in northern California: a multi-story glass-and-steel building on a tree-lined campus.

According to a new report in The Information, Apple’s long-rumored mixed-reality headset requires an iPhone within wireless range to function for at least some apps and experiences.

The sources say that Apple completed work on the system-on-a-chip (SoC) for the headset “last year” and that the physical designs for it and two other chips for the device have been completed. Apple has also finished designing the device’s display driver and image sensor.

The SoC will be built on TSMC’s five-nanometer manufacturing process, which is cutting-edge today but may no longer be by the time the headset ships in 2022 or later.

(Note that the headset we’re talking about is the expensive, high-resolution, likely developer-focused mixed-reality headset that Apple is expected to launch in the relatively near future — not the slimmer mass-market AR glasses that are scheduled to come later.)

Crucially, the headset doesn’t include the Neural Engine, the machine-learning processor found in iPhones, iPads, and post-Intel Macs. The Neural Engine already complements Apple’s existing AR technologies and will be essential for future AR apps as well, so the headset will have to rely on a nearby device that has one to handle those workloads.
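To illustrate why the dependency is plausible: in today’s Apple frameworks, apps don’t talk to the Neural Engine directly; they ask Core ML for it via `MLModelConfiguration`, and the framework silently falls back to CPU or GPU when no Neural Engine is available. A minimal sketch (the model name is hypothetical, for illustration only):

```swift
import CoreML

// Request every available compute unit, including the Neural Engine where present.
// On hardware without a Neural Engine, Core ML transparently falls back to CPU/GPU —
// which is why a headset lacking one could still offload such work to a paired iPhone.
let config = MLModelConfiguration()
config.computeUnits = .all

// Hypothetical compiled Core ML model; any .mlmodel is loaded the same way:
// let model = try SceneClassifier(configuration: config)
```

Because the compute-unit choice is a configuration detail rather than part of app logic, moving Neural Engine workloads onto a nearby phone would be invisible to most apps.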

That said, the headset’s SoC has both a CPU and GPU, suggesting it can do some things without having to interact with the phone. However, the hardware in the headset is said to be less powerful than that in Apple’s phones or tablets.

On the other hand (and this is just speculation), the onboard silicon more likely exists to handle tasks that would be inefficient to offload wirelessly than to make the device nominally functional without an iPhone present.

The headset’s SoC is designed to excel at tasks that Apple’s other chips aren’t optimized for. Examples given by The Information’s source include power management to maximize battery life, as well as the ability to “compress and decompress video” and “send wireless data between the headset and the host.”

These details give us real insight into how Apple is approaching the headset’s underlying technologies. But the revelations may come as no surprise to anyone who has been following Apple’s recent work, or work on AR headsets generally.

Other AR devices, like the Magic Leap, rely on external processing units, and the heavy batteries required to power headsets that do all their processing locally are a barrier to user comfort and adoption.

Apple has taken this approach with one major predecessor device: the Apple Watch. The first few iterations of the device required a nearby iPhone to function, but Apple eventually switched to making a version of the wearable that could run completely independently.

Apple has worked with AR developers in recent years to create tools and APIs that facilitate the development of augmented reality apps, such as ARKit and RealityKit. These have been used to create AR apps that run on smartphone screens, not AR glasses, but much of this work would eventually apply to mixed reality glasses.
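The phone-first pattern is visible in those APIs today. A minimal sketch of the standard ARKit/RealityKit setup used in current iPhone AR apps: configure a world-tracking session, then attach RealityKit content to a detected surface.

```swift
import ARKit
import RealityKit

// An ARView renders RealityKit content over the device's camera feed.
let arView = ARView(frame: .zero)

// Configure world tracking with horizontal plane detection and start the session.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
arView.session.run(config)

// Anchor a simple 10 cm box to the first horizontal plane ARKit finds.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
arView.scene.addAnchor(anchor)
```

Since this code is written against the device running the session rather than the display showing it, much of the same app-level work could carry over to a headset that leans on an iPhone for processing.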

Apple has done significantly less public work on VR, which the new headset is also said to support.
