Apple has long been expected to unveil its mixed reality headset as part of WWDC 2023. That shouldn't come as a surprise, in part because Apple has been promoting augmented reality since at least WWDC 2017. Apple began laying the groundwork for the technology used in the headset years ago through development tools on the iPhone and iPad.
That’s when Apple first introduced its ARKit augmented reality framework, which helps developers create immersive experiences on iPhones and iPads.
ARKit was such a major focus for Apple in the years that followed that it devoted big chunks of its live keynotes to showing off and demoing new AR features. Who could forget the wooden tables that served as surfaces for building virtual LEGO sets on stage?
By emphasizing these tools, Apple communicated the importance of augmented reality technology as part of the future of its platforms.
iPhone and iPad software isn't the only thing that has been designed with a mixed reality future in mind. iPhone and iPad hardware has also grown better equipped to serve as a portable window into a world of augmented reality.
Starting with Face ID and Apple's Animoji (and later Memoji) feature, Apple began tuning the iPhone for AR capabilities. Internally, Apple adapted the iPhone's Neural Engine to handle augmented reality workloads with ease.
The iPhone's main camera even gained a dedicated LiDAR sensor, the same kind of technology used by lunar rovers navigating the surface of the Moon and by self-driving cars reading their surroundings.
There was even a hardware update to the iPad Pro that focused almost entirely on adding a LiDAR scanner to the rear camera.
Why? Sure, it helped with focusing and depth detection in Portrait mode photos, but there were also dedicated iPad apps for decorating your room with virtual furniture or trying on glasses without having the physical frames in hand.

What was clear from the start is that ARKit was never entirely intended for immersive experiences on the iPhone and iPad. The phone's screen is too small to be truly immersive, and the tablet is too heavy to hold up for extended periods of use.
That's not to say there's no use for AR on iPhones and iPads. Catching pocket monsters in the real world is more whimsical in Pokémon GO than in an all-digital environment. Dissecting a virtual creature in a classroom can also be more pleasant than touching the actual innards.
Still, the most immersive experiences, the ones that truly trick your brain into thinking you're actually surrounded by whatever digital content you're viewing, require goggles.
Does that mean everyone will care enough about AR and VR to make the headset a hit? At times, the reaction to AR on the iPhone and iPad has been that Apple is offering a solution in search of a problem.
Still, there are some AR experiences that are clearly delightful.

Want to see the actual dimensions of an iPhone or MacBook that has been announced but not yet released? AR is probably how many people experienced the Mac Pro and Pro Display XDR for the first time.
Placing a 1:1 scale virtual space rocket in your living room will also give you a good idea of the scale of those machines. Experiencing a virtual rocket launch that lets you look down at Earth as if you were a passenger can be thrilling, too.
Augmented reality has also been the best method of introducing my kids to dinosaurs without risking time travel to bring the T-Rex back to the present.
As for ARKit, there are a number of ways Apple has been openly building the tools that will be used for developing headset experiences starting next month.

For starters, the framework gave developers the tools, APIs, and libraries they needed to build AR applications in the first place. Motion tracking, scene understanding, light estimation, and camera integration are all table stakes for an AR app.
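To make that concrete, here's a minimal sketch of an ARKit session in Swift. ARSCNView and ARWorldTrackingConfiguration are real ARKit APIs; the view controller wrapped around them is illustrative.

```swift
import UIKit
import ARKit

// A minimal sketch of an ARKit session, assuming a SceneKit-backed view.
// The class itself is illustrative; the ARKit calls are the real APIs.
class MinimalARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking fuses camera frames with motion data;
        // light estimation is on by default, so rendered objects
        // can match the real scene's lighting.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

A few lines of configuration are enough to get motion tracking, plane detection, and light estimation running against the live camera feed.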
Real-world tracking is another important factor. ARKit introduced the tools needed to use hardware sensors such as the camera, gyroscope, and accelerometer to accurately track the position of virtual objects in a real environment through Apple devices.
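That sensor fusion is what lets an app pin a virtual object to a fixed spot in a room. A hedged sketch of the idea, using the real ARAnchor and ARSession APIs (the function and anchor names are made up):

```swift
import ARKit

// Sketch: pinning a virtual object in world space. ARKit fuses camera,
// gyroscope, and accelerometer data so the anchor stays put in the room
// as the device moves around it. The anchor name is illustrative.
func placeAnchor(in session: ARSession) {
    guard let frame = session.currentFrame else { return }

    // Position the anchor half a meter in front of the current camera pose.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.5
    let transform = simd_mul(frame.camera.transform, translation)

    session.add(anchor: ARAnchor(name: "virtualObject", transform: transform))
}
```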
Then there's face tracking. ARKit lets developers tap into the same face tracking capabilities Apple uses to power Animoji and Memoji, complete with facial expression mirroring.
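Here's a hedged sketch of what that looks like with ARKit's real face tracking APIs (ARFaceTrackingConfiguration, ARFaceAnchor, and blend shapes); the class around them is illustrative, and a TrueDepth camera is required:

```swift
import ARKit

// Sketch of the face tracking pipeline behind Animoji and Memoji.
// The class and property names are illustrative.
class FaceMirror: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
        // Face tracking needs a TrueDepth camera, so check support first.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Blend shapes report facial expressions as values from 0 to 1,
    // which an app can mirror onto an avatar in real time.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
        let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), smile: \(smile)")
    }
}
```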
AR Quick Look is the technology behind some of the experiences mentioned above. It's what lets apps place virtual objects, such as products, in the real environment around you. Correctly sizing these objects and remembering their position relative to your device helps create the illusion.
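Developers get most of this for free by handing the system a USDZ model. A minimal sketch, assuming a hypothetical chair.usdz file bundled with the app (QLPreviewController and its data source are the real APIs):

```swift
import UIKit
import QuickLook

// Sketch: presenting a USDZ model with AR Quick Look. The system handles
// placement, real-world scaling, and anchoring. "chair.usdz" is a
// hypothetical asset bundled with the app.
class ProductPreview: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        viewController.present(previewController, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        // Quick Look switches into AR mode automatically for .usdz files.
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        return url as NSURL // NSURL conforms to QLPreviewItem
    }
}
```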
The most recent versions of ARKit have focused on supporting shared AR experiences that can persist between sessions, detecting objects in the environment, and occluding virtual content behind people in the scene. Performance has been steadily tuned over the years too, so the core technology that will power VR and AR experiences on a headset should be pretty solid.
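Both persistence and people occlusion are already exposed through small public APIs. A hedged sketch of each (the function names are made up; frameSemantics, getCurrentWorldMap, and ARWorldMap are real ARKit APIs):

```swift
import ARKit

// Sketch 1: people occlusion, so virtual content renders behind people.
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}

// Sketch 2: persistence. Serialize the session's world map so anchors
// can be restored later, or shared with another device.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}
```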

We look forward to our first official look at Apple's headset on Monday, June 5, when Apple kicks off its next big event. 9to5Mac will be attending the special event, so stay tuned for full, detailed coverage. Good luck to the HTC Vives and Meta headsets of the world.