Using the power of the front-facing TrueDepth camera on the iPhone, you can make audio sound a little more matched to your ears, but does it work with your earphones?
We’ve seen quite a few announcements from Apple recently, thanks in part to the launch of the iPhone 14 and its more premium siblings, the iPhone 14 Pro and Pro Max, but one thing you might have missed has the potential to change the way you hear music.
Inside the announcement of the second-generation AirPods Pro, Apple noted that it was also going to let you add a personalised setting to make audio on those new AirPods sound better. Essentially, the company would be mapping the size of your ears to cater its dimensional sound to the positions of your ears.
If you’ve tried Apple Music’s catalogue in Dolby Atmos and experienced any of the ten albums you simply have to hear (or anything else regularly being added), you may know what spatial audio does to music.
While a good pair of headphones can increase the depth of a soundstage, spatial audio goes above and beyond, not only making it seem larger with a brand new mix, but also adapting what you hear to the position of your head. It’s not the same experience as your regular music listening, but something completely different, and quite immersive.
This year, that idea is going a little further with the introduction of personalised spatial audio, which will see Apple channel a Sony idea, and execute it for a service with more subscribers.
Personalised spatial audio comes from Sony’s 360 Reality Audio
You probably don’t know much about Sony 360 Reality Audio, and really, we wouldn’t be surprised. Available on only a handful of services, it’s Sony’s approach to spatial audio that provides a similar sense of space, but lacks the head-tracking.
The take-up for 360 Reality Audio has never been tremendous, though, with a small catalogue, lower volume, and speakers that may not make as much of a dent with 360 sound as you might think.
But Sony did come up with the personalised spatial setup first, capturing pictures of your ears to tune your listening app to the headphones you own. It’s been a part of Sony’s Headphones app for years, popping up with the WH-1000XM3 and appearing on its headphones since then.
In iOS 16, Apple has adopted that idea and made it work in a much more seamless fashion. It’s still weird taking photos of your ears using your phone, but at least Apple makes sounds to let you know you’ve done it right, and using the TrueDepth front-facing camera of the modern iPhone, it can get a spatial look at your ears.
Now we just need to know what it works on, and if it works.
Good news: personalised spatial audio works on the AirPods and AirPods Pro
So first the good news: if you have a pair of the 1st-gen AirPods Pro, the 3rd-gen AirPods, the AirPods Max, or even a pair of the Beats Fit Pro, you have something that supports personalised spatial audio.
Even though Apple’s AirPods Pro 2nd-gen announcement seemed to imply you need the new model, you don’t. The first two AirPods variations, the originals from 2017 and the subsequent follow-up in 2019, won’t work with the tech, however.
All you need to get set up with personalised spatial audio is one of those supported models or anything newer from Apple (and likely Beats), and an iPhone running iOS 16 or higher. When iPadOS 16.1 is released for the iPad, it’ll work there, too.
Armed with an iDevice running the right operating system, plus a pair of earphones or headphones that supports the tech, you simply need to go into the settings, trigger the setup procedure, and let your phone map pictures of your ears. Easy.
Does personalised spatial audio change how you hear?
The bigger question may actually be whether Apple’s addition of personalised spatial audio does anything tremendously different.
Unfortunately, it’s one that’s not remarkably easy to answer. Once mapped, the information is shared across every pair of compatible earphones you own, so if you have both the AirPods Pro and AirPods Max and you jump between them, you won’t need to map each time.
Map once and move on. Great.
However, as to whether the personalised spatial audio technology changes how you hear music, that’s more complex. You can’t simply toggle it off just yet; Apple removes the mapped data every time you turn personalised spatial audio off, so switching back on means scanning your ears again.
In our tests with the technology, we found the differences to be subtle. There was a slightly warmer sound with a touch more impact, as some sounds appeared a little closer. Tested with the Pickr Atmos Sound Test, tracks felt a little colder yet balanced without the tech, while with personalised switched on, they read as more direct, but also a touch warmer.
The difference comes across as a slightly more refined approach to binaural audio, with personalised spatial adding to the scene as if each track had been recorded with you standing in the centre. It’s close, but the personalised sound does work slightly better. Your mileage may vary, of course, and it will depend on whether what you’re listening to has been engineered with spatial audio in mind.
But the tech does work without you necessarily needing to buy the new AirPods Pro if you own another supported pair, which is a win for everyone who already owns something compatible.