One of the new features in iOS 16, and something that was highlighted again at Apple’s event on Wednesday, is custom spatial audio. Once you’ve installed the update, which rolls out on September 12, you’ll be able to create a custom sound profile that should improve the sense of immersion and overall spatial audio experience you get from AirPods.
To produce this custom setting, Apple uses the iPhone’s front-facing TrueDepth camera to scan your ears. The process, which involves holding your iPhone about 10 to 20 centimeters from the side of your head, takes less than a minute, and the resulting data is then used to optimize spatial audio for your unique ear shape. “The way we all perceive sound is unique, based on the size and shape of our head and our ears,” Apple’s Mary-Ann Rau said during the keynote. “Personalized Spatial Audio will deliver the most immersive listening experience by precisely placing sounds in space that are tuned just for you.”
But Apple is not the first company to go down this path. Sony has offered ear-based personalization for its 360 Reality Audio format since 2019, on supported music services like Amazon Music, Tidal, Deezer and Nugs.net. Conceptually, it’s very similar: both Sony and Apple try to determine your ear structure and adjust spatial audio processing to account for the unique creases and contours of your ears. The goal is to preserve the 3D audio effect and eliminate any quirks that dull the sense of immersion.
Here’s how Sony explained the benefits to me in June, courtesy of Kaz Makiyama, VP of Video and Sound at Sony Electronics:
Humans are able to recognize spatial sound sources by the subtle changes in the intensity and timing of sound entering the left and right ears from the sound source. In addition, the sound can depend on the shape of our head and our ears. Thus, by analyzing and reproducing the characteristics of both ears by taking pictures of the ears, this technology makes it possible to reproduce the sound field while using headphones.
Sony’s approach, however, is slightly clumsier than Apple’s. Apple’s ear scan is built right into iOS settings. To create a personalized sound field with Sony products, by contrast, you need to take an actual photo of each ear with your phone’s camera through the Headphones Connect app.
These images are uploaded to Sony’s servers for analysis, and then Sony retains them for an additional 30 days so that they can be used for internal research and feature enhancements. The company says the ear photos are not personally associated with you during this window.
That’s not to say Apple’s ear-scanning procedure is flawless. Throughout the iOS 16 beta period, some users on social media and Reddit have described the process as tedious, noting that it sometimes fails to detect an ear at all. The truth is there may be no dead-simple way to get an accurate reading of your ear shape.
Still, the consensus seems to be that it’s worth the hassle: these custom profiles often make a noticeable difference and can improve your perception of spatial audio. And unlike Sony, Apple doesn’t take actual photos: the TrueDepth camera captures a depth map of your head and ears, the same way Face ID learns your facial features.
Apple’s website notes that once you’ve created a custom spatial audio profile on an iPhone, it will sync across your other Apple devices, including Macs and iPads, for a consistent experience. That syncing won’t arrive until October at the earliest, though: it requires the next macOS and iPadOS updates. Custom spatial audio will be supported on third-generation AirPods, both generations of AirPods Pro, and AirPods Max.
Apple never claimed to achieve any firsts with custom spatial audio. Company executives have consistently said their goal is to deliver the best execution of meaningful features, even though others – in this case, Sony – are already pushing in that direction.