When it’s used well, spatial audio can lend extra immersion and depth to music, movies, and TV shows. Apple has offered the feature for video across its iPhone, iPad, Mac, AirPods, and Beats products since 2020, and last year, the company brought spatial audio to Apple Music. Spatial audio is meant to make content sound more multidimensional than traditional stereo, leaving you feeling like sound is coming at you from all directions.
With the release of iOS 16, Apple rolled out a new tool for making the experience even better: personalized spatial audio. This process calibrates the sound profile to the specific shape of your ears and head, which can enhance the 360-degree sensation and lead to more convincing instrument and vocal placement in the sound field.
The procedure involves pointing your iPhone’s front-facing camera at your ears to capture data on their shape and contours. This isn’t the most “Apple-like” feature in terms of simplicity; getting it right can be tricky and may take a few tries. But the end result is a spatial audio experience that’s customized just for you. In a support document on its website, Apple explains that the feature changes “how audio is rendered” for each individual person “to better match how you personally hear sound.” After that, you can decide whether you prefer the more immersive Dolby Atmos songs on Apple Music — or if you’re equally happy sticking with stereo.