Bruh, Do You Even Do Spatial Audio?
Updated: Nov 1, 2020
AirPods revolutionized the wireless audio experience, and now AirPods Pro take it even further with a new class of lightweight, in-ear headphones engineered for 3-D spatial audio, comfort and fit.
AirPods Pro deliver superior sound quality with Adaptive EQ, which automatically tunes the low and mid frequencies of the sound to the shape of an individual's ear, delivering a rich, immersive listening experience.
CREATORS NEED TO START BUILDING IMMERSIVE AUDIO EXPERIENCES. GET READY NOW FOR THE NEXT WAVE OF OBJECT-BASED SOUND
Magic Like You've Never Heard
In June, Apple's Mary-Ann Ionascu announced spatial audio during the WWDC event. It was a major unveiling that at first sounded like overkill, science for the sake of science. Spatial audio creates an experience that makes it seem like the sounds exist somewhere between you and your device. When you look at the screen, the sound doesn't seem to be coming from the screen, and it doesn't sound like it's coming from your headphones either. Rather, AirPods Pro use a combination of hardware and a little Apple magic to create a unique listening experience, unlike anything I've ever heard before.
In her role as Senior Engineer of AirPods Firmware, Mary-Ann shared that a new update would add this amazing feature that will make you forget about two-channel headphones. And that's really the magic behind this miraculous soundstage experience. At the close of her presentation, the Apple engineer stated that the new technology will work with Dolby 5.1, 7.1 and Dolby Atmos encoded entertainment. Let's take a closer look at this new tech Apple is baking into its products.
Dolby Atmos is the latest development in surround sound, a technology known as object-based surround. Object-based surround assigns each sound a dynamic position in the space that surrounds you. Engineers refer to this space as a soundstage; think of where you are sitting and the room around you as that soundstage. Each sound-producing object (like an airplane) has its own individual channel and can be moved anywhere in the three-dimensional soundstage.
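As a mental model, here's a tiny Python sketch of the object-based idea: each sound carries its own position in the soundstage, and the renderer decides what reaches each ear. The panning law below is the standard equal-power curve used in basic audio mixing; this is an illustration of the concept, not Dolby's actual renderer, and all the names here are made up for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class SoundObject:
    """A single sound source with a position in the 3-D soundstage."""
    name: str
    x: float  # left (-) / right (+) of the listener
    y: float  # behind (-) / in front (+) of the listener
    z: float  # below (-) / above (+) the listener

def stereo_gains(obj: SoundObject) -> tuple[float, float]:
    """Collapse the object's azimuth into (left, right) gains
    using a standard equal-power panning law."""
    azimuth = math.atan2(obj.x, obj.y)                   # 0 = dead ahead
    pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))   # clamp to +/- 90 degrees
    angle = (pan + 1.0) * math.pi / 4                    # map to 0..pi/2
    return math.cos(angle), math.sin(angle)

# An airplane dead ahead is heard equally in both ears...
plane = SoundObject("airplane", x=0.0, y=10.0, z=2.0)
left, right = stereo_gains(plane)

# ...while one far off to the right lands almost entirely in the right ear.
plane_right = SoundObject("airplane", x=10.0, y=0.1, z=2.0)
```

A real object-based renderer tracks height and distance too, and mixes dozens of these objects at once; the point is simply that position travels with the sound instead of being baked into fixed channels.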
Dolby Atmos revolutionizes your audio experience by filling your space with immersive sound that flows around you, even overhead, with breathtaking realism and clarity. Sound comes to life with precise placement, movement, and nuance as you would experience in the real world around you.
Although the many channels of Dolby Labs' 5.1 and 7.1 technology deliver fantastic sound quality, Atmos is on another level altogether. And when you partner with a leader like Apple, well, things just get better. In my last blog I wrote about how Dolby Vision is going to let iPhone 12 users create fantastic new video content. But I didn't quite understand the role Dolby played in the audio capabilities found in the AirPods Pro earbuds, or what is entailed in delivering this experience.
AirPods Pro are not just another pair of headphones. Yes, they can pair with almost any device and play back beautiful sound. But when integrated with new devices in the Apple ecosystem, they will blow your mind. The new Apple-designed H1 chip, developed specifically for headphones, delivers performance efficiencies, faster connect times, more talk time, and the convenience of "Hey Siri". This Headphone 1 chip supports better synchronization: Apple says AirPods sync up to 2x faster when switching between active devices and provide 1.5x faster connection times for phone calls.
This means when you pick up a call using AirPods you’ll connect faster. Syncing is faster when you switch between your different Apple devices as an audio source – like when switching from your iPhone to an iPad or MacBook.
The H1 is actually what the electronics industry refers to as a System-in-Package (SiP). Today's advanced technology requires higher levels of functional integration in a single package. Since 2015, Apple has integrated several generations of SiP chips into Apple Watch. AirPods Pro mark the first time the company has chosen the same solution for headphones. It arrived as two different SiPs, one for Bluetooth and one for the audio codec. In the AirPods Pro cutaway below you can see the Apple-branded H1 SiP that controls each earbud.
More New Acronyms
In AirPods Pro, SiPs enabled the compactness of the wireless headset. Each earbud comprises several SiPs assembled together: two Inertial Measurement Units (IMUs), one Bluetooth module and one audio codec module. The IMUs are standard Land Grid Array (LGA) SiPs. An IMU is an electronic device that measures and reports a body's specific force, angular rate, and sometimes orientation, using a combination of accelerometers and gyroscopes. An LGA is a surface-mount package for integrated circuits (ICs) that allows different modules to be easily attached.
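To give a feel for what an IMU does under the hood, here's a minimal Python sketch of a complementary filter, one classic way to fuse a drifting gyroscope with a noisy accelerometer into a stable orientation estimate. This is a textbook technique, not Apple's actual firmware, and the function and parameter names are my own.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one gyroscope reading with one accelerometer reading to update
    a pitch estimate (radians).

    The gyro integrates smoothly but drifts over time; the accelerometer is
    noisy but drift-free (it always knows which way gravity points). Blending
    the two gives a stable angle.
    """
    gyro_estimate = pitch_prev + gyro_rate * dt     # integrate angular rate
    accel_estimate = math.atan2(accel_x, accel_z)   # gravity-based pitch
    return alpha * gyro_estimate + (1 - alpha) * accel_estimate

# With the head held still (zero rotation, gravity straight down along z),
# the estimate decays toward zero pitch no matter where it started.
pitch = 0.5  # radians of initial error
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_x=0.0, accel_z=1.0, dt=0.01)
```

Run at a few hundred updates per second, a filter like this is what lets a headset report "the head just turned 30 degrees right" fast enough for the audio to follow.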
So now you know why the AirPods seem a bit pricey; they are not two tiny Bluetooth speakers that someone popped into an earplug. There is a lot of tech, all working together in harmony with the device that's providing the audio source, and we haven't even talked about the speakers in them yet. But before we get into that, let's talk about how they connect to your device, and why. Inside newer Apple devices is another Apple-designed chip called the U1.
Debuting with iPhone 11, integrated into Apple Watch Series 6, and now included in the new iPhone 12 is an Ultra Wideband (UWB) chip that Apple calls the U1. The U1 enables ultra-wideband positioning, giving these devices the ability to determine each other's location when they're in close proximity. Think of it as Bluetooth, pumped up on steroids.
You may have seen it in action at the recent iPhone 12 event. When announcing the new HomePod mini, Apple demonstrated how the speaker came to life as a user brought an iPhone near it: the phone received haptic signals and its display turned on. In this UWB environment, different devices will interact with each other in different ways. As you'll see in the video below, UWB is super fast, so these interactions will feel like magic.
When Steve Jobs unveiled the first iPhone he quoted legendary computer scientist Alan Kay, who said, "People who are really serious about software should make their own hardware." And that's Apple's strategy moving forward. It's about taking complete ownership of the product.
In the video above, Linus illustrates some use cases for this Ultra Wideband technology. You've seen it working in the HomePod mini. Then there are the rumored AirTags, and the tags Apple recently announced that will work with its new App Clips technology. For older phones these App Clip tags will have NFC capabilities, but I think the newer phones will connect with these tags via the U1 chip, not standard NFC. So this is Where 2.0; you may see a trend starting to form here ...
Now that we know how the Apple-designed H1 chip in AirPods connects with the U1 chip in other devices, what does this interconnection provide, and how does all of it help deliver better sound? Before we get into that, let's quickly review which devices will interconnect to provide this spatial sound. To listen to spatial audio on AirPods Pro you'll need the headset as well as an iPhone 7 or later, or one of these iPad models:
iPad Pro 12.9‑inch (3rd generation) and later
iPad Pro 11‑inch
iPad Air (3rd generation)
iPad (6th generation) and later
iPad mini (5th generation)
You'll also need iOS 14 or iPadOS 14 or later, and properly encoded audiovisual content from a supported app. The AirPods Pro will convert 5.1-channel, 7.1-channel or Dolby Atmos digital audio signals into virtual surround. AirPods will eventually work with any video streaming app that supports multichannel audio, including Netflix, Disney Plus, Hulu, and HBO. There is one other thing you'll need; this essential piece of the puzzle is Mary-Ann's baby, the latest firmware update for your Apple AirPods. The video below tells you how to update and more.
Preparing to Launch
Having just done this update, I tried a few different tricks when I could not get it to update initially. I cleared the RAM on my iPhone by pressing volume up, then volume down, and then holding the power button until the Apple logo appeared on the screen. I also asked Siri to update my firmware; Siri replied she could not do that ... but the update happened pretty quickly afterward.
You'll want to have the AirPods connected to your device; I put a business card behind the buds to keep the case lid open. I made sure both devices were fully charged or connected to a power supply. OK, now we have all the right devices, chips, software and content. Phew. Don't worry, it was all worth it.
Here's what happens ... I just took a break from writing and watched an episode of Amazing Stories on my iPhone 11 with AirPods Pro connected. I turned on Active Noise Cancellation and put on the show. Here are some subtleties that I will try to convey as best I can, so you understand the listening experience.
The first thing I took notice of was the soundstage of the individual shots in the screenplay. I noted the physical plane of the shot, the focal length, and how wide or narrow the camera frame was. This is important because, remember, Dolby technology can place a sound object anywhere on a soundstage.
So rather than thinking of the sound objects as balls, like in the Dolby demonstration, think of them as dots of various sizes. This will help you understand my explanation of what spatial sound is. Remember, this is audio we're talking about here, not video. Your music gets the same spatial sound field: instead of two channels of audio you hear the drummer in the back, the singer up front, and the music or sounds all around you. Toes tapping, fingers snapping and drummer's brushes slapping.
Set the Soundstage
Let's say you have a long shot of a desert horizon with a car in the distance coming toward you. When it's far away it's a little dot and you barely hear it ... but as it approaches, the dot gets larger and fills the screen as the car comes toward you in the frame. But this is a basic example. Here's a bigger concept. Some reviewers have reported a phenomenon where, even though they have AirPods Pro in and the device speakers are turned off, it still seems as if sound is coming right out of the screen.
They don't mean the speakers on the iPhone; they mean the screen. Here's what is happening: think of an actor's face on the screen. Where is that actor's mouth on the screen? Put that sound object (the voice) right there on the screen. Make that dot the same size as his mouth and it gives the effect of the Gorilla Glass screen being a loudspeaker. You can see how these sound-object dots are placed in a sound field in the third-party video clip posted below in this blog post.
Want the sound to get louder or softer? Reduce or expand the dot in its place in the soundstage. This little pip means so much to Dolby that they turn your cursor into a dot on their website.
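That dot-size intuition maps neatly onto simple distance attenuation. Here's a tiny Python sketch using the common inverse-distance law, where every doubling of distance drops the level by about 6 dB. Real renderers like Atmos are far more sophisticated, but the intuition holds; the function name is just for this example.

```python
def distance_gain(distance: float, ref_distance: float = 1.0) -> float:
    """Inverse-distance attenuation: a source at the reference distance
    plays at full gain; one twice as far away is half as loud (-6 dB)."""
    return ref_distance / max(distance, ref_distance)

# The car on the desert horizon: a tiny dot, barely audible...
far = distance_gain(100.0)   # gain 0.01
# ...growing louder as it approaches and fills the frame.
near = distance_gain(2.0)    # gain 0.5
```

Shrink the dot, shrink the gain; grow the dot, and the sound swells to fill the soundstage.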
The subtleties are even more complex than that. You experience the thunder when the Hulk lands on screen, in front of you, with a huge dot that you can't see. Every object that makes a sound in the soundstage is heard independently: every step, every water droplet, every character in the scene. It's a totally new and different way to listen to and hear content.
From Tip to Chip
Most of the experience I have relayed so far takes place in the Dolby technology; it's their 5.1, 7.1 and Atmos technology that allows these dimensional sound fields. So far, all Apple is doing is allowing playback and creating chips to make the connection between the device and these super-sophisticated wireless headphones. But as that Alan Kay quote suggests, building both the hardware and the software makes for some powerful nuances you can't pull off with off-the-shelf chipsets.
Apple-designed chips and software add a lot more that truly makes the experience special. I just watched another episode and took advantage of the opportunity to test out some of the mind-bending experiences that Apple adds.
When watching with the iPhone screen directly in front of me, I hear the sound distributed just as if I were watching it normally. But if I turn my head to the right, sound fills more of the left side of my head, the side that's closest to the screen. It's as if I were still hearing the sound from the screen, even though I am listening through headphones. If I turn my head left, the opposite side fills with sound. But that's just the tip of the iceberg. When I laid the phone in my lap, the sound seemed to be coming from there. When I raised my phone above my head and tilted my head back to see it, the sound shone down on me.
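Conceptually, what I'm describing is head-tracking compensation: the renderer subtracts your head's rotation from the device's bearing, so the sound stays anchored to the screen instead of to your ears. Here's a minimal Python sketch of that one idea, reduced to yaw only; the angle conventions and the function name are assumptions for illustration, not Apple's implementation.

```python
def world_to_head_azimuth(device_azimuth_deg: float, head_yaw_deg: float) -> float:
    """Return the direction to render a sound, relative to the listener's head.

    The device sits at a fixed bearing in the room (device_azimuth_deg,
    0 = straight ahead of the listener's body, positive = to the right).
    Subtracting the head's yaw keeps the sound pinned to the device,
    then the result is wrapped into the range (-180, 180].
    """
    return ((device_azimuth_deg - head_yaw_deg + 180) % 360) - 180

# Screen dead ahead, head facing it: render the sound from straight ahead.
straight = world_to_head_azimuth(0, 0)    # 0 degrees

# Turn your head 90 degrees to the right: the screen, still fixed in the
# room, should now be rendered 90 degrees to your left.
turned = world_to_head_azimuth(0, 90)     # -90 degrees
```

The IMUs in each earbud supply the head yaw, the phone supplies its own orientation, and the H1 applies the correction quickly enough that the illusion never breaks.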
Audio in Motion
I continued trying different apps, and the web, to play content, noticing the differences. I totally get why Apple built the Apple TV+ app: it's so all content is streamed through their software to provide an optimal experience for both audio and video playback. But let me expand on the movement of the phone screen in relation to sound. While the show was playing, I held my phone out directly in front of me but with the screen facing away from me. Call me crazy, but it sounded like the audio was being projected out from my phone, away from me, in the direction the screen was facing.
I'm pondering this projected sound, and the idea of showing sound to another person. Now this is pure concept, but it's a good one. Apple enabled sharing in the latest update; this is when two sets of AirPods can listen to the same device at the same time. Say you and your significant other are watching a movie together on your iPad on the couch. They turn away from the screen to do something. To them, the audio now sounds like it's behind them. Could you place that iPad back within their field of view to alert them to a dramatic scene they don't want to miss? Essentially, use the device to point the sound at them and get their attention. See, I told you, this is heady stuff.
Another thing I noticed is that the sound is dynamic based on the image size you are viewing. Let's say I am looking at the video in full-screen landscape format, but then I use the new picture-in-picture controls to reduce the video to half-screen size in portrait mode. Now the soundstage is smaller, because its image is smaller. The sound-object dot that was the actor's mouth is now smaller, and you can hear the difference. When the image is shrunk to its smallest size the sound is just as clean and crisp as before, but its soundstage has shrunk along with the image. No, I'm not crazy.
When video is in its smallest form it would be easy to watch it "in the background" while doing other tasks on screen. I got this feeling because the sound is no longer dominating the whole screen, just a little corner of it. It's hard to explain, but it's very powerful and unique.
I also noticed, while doing loop-de-loops and barrel rolls with my smartphone, that on a change of screen orientation or an unexpected movement of the handset there was a pause in playback. Not a stutter, but a pause allowing the reorientation of the sound based on where the device and listener are, as well as whether the display was in portrait or landscape mode.
This actually impressed me rather than making me think it was buggy or glitchy. It made me realize just how much the chips in the headset and the phone are communicating with each other to provide the optimal listening experience. And this is the H1 chip, not the A14 Bionic chip; I'm sure Apple has a strategy of continued improvements planned as it builds its utopian computing ecosystem. And I'm sure those technology advances will require ongoing purchases ...
I want to switch gears here for a minute, to talk about product. But not in the way you usually think about product. I want to talk about the Apple ecosystem as a product. Stick with me, there is a good point here. When Jobs and Woz built the Apple II, Steve hired a designer to create a package for their creation, built to his specifications. This was Steve's baby, and prior to it there had never been a desktop computer system sold in retail stores as a boxed product, a complete unit. Before that, a computer came in several boxes and you usually had to assemble it yourself.
Steve saw the product as everything that went into the product. In other words the final product is the device in the box, not just the device itself. It was the hardware, the software, the brand, and the box. To this day Apple still thinks this way. And it's getting more and more obvious that the customer experience is a big part of that product now too.
Apple now thinks of every step of the customer journey as being important. From the moment the customer sees the product unveiled at one of their events, to the day they trade it in for a new one.
Every image, product name, and technical and marketing term is laid out in full before a word is spoken to the media. Yes, they have their share of leaks; those kind of are a thing now ... But it's all strategically planned and calculated, pushing their way forward to building the next great product.
Now let's turn back to that ecosystem-as-product idea. In the same way Steve thought about that first boxed product, Apple is now applying the concept to an interconnected ecosystem of devices and services that all know where each other are. They complement each other, adding value to the Apple product ecosystem.
For years reviewers have referred to this as Apple's Walled Garden. But no, it's Apple's Box ...
As you watch Apple intertwining all these disparate devices, you'll no longer view them as add-ons or accessories. They are valuable assets to the product as a whole, and that product is the Apple ecosystem. Hi, welcome, come in, sit down ... have a glass of Kool-Aid.
Setup's a Tap
Apple says AirPods Pro join the existing AirPods line in delivering an unparalleled wireless audio experience. Each model uses advanced technology to reinvent how people listen to music, make phone calls, enjoy TV shows and movies, play games and interact with Siri. The magical setup experience customers love with today’s AirPods, extends to AirPods Pro.
A custom high dynamic range amplifier produces pure, incredibly clear sound while also extending battery life, and powers a custom high-excursion, low-distortion speaker driver designed to optimize audio quality and remove background noise. The driver provides consistent, rich bass down to 20Hz and detailed mid- and high-frequency audio.
Something I have noticed is that when I have AirPods in my ears but they are not being used, they go into an idle mode and don't reconnect until needed, yet they're ready the second you need them. This is a battery-saving trick made possible by the Ultra Wideband chip interacting with the H1 chip in the headphones.
All in all, I have to say that I have listened to a lot of great stereo equipment in my time. At one point in between jobs I sold audio gear for Sound Advice in Miami. There, I had the opportunity to experience some of the finest quality audio and video gear available for home theater.
Hands down, the AirPods Pro with all their crazy circuitry and sophisticated software beat all those listening experiences. Apple will sell a lot of these without even alluding to what they have up their sleeve for the future. Marrying software and hardware together, like hand in glove.
In 2010, Steve Jobs told Kara Swisher and Walt Mossberg that Apple watches the tech landscape and cherry-picks what it thinks are budding technologies to include in its products. Steve said he wanted technology that was ascending. It sure looks like Apple found such a technology, and a partner, in Dolby, utilizing the best it has to offer in sight and sound, Dolby Vision and Dolby Atmos, with an Apple twist to make the experience even more magical.
I think we'll see an Apple 4K TV set-top box that creates this experience for the big screen. I also think the HomePod and HomePod mini may be used with this new Apple 4K TV. The larger unit could act as the subwoofer with a mini on either side, like the popular Bose 321 system. I also think we'll see this new H1 chip in the rumored but not yet released AirPods Studio over-the-ear headphones. I mean, they didn't figure all this stuff out for just a pair of earbuds ... or did they?
Apple has a strong partner in this: Dolby's technology is now baked into the Apple ecosystem. Dolby describes itself as a company that transforms the science of sight and sound into spectacular experiences. Through innovative research and engineering, it creates breakthrough experiences for billions of people worldwide through a collaborative ecosystem spanning artists, businesses, and consumers. The experiences people have with Dolby technology revolutionize entertainment and communications at the cinema, on the go, in the home, and at work.
Start Your Compilers
In May of 2020 Dolby announced Dolby.io, an API platform that further broadens the opportunities to create in Dolby for the enterprise and application development space. Dolby.io will enable businesses, developers, and content creators to enhance every interaction and every piece of content in order to deliver spectacular communications, collaboration, and audiovisual experiences in their apps and services.
And yes, Dolby is licensing its technology to other manufacturers in the mobile and computing space. Lenovo, Tidal, Amazon, and Huawei are all launching devices and services that utilize Dolby's tech, but no one is doing anything as deeply integrated as Apple. Tim Cook bet big on Dolby, and I think it's a solid partnership that will greatly reward both companies.
In hindsight, it's interesting to remember standing at a Dolby booth at CES about 10 years ago while one of their reps told me they were starting to license their IP to mobile companies. In the crowded hall, the rep held a phone up to my ear so I could hear the amazing sound coming from the device over the din of the show floor. At the time it didn't make sense to me, but it's clear as a bell now.
So bruh, do you even do spatial? The time is now to start creating 3D content for your audience.