Apple executives break down new AirPods features


Image Credit: Darrell Etherington

There was only a passing mention of AirPods during the keynote address at Apple’s event. It’s understandable – the iPhone 15 and Apple Watch Series 9 (and Ultra 2) were center stage. The earbuds also didn’t receive the same sort of hardware overhaul. According to a press release issued after the event, the biggest physical change to the AirPods Pro 2 is the (admittedly long-awaited) arrival of a USB-C charging case.

You’d be forgiven for thinking the AirPods news ended there. However, Apple’s high-end earbuds also received a meaningful software update in the form of new listening modes, which can be accessed with a few taps in iOS 17 on both versions of the AirPods Pro 2 (USB-C and Lightning).

With the earbuds connected, swipe down to pull up Control Center and then long-press the volume slider. Three mode selections pop up below: Noise Cancellation, Conversation Awareness and Spatial Audio. It’s the first two that are getting love this year.

Adaptive Audio has been added to the noise control options, alongside the standard Noise Cancellation, Transparency and Off. Tapping the new option highlights it with a rainbow background. The feature moves seamlessly between settings in real time. It’s an attempt to bring both ends of the spectrum into a single setting, so you can walk down a crowded street with situational awareness while not getting the full impact of a garbage truck’s noise as it passes.
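To make the idea concrete, here is a minimal sketch of what a blended mode can look like: a measured ambient level mapped onto a continuum between transparency and cancellation. This illustrates the general technique, not Apple’s implementation; the decibel thresholds are invented.

```swift
import Foundation

// Conceptual sketch only: map measured ambient noise onto a blend between
// transparency (0.0) and full noise cancellation (1.0). The dB thresholds
// are assumptions for illustration, not Apple's values.
struct AdaptiveAudioModel {
    let quietFloor: Double = 40    // below this, prefer full transparency
    let noisyCeiling: Double = 80  // above this, prefer full cancellation

    func cancellationStrength(ambientDB: Double) -> Double {
        let t = (ambientDB - quietFloor) / (noisyCeiling - quietFloor)
        return min(max(t, 0), 1)   // clamp to the 0...1 blend range
    }
}

let blend = AdaptiveAudioModel()
print(blend.cancellationStrength(ambientDB: 55))  // 0.375: mostly transparent
print(blend.cancellationStrength(ambientDB: 85))  // 1.0: garbage truck passing
```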

Image Credit: Apple

Although named similarly to last year’s Adaptive Transparency feature, Adaptive Audio offers a full spectrum of modes, in which both transparency and noise cancellation play a role.

“Adaptive Transparency, which we announced last year, has to respond really quickly,” Eric Treski, director of product marketing, said in a conversation with TechCrunch. “That happens 40,000 times per second, and it’s not just listening, it’s reducing as well. To reduce sound that quickly, it has to happen in real time. Adaptive Audio works a bit more slowly, over the course of a few seconds, because it’s a more methodical process to figure out what you’re hearing. When we move between noise cancellation and transparency, it’s intentionally a lot slower, to make it less jarring and more comfortable.”

The system also takes into account whether the content you’re listening to is music or a podcast, which is determined based on tagging from apps like Apple Music. A microphone also measures the sound inside your ear to get an accurate picture of what you’re actually experiencing. “Because if you just measure the sound that you think you’re playing into someone’s ears, it depends on how they’re wearing it and other factors, so it may be less accurate,” explains Ron Huang, VP of Sensing and Connectivity.
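Apple hasn’t detailed the tagging mechanism, but the closest public analogue is AVAudioSession, where an app can mark its stream as spoken-word rather than music. A minimal sketch of how a podcast app supplies that kind of hint; whether AirPods consume exactly this signal is an assumption, though the API itself is standard:

```swift
import AVFoundation

// The .spokenAudio mode tells the system the stream is speech (podcasts,
// audiobooks) rather than music -- the kind of content distinction
// described above.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback, mode: .spokenAudio)
    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}
```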

Huang told TechCrunch that the company considered leveraging your device’s GPS to determine sound levels based on location. However, in real-world testing, the method proved unreliable.

“During the early exploration of Adaptive Audio, we basically put you in ANC versus transparency based on where you are,” Huang says. “You can imagine the phone could signal the AirPods and say, ‘Hey, you’re home’ and so on. That is one way to do it, but after all our learnings, we didn’t think it was the right way to do it, and we didn’t do it. Your home isn’t always quiet, and the streets aren’t always noisy. We decided that, instead of relying on a location signal from the phone, the AirPods monitor your environment in real time and make those decisions intelligently.”
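The design choice reads roughly like this in code: a decision driven by a live measurement rather than a location rule. Everything here, the meter and the thresholds, is a hypothetical stand-in for the AirPods’ real-time monitoring.

```swift
import Foundation

enum ListeningMode { case transparency, adaptive, noiseCancellation }

// Hypothetical stand-in for the AirPods' real-time microphone monitoring.
protocol AmbientMeter {
    func currentLevel() -> Double  // latest ambient estimate, dB SPL
}

// Decide from what the wearer actually hears, not from where they are:
// a quiet street still gets transparency, a noisy home still gets ANC.
func selectMode(using meter: AmbientMeter) -> ListeningMode {
    switch meter.currentLevel() {
    case ..<45:   return .transparency
    case 45..<70: return .adaptive
    default:      return .noiseCancellation
    }
}
```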

Image Credit: Darrell Etherington

Personalized Volume is also a big part of the Adaptive Audio experience. According to Apple, the system uses machine learning, combining environmental data with personal preferences to build a picture of listening habits, so it can “understand environmental conditions and listening preferences over time to automatically fine-tune the media experience.” Many different metrics feed into it.

“We did a lot of research, listening to different content by different users, to really understand different listening preferences and what is distracting and offensive from a noise standpoint, to keep your content really clear. It took thousands of hours of different data, with varying background noise,” says Treski. “We also remember your personal preferences: for a certain type of environment, how much noise there is, how loud you typically listen to your content, we remember that for you. We add this to our machine learning models to make it even better for you.”
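That “remember your preferences” behavior maps onto a familiar technique: keep a running average of the volume a user settles on, keyed by how noisy the environment is. A hedged sketch, with the bucketing and learning rate invented for illustration:

```swift
import Foundation

// Illustrative only: an exponential moving average of the user's chosen
// volume, bucketed by ambient noise level. Not Apple's model.
struct PersonalizedVolume {
    private var preferred: [Int: Double] = [:]  // noise bucket -> volume 0...1
    private let alpha = 0.1                     // assumed learning rate

    private func bucket(_ ambientDB: Double) -> Int {
        Int(ambientDB / 10)  // 10 dB-wide buckets
    }

    // Record the volume the user manually settled on in this environment.
    mutating func observe(volume: Double, ambientDB: Double) {
        let b = bucket(ambientDB)
        let old = preferred[b] ?? volume
        preferred[b] = old + alpha * (volume - old)
    }

    // Suggest a starting volume the next time this environment recurs.
    func suggestion(ambientDB: Double, fallback: Double = 0.5) -> Double {
        preferred[bucket(ambientDB)] ?? fallback
    }
}
```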

The other big mode introduced via iOS 17 is Conversation Awareness, which lowers the volume of a track when you start speaking. External sounds won’t trigger the effect, only the wearer’s voice. Apple accomplishes this without an on-board voice profile; instead, it takes advantage of multiple on-device sensors. The feature triggers when the microphone detects a voice and the accelerometer detects jaw movement at the same time. How long the volume stays lowered depends on various factors. I was impressed by the feature’s ability to ignore things like throat clearing and yawning.
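The fusion logic is simple to express: treat it as the wearer speaking only when both signals agree. A toy sketch, with the two inputs assumed to come from the earbuds’ voice-activity detection and motion sensing:

```swift
// Toy sketch of the sensor fusion described above. A bystander's voice
// trips the microphone check but not the accelerometer; a yawn or throat
// clearing may move the jaw without registering as voice.
func wearerIsSpeaking(voiceDetected: Bool, jawMoving: Bool) -> Bool {
    voiceDetected && jawMoving
}

print(wearerIsSpeaking(voiceDetected: true, jawMoving: false))  // false: nearby talker
print(wearerIsSpeaking(voiceDetected: true, jawMoving: true))   // true: duck the music
```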

The team also tackled another long-standing earbud bugbear: device switching. That five-second gap between picking up a call and hearing it on your earbuds can feel like forever. You do need to be inside the Apple ecosystem to take advantage of the new switching speeds, however.

Image Credit: Apple

“With this new software update, the connection time of AirPods to our devices is much faster,” says Huang. “It comes from all the different methods we use to discover nearby devices. It’s really important for us to know what the iPhone is doing, what the iPad is doing, what the Mac is doing. A phone call is more important than music, so when you’re answering a conference call on your Mac, for example, we make sure to move the route away from the iPhone’s music and over to the Mac.”
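In effect it is a priority scheme: rank what each nearby device is doing and follow the highest-ranked activity. The activities and their ordering below are assumptions for the sake of the sketch:

```swift
import Foundation

// Assumed activity ranking: calls outrank conferences, which outrank music.
enum Activity: Int, Comparable {
    case idle = 0, mediaPlayback, videoConference, phoneCall
    static func < (a: Activity, b: Activity) -> Bool { a.rawValue < b.rawValue }
}

struct Device { let name: String; let activity: Activity }

// Route audio to the device whose current activity matters most.
func routeTarget(_ devices: [Device]) -> Device? {
    devices.max { $0.activity < $1.activity }
}

let target = routeTarget([
    Device(name: "iPhone", activity: .mediaPlayback),
    Device(name: "Mac", activity: .videoConference),
])
print(target?.name ?? "none")  // "Mac": the conference call outranks music
```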

The last big part of the AirPods announcement is Vision Pro connectivity. For the full audio experience, those using Apple’s upcoming spatial-computing headset should bring along the new USB-C AirPods Pro for ultra-low-latency lossless audio.

“Bluetooth typically runs at 2.4GHz, and that airspace is very, very noisy,” Huang says. “Everyone is running on 2.4. That’s why Wi-Fi routers, for example, are typically dual-band, if not tri-band, because the 5GHz spectrum is so much cleaner. To get really low-latency audio, and to get really high-fidelity, lossless audio, it’s all about a very, very clean and real-time channel between the two. The combination of 5GHz and the fact that the two are so close together allowed us to do this. We were able to basically design a brand-new audio protocol at 5GHz for AirPods.”


