The power of AI to propel audio and sound-based technology

Lila Snyder, chief executive officer of Bose, gives a keynote during the Institute for Experiential Artificial Intelligence event held on the 17th floor of East Village. Photo by Matthew Modoono/Northeastern University

Companies are getting creative in countless ways as they anticipate a world influenced by, and operating in, artificial intelligence.

One such company, Bose Corporation, is leveraging the accelerating momentum of artificial intelligence to reimagine the future of audio and sound-based technology, the company’s CEO, Lila Snyder, told an audience at Northeastern’s Boston campus during an AI-focused forum on Wednesday. 

“We believe at Bose that sound is an incredibly powerful force,” said Snyder, who delivered the keynote address at the gathering. “It connects you to the things you love most.”

The inaugural event, hosted by the Institute for Experiential AI, brought together industry stakeholders, prospective job-seekers, university leaders, and academic researchers for a conversation about how to make AI work in the real world. 

Bose, the Framingham, Massachusetts-based company famous for its popular noise-canceling headphones, is harnessing AI to improve users’ experience of its products. It’s part of a transition the sound industry is on the cusp of: a third step toward an AI-based future, after the industry first moved from analog sound to the digital sound processing of the present, Snyder said.

“We’re at this critical moment now, where we’re going to pivot from digital sound processing to AI,” Snyder said. 

“We want to make sure we’re at the forefront of that movement, and the thing that stands between us and getting that done is talent,” she added. “We can’t reach our aspirations without talent.”

One application of AI is using a breadth of data to better understand Bose’s supply chain, which Snyder describes as “operational” AI. Another way the company is leveraging AI is through analysis of customer data. Some of the company’s newer software-based technologies, such as ActiveSense, are being deployed to automatically pick up cues from the real world and provide a seamless, synced experience for headphone users.

That’s called “contextual awareness.” Bose has been developing headphones, Snyder said, that can filter out external noise while users move between environments without having to manually adjust the volume on their devices. Instead of thinking about noise-cancellation as an on-or-off switch, Snyder said the company is moving towards a model where customers can “hear transparently what’s going on in the world around you while you’re listening.”

Part of the challenge has been trying to figure out the degree to which environmental sound is desirable (and necessary) for users who are moving about their day, listening to music or a podcast, for example. 

“Maybe it’s an individual’s voice; maybe it’s a baby crying; maybe it’s the doorbell you want to hear,” she said. “Being able to articulate which sounds you want to hear versus those that you don’t, and using data to identify them, we think is an incredibly important part of how you want to experience audio devices going forward.”

But getting that balance wrong can mean losing customers, Snyder said, which is why AI-driven research is so important.

“We think the future of noise cancellation is hearing what you want to hear,” she said. “So typically you don’t want to hear everything or nothing. There are things that you want to hear and things you don’t want to hear. The power of AI and data is that we can start to discern the difference between the two.”

For media inquiries, please contact media@northeastern.edu