Shouldn’t a Company Understand Your Voice Like Siri and Cortana?

October 13, 2015
Hollis Chin, Vice President of Corporate Marketing

When you call a company with a question, its Interactive Voice Response (IVR) system often struggles to understand you, particularly if you’re in a noisy environment or you speak with a “non-standard” dialect or accent. That’s a different experience from interacting with personal assistants like Apple’s Siri and Microsoft’s Cortana. So what’s the shortfall, and what’s being done about it?

The issue is that companies’ IVR systems have had mixed success with speech recognition. But that’s about to change. Recently, [24]7.ai took a groundbreaking step to make an IVR’s “hearing” more accurate: it became the first company to integrate Microsoft’s Deep Neural Network (DNN) technology into its platform to supercharge IVRs.

DNN boosts the accuracy of what an IVR “hears” to at least 95 percent. That improvement benefits all languages, not just English, and allows companies to move more calls to self-service. That’s a big win for companies, because research shows that customers prefer to self-serve.

How does it work? A DNN loosely emulates the way the human brain processes the spoken word, passing acoustic input through layers of artificial neurons that learn progressively higher-level speech patterns. Microsoft’s DNN technology draws on analyses of more than 10 billion speech utterances from Bing search, Xbox, and Windows Phone, and applies that learning to enterprise self-service IVR interactions.
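For the technically curious, here is a minimal sketch (in Python with NumPy) of the core idea: a feed-forward network turns one frame of acoustic features into a probability distribution over phonemes. The feature type, layer sizes, and phoneme count below are illustrative assumptions for this sketch, not details of Microsoft’s or [24]7.ai’s actual models, which are vastly larger and trained on real speech data.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical dimensions: 39 acoustic features (e.g., MFCCs) in,
# two hidden layers, 40 phoneme classes out. Weights here are random;
# a real model learns them from billions of utterances.
layer_sizes = [39, 256, 256, 40]
weights = [rng.standard_normal((m, n)) * 0.01
           for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def predict_phonemes(frame):
    """Forward pass: one acoustic feature frame -> phoneme probabilities."""
    h = frame
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)  # hidden layers extract higher-level speech patterns
    return softmax(h @ weights[-1] + biases[-1])  # score each phoneme

# One 39-dimensional feature frame covering ~10 ms of audio
# (random stand-in here, just to exercise the forward pass).
probs = predict_phonemes(rng.standard_normal(39))
print(probs.argmax(), probs.max())
```

In a production recognizer, a sequence of these per-frame phoneme distributions is then decoded into words; the deep, layered structure is what lets the model stay accurate across noise, dialects, and accents.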

So when you call a DNN-powered IVR, you’ll be understood better than ever, making it easier to complete your task without repeating yourself. The IVR can hear you now.

Learn more about how integrating DNN will help [24]7.ai improve what IVRs hear.

