Microsoft has acquired AI startup Semantic Machines to bolster its efforts in “conversational AI,” a move that could enhance voice assistant Cortana’s language comprehension.
“For rich and effective communication, intelligent assistants need to be able to have a natural dialogue instead of just responding to commands,” David Ku, Microsoft Corporate Vice President and Chief Technology Officer of AI & Research, said in a blog post.
Semantic Machines’ team includes a number of AI veterans: Larry Gillick, formerly chief speech scientist for Apple’s Siri; UC Berkeley professor Dan Klein; and Stanford University professor Percy Liang. Others are former staff of Nuance, the voice recognition company that once powered Siri.
Microsoft plans to open a new AI research center in Berkeley, where Semantic Machines is based, “to push forward the boundaries of what is possible in language interfaces,” Ku said.
Microsoft pointed out a key shortcoming of today’s voice assistants: they understand commands, not conversations. “Most of today’s bots and intelligent assistants respond to simple commands and queries, such as giving a weather report, playing a song or sharing a reminder, but aren’t able to understand meaning or carry on conversations.”
Semantic Machines’ technology could help Cortana carry on a more natural dialogue, Microsoft said. The aim is to make chatbots sound more human and less like machines.
By focusing on memory, Semantic Machines says it can produce an AI that predicts and answers questions more accurately, something that today’s voice assistants such as Siri, Alexa and Cortana struggle to accomplish.
Semantic Machines’ technology could be applied not only to Cortana, but also to Microsoft Cognitive Services, the Azure Bot Service and XiaoIce.
The Cognitive Services framework revolves around the development of bots and the integration of speech recognition and natural language understanding into intelligent assistants. There are now over one million developers using Microsoft Cognitive Services and over 300,000 developers using the Azure Bot Service.
Microsoft says it was the first company to add full-duplex voice sense to a conversational AI system, allowing users to converse naturally with XiaoIce and Cortana. XiaoIce has held over 30 billion conversations, averaging up to 30 minutes each, with 200 million users across the U.S., China, India, Japan and Indonesia.
Microsoft’s acquisition comes not long after reports that it is working with Taiwan-based Quanta Computer to develop its own smart speaker. The deal with Semantic Machines may give Microsoft an edge as it competes with Amazon and Google, which dominate the smart speaker market.