Let's talk Microsoft, neural networks and natural language processing for AI

Microsoft logo (Image credit: Windows Central)

Microsoft, Google, Apple, Samsung, and Facebook are investing heavily in AI, the branch of computer science that seeks to create intelligent machines that behave, work and react like humans. But humans don't even act like "humans" if they're not taught.

Thus, teaching and learning are important to the appropriate development of human beings and intelligent computer systems.

Still, without the appropriate neurological structure as a foundation for learning, humans cannot learn what they're taught. The same is true of computer systems. Additionally, our goal of interacting with computer systems via language is impossible if they're not first taught how to understand us. Neural networks and natural language processing, therefore, are fundamental to the development of AI systems.

Artificial neural networks

Since the goal of AI is to mimic human capabilities, its "brains" are patterned after biological systems. Artificial neural networks (ANNs) are inspired by the biological neural networks of animal brains. They consist of collections of artificial neurons that pass signals to one another to process data. These systems, like biological brains, are capable of learning, improving their performance on various tasks over time.

Neural networks mimic biological systems.
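To make the idea concrete, here is a minimal sketch of a single artificial neuron in Python, using only the standard library. The weights, learning rate and AND-gate training data are assumptions chosen for this illustration; production systems like Cortana use vastly larger networks and far more data.

```python
# A minimal sketch of one artificial neuron (a perceptron). The weights, learning
# rate, and AND-gate training data below are illustrative assumptions only.

def step(x):
    """Threshold activation: the neuron 'fires' (outputs 1) if its input is positive."""
    return 1 if x > 0 else 0

def predict(weights, bias, inputs):
    """Weighted sum of the inputs plus a bias, passed through the activation."""
    total = sum(w * i for w, i in zip(weights, inputs))
    return step(total + bias)

def train(samples, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge the weights whenever the neuron is wrong."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - predict(weights, bias, inputs)
            weights = [w + lr * error * i for w, i in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Teach the neuron the logical AND function from labeled examples.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(and_samples)
for inputs, target in and_samples:
    print(f"{inputs} -> {predict(weights, bias, inputs)} (expected {target})")
```

The "learning" here is simply adjusting weights in response to mistakes, which is the same principle, scaled up enormously, that lets large neural networks improve at speech and image recognition.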

Neural networks have been applied to areas such as speech and image recognition, drug discovery and toxicology, digital assistants and natural language processing. Microsoft's Cortana and Google's digital assistants were built on neural networks long before Apple's Siri switched to the superior tech in 2014.

Additionally, Microsoft's years of investments in machine learning gave its more recent, neural network-based Cortana certain advantages over Siri. Apple's later neural network investments improved Siri to a degree that even impressed Eddy Cue, Apple's internet services boss:

This was one of those things where the jump was so significant that you do the test again to make sure that somebody didn't drop a decimal place.

It's also interesting that this year Apple announced that the new iPhone X incorporates a neural engine, a set of dedicated processing cores within its A11 Bionic chip. This engine handles the underlying machine learning algorithms that power AI functions like Animoji, Face ID and AR. Huawei and Google have also invested in on-device AI, which has advantages over cloud-supported AI because the latter requires an internet connection and is generally less secure. Qualcomm, a company whose technology powers most of the mobile industry, will be providing the industry with mobile AI chips as well.

Natural language processing

Microsoft's Larry Heck talks about natural language understanding.

Talking to our gadgets is beneficial only if they can understand what we say (speech recognition). A system only has to "call Bob" a few times after we tell it to "call Mom" before we give up in frustration. They must also understand our intent. This is where natural language processing (understanding) comes in. It's the field of computer science focused on the interactions between computers and human language. Computer scientists program computers to process the vast scope of natural language.
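To show what "understanding intent" means in practice, here is a toy sketch of keyword-based intent detection in Python. The intents, training phrases and function names are made up for this example; real assistants such as Cortana rely on large statistical models trained on enormous amounts of voice data.

```python
import re

# Hypothetical labeled examples mapping phrases to intents (invented for this sketch).
TRAINING = [
    ("call mom on her mobile", "make_call"),
    ("please call bob at work", "make_call"),
    ("what is the weather like today", "get_weather"),
    ("will it rain tomorrow", "get_weather"),
    ("set an alarm for seven am", "set_alarm"),
    ("wake me up at six thirty", "set_alarm"),
]

def tokenize(text):
    """Lowercase the text and keep only word characters -- a crude stand-in for real tokenization."""
    return re.findall(r"[a-z']+", text.lower())

def build_vocab(training):
    """Collect the set of words seen with each intent."""
    vocab = {}
    for phrase, intent in training:
        vocab.setdefault(intent, set()).update(tokenize(phrase))
    return vocab

def detect_intent(phrase, vocab):
    """Pick the intent whose known vocabulary overlaps the spoken phrase the most."""
    words = tokenize(phrase)
    return max(vocab, key=lambda intent: sum(w in vocab[intent] for w in words))

vocab = build_vocab(TRAINING)
print(detect_intent("Call Mom", vocab))                  # make_call
print(detect_intent("Will it rain on Tuesday?", vocab))  # get_weather
```

A keyword matcher like this breaks down quickly on unfamiliar phrasing, which is exactly why the large, data-hungry models described next matter so much.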

An AI system's exposure to verbal data is vital to the development of its natural language abilities. Thus, the more verbal interactions we have with systems, the more data they have to work with to improve their skills.

Larry Heck of Microsoft Research and former Bing general manager Mike Calcagno explain how Microsoft services including Skype, Bing, Cortana and more (shown in the above video) acquire language data to improve their natural language abilities. Google's voice search and the hundreds of millions of smartphones equipped with its digital assistant also collect vast amounts of voice data that improves natural language processing.

Voice-dedicated systems like Google Home, Amazon's Echo, the less popular Cortana-equipped Harman Kardon smart speaker and the yet-to-be-released Apple HomePod are also a means to collect voice data to improve AI systems.

AI will understand us better in time

We're a long way from AI systems capable of thinking like us and understanding us in the ways science fiction portrays. Still, the proliferation of always-connected smartphones and the birth of digital assistants have helped propel the advancement of neural network technology and natural language understanding.

AI systems are already capable of understanding us in ways they could not just a few years ago. As ambient computing becomes more common, our interactions with the devices around us will become more natural. Those systems, in turn, will become more capable as we provide more data from a variety of human-computer interaction scenarios.

Microsoft's AI vision

Will Microsoft's years of investments in machine learning, natural language processing and AI, along with its cross-platform strategy for Cortana, help it maintain a leadership position? Or will rivals' smartphone and smart speaker advantages position them for an eventual leap forward that overshadows Microsoft, as millions of consumers engage AI through consumer-facing products, which are increasingly the driving force in personal computing?
