Microsoft's inclusive design mission guides the company to ensure its products and services are designed, from conception forward, with users of all ability levels in mind. Though the company has made admirable progress in this regard, the work remains unfinished.
Microsoft's Seeing AI app helps people who are blind, Project Fizzyo supports children with cystic fibrosis, the Emma Watch and Project Emma aid people with Parkinson's disease, and Microsoft's Immersive Reader helps children with dyslexia. Still, millions of people with varying levels of ability are either excluded from interacting with the technologies of modern society or are prevented by physical limitations from fully participating in everyday tasks.
Microsoft has embraced the challenge of creating targeted solutions, like the tremor-calming Emma Watch, which addresses a particular aspect of a disability. It has also built solutions that level the playing field directly into its platforms, like eye control in Windows, which lets people with limited mobility navigate the OS. Given that integrated solution for people with paraplegia or quadriplegia, a similar OS-level solution that enables Windows or Cortana to understand sign language, for the 466 million people with disabling hearing loss, seems like a natural goal for Microsoft in a world where "speaking to AI is becoming the norm." And given that a developer "modified Alexa" to do just that, we know it's also possible.
If Alexa can do it, so can Cortana and Windows
Developer Abhishek Singh created a web application that uses a camera to read sign language, translates the signs into speech, and plays that speech aloud so that Alexa, via Amazon's Echo, can hear it. The system then displays Alexa's spoken response as text the user can read.
Using the TensorFlow machine-learning platform, Singh trained an A.I. to understand American Sign Language and used Google's text-to-speech to convert the recognized signs into spoken words. Singh said, "The project was…inspired by observing a trend among companies of pushing voice-based assistants as a way to create instant, seamless interactions."
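To make the idea concrete, here is a minimal sketch of one step such a system needs: turning a stream of per-frame sign predictions into text before handing it to text-to-speech. Everything here is a hypothetical illustration, not Singh's actual code — it assumes a classifier (not shown) that labels each video frame with one of 27 classes, 0–25 for the ASL letters A–Z and 26 for "no sign / transition," and collapses the frame stream greedily, the way a held sign spans many frames.

```python
# Hypothetical decoding step for a sign-to-speech pipeline.
# Assumption: an upstream per-frame classifier (e.g. a TensorFlow model,
# not shown here) emits class indices 0..25 for ASL letters A-Z and
# 26 for "no sign / transition" frames.

ASL_LETTERS = [chr(ord("A") + i) for i in range(26)]
BLANK = 26  # "no sign" class separating letters

def decode_frames(frame_classes):
    """Collapse a per-frame class stream into spelled-out text.

    Consecutive duplicates are merged (a held sign spans many frames)
    and blank frames mark the transition between letters, similar to
    greedy CTC decoding.
    """
    letters = []
    prev = BLANK
    for c in frame_classes:
        if c != prev and c != BLANK:
            letters.append(ASL_LETTERS[c])
        prev = c
    return "".join(letters)

# A held "H" (class 7), transition frames, then a held "I" (class 8):
frames = [7, 7, 7, 26, 26, 8, 8]
print(decode_frames(frames))  # -> "HI"
```

The decoded string would then be passed to a text-to-speech engine so the smart speaker can "hear" it — the same hand-off Singh's demo performs with Google's text-to-speech and an Echo.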
Given Microsoft's A.I. and machine-learning investments, its Cognitive Services that recognize human faces, activities, speech and more, and the role of the camera in Windows PCs for biometrics, Microsoft has the end-to-end resources to create a system that can communicate with users who sign.
Inclusion is what Microsoft is about
Most companies have some degree of dedication to inclusive design. Microsoft is not unique in that regard.
Microsoft is unique, however, in that its CEO, Satya Nadella, is personally driven toward inclusion goals by his experience raising two children with disabilities, including his son Zain, who has severe cerebral palsy. This has led Nadella to promote a pervasive empathy mission throughout Microsoft that imbues its inclusive design efforts with a depth of sincerity, a level of detail, and a breadth of scope that make it stand out in the industry.
Windows now has gaze technology as the result of a hackathon project that enabled a former NFL great with amyotrophic lateral sclerosis (ALS) to play with his son despite using a wheelchair. Immersive Reader is integrated throughout Microsoft's cross-platform products, like OneNote, enabling school systems and parents to support children with dyslexia and other disabilities. Sign language recognition built into Windows and Cortana would be a systemic component of the platform that fits naturally with Microsoft's other efforts.
Including sign language recognition is a must
If Microsoft's products are meant to be used by everyone, then sign language recognition must be part of the equation. Though the company's digital assistant and smart speaker efforts have paled in comparison to the success its rivals are enjoying, Microsoft still has a case for making this move.
At Build 2018, Microsoft demonstrated Cortana navigating a meeting: transcribing the conversation, responding to participants, and providing text that a deaf or hard-of-hearing participant could read. A natural progression for such a scenario would be adding sign language recognition for people who are deaf and do not speak, or for anyone else who relies on sign language.
Microsoft's partnership with Amazon to bring Cortana and Alexa together shows that it is serious about keeping its ambient computing efforts visible in the consumer space. Perhaps bringing sign language recognition to Windows and Cortana could be boosted by similar or joint efforts integrated into Alexa. However it pans out, it just needs to happen. I hope Microsoft agrees.