Microsoft's inclusive design mission guides the company to ensure its products and services are designed, from conception onward, with users of all ability levels in mind. Though the company has made admirable progress in this regard, the work is still unfinished.
Microsoft's Seeing AI app helps people with blindness, Project Fizzyo supports children with cystic fibrosis, the Emma Watch and Project Emma aid people with Parkinson's disease, and Microsoft's Immersive Reader helps children with dyslexia. There are millions of people with varying levels of ability who are either excluded from interacting with the technologies of modern society or whose physical limitations prevent their full participation in everyday tasks.
Microsoft has embraced the challenge of creating specific solutions, like the tremor-halting Emma Watch, which targets a particular aspect of a disability. It has also built solutions that level the playing field into its technologies, like gaze control in Windows, which enables people with limited mobility to navigate the OS. Given this integrated solution for people with para- or quadriplegia, a similar OS-level solution that enables Windows or Cortana to understand sign language seems like a natural goal for Microsoft, considering the 466 million people with disabling hearing loss and a world where "speaking to AI is becoming the norm." And given that a developer "modified Alexa" to do just that, we know it's also possible.
If Alexa can do it, so can Cortana/Windows
Developer Abhishek Singh created a web application that uses a camera to recognize sign language, translates the signs into spoken words, and plays them aloud so that Alexa, via an Amazon Echo, can hear them. Alexa's spoken response is then transcribed into text that the user can read.
Using the machine learning platform TensorFlow, Singh trained an A.I. to understand American Sign Language and used Google's text-to-speech to translate the signs into spoken words. Singh said, "The project was…inspired by observing a trend among companies of pushing voice-based assistants as a way to create instant, seamless interactions."
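The core of such a pipeline is recognizing each sign from camera input and stitching the labels into a phrase for a text-to-speech engine. Here is a minimal, illustrative sketch of that flow: a toy nearest-neighbor classifier over hand-feature vectors stands in for Singh's trained TensorFlow model, and the TTS hand-off is left as a final string. All names and values (`SIGN_PROTOTYPES`, `classify_sign`, the feature vectors) are assumptions for illustration, not part of the actual project.

```python
import math

# Hypothetical per-sign feature prototypes (e.g., flattened hand-landmark
# coordinates). In the real project, a trained TensorFlow model replaces
# this lookup table.
SIGN_PROTOTYPES = {
    "HELLO": [0.9, 0.1, 0.8],
    "WEATHER": [0.2, 0.7, 0.3],
    "TODAY": [0.5, 0.5, 0.9],
}

def classify_sign(features):
    """Return the sign label whose prototype is nearest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGN_PROTOTYPES, key=lambda s: dist(SIGN_PROTOTYPES[s], features))

def signs_to_utterance(frames):
    """Classify each frame's features and join the labels into a phrase
    that a text-to-speech engine could then speak to the assistant."""
    return " ".join(classify_sign(f) for f in frames).lower()

# Simulated feature vectors from three webcam frames:
frames = [[0.88, 0.12, 0.79], [0.21, 0.68, 0.31], [0.52, 0.48, 0.91]]
print(signs_to_utterance(frames))  # prints "hello weather today"
```

The point of the sketch is the shape of the system, not the model: swap the lookup table for a real classifier and the final string for a TTS call, and you have the same camera-to-speech loop Singh demonstrated.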
Given Microsoft's A.I. and machine learning investments, its Cognitive Services that recognize human faces, activities, speech, and more, and the camera's role in Windows PCs for biometrics, Microsoft has the end-to-end resources to create a system that can communicate with users who sign.
Inclusion is what Microsoft is about
Most companies have some degree of dedication to inclusive design. Microsoft is not unique in that regard.
Microsoft is unique, however, in that its CEO, Satya Nadella, is personally driven toward inclusion goals by his experience raising two children with disabilities, including his son Zain, who has severe cerebral palsy. This has led Nadella to promote a pervasive empathy mission throughout Microsoft that imbues its inclusive design efforts with a depth of sincerity, a level of detail, and a breadth of scope that make them stand out in the industry.
Windows now has gaze technology as a result of a hackathon effort that enabled a former NFL great with amyotrophic lateral sclerosis (ALS) to play with his son despite using a wheelchair. Immersive Reader is integrated throughout Microsoft's cross-platform products like OneNote, enabling school systems and parents to support children with dyslexia and other disabilities. Sign language recognition built into Windows/Cortana would be a systemic component of the platform that fits naturally with Microsoft's other efforts.
Including sign language recognition is a must
If Microsoft's products are meant to be used by everyone, then sign language recognition must be part of the equation. Though the company's digital assistant and smart speaker efforts have paled in comparison to the success its rivals are enjoying, Microsoft still has a case for making this move.
At Build 2018, Microsoft demonstrated Cortana navigating a meeting, transcribing the conversation, responding to participants, and providing text that a deaf or hard-of-hearing participant could read. A natural progression of that scenario would be adding sign language recognition for deaf users and others who rely on sign language and do not speak.
Microsoft's partnership with Amazon, which brings Cortana and Alexa together, shows that it is serious about keeping its ambient computing efforts visible in the consumer space. Perhaps bringing sign language recognition to Windows and Cortana could be boosted by similar or joint efforts integrated into Alexa. However it pans out, it just needs to happen. I hope Microsoft agrees.