
Microsoft rejected sales of facial recognition tech over human rights worries

Microsoft Logo at Ignite (Image credit: Windows Central)

Microsoft President Brad Smith revealed this week that the company recently turned down a California law enforcement agency's bid to equip officers with its facial recognition tech. Speaking at a Stanford University AI conference, as reported by Reuters, Smith said the primary concern for Microsoft was the potential impact on human rights.

According to Reuters, Microsoft decided that equipping officers' body cameras and cars with the technology would lead to minorities and women being "disproportionately held for questioning." That risk stems largely from a flaw in the AI behind facial recognition tech, which is trained disproportionately on pictures of white and male subjects, making it less accurate for other groups.

"Anytime they pulled anyone over, they wanted to run a face scan," Smith said at the conference. While he didn't name the agency involved, Smith said Microsoft decided against providing the technology after weighing its impact.

Smith said that Microsoft also declined to provide facial recognition technology for cameras in the capital city of an unnamed country, Reuters reports. However, the company did allow the tech to be used in a prison after deciding its use would be sufficiently limited and would improve safety.

The human rights impact of facial recognition is an issue Microsoft has called attention to before, urging governments to regulate the technology. "Facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression," Smith has previously said on the subject.

Dan Thorp-Lancaster is the Editor in Chief for Windows Central. He began working with Windows Central as a news writer in 2014 and is obsessed with tech of all sorts. You can follow Dan on Twitter @DthorpL and Instagram @heyitsdtl. Got a hot tip? Send it to daniel.thorp-lancaster@futurenet.com.

13 Comments
  • This is good. The least organizations can do is be accountable about the use of their technology. This is why protests about the use of technology work - companies want to be accountable to their users and employees, not just formal shareholders.
  • So it's OK to sell stuff to train the military to kill more efficiently overseas but not ok to sell stuff to the police to help identify people in the homeland? Trying to wrap my head around how Nadella can justify one but not the other.
  • To help lower crime rates is to target minorities and women? Maybe these groups should look at themselves and commit less crime, or stop asking for equality when there is no equality in the dispensing of punishment for women committing the same crime as a man.
    One group, "minorities," needs to follow the law and not cry racism when caught breaking our laws, while the other group, women, needs to accept some agency in their lives.
  • Holy nonsense batman! Read the article before commenting next time?
  • Yeah, ridiculous.
    It's OK to sell software to the military to kill people but don't you dare infringe someone's rights not to get caught for being a criminal.
    The world gone mad, this is PC BS!
    What about the rights of the individuals harmed by these turds that don't want to be recognised when they raped, pillaged and plundered the innocent?
    F ark political correctness.
  • Calm down. Most people you're criticizing for being "politically correct" (which has just become the thing to blame for everything) were also opposed to HoloLens being used for the military. The double standard here is Microsoft trying to look good by turning this one down. Political correctness has nothing to do with personal rights (that criminals don't necessarily deserve, I agree). In fact, most people who subscribe to some form of political correctness (American liberals and leftists) are often not very uptight about "muh personal freedoms" unlike the libertarian-right "don't tread on me" variety.
  • No actually, it's not ridiculous or double standards or PC BS when you think about it logically. In this instance you have carte blanche application of a technology with no limits versus an application specifically to reduce casualties as well as reducing harm to innocents. The AI has flaws that are quite clearly listed in the article, which adds to the uncertainty surrounding this tech. Do you really want CA going down the same path as China with this?
  • It's just more lefty, tail wagging the dog, politically correct BS, that's what it is.
    Poor little Johnny might get his feelings hurt.
    F ark little Johnny and his PC world.
  • What are you even talking about? They specifically said the AI wasn't up to snuff and would give too many false positives within demographics other than white men, where the AI works very well. Seems the only one whose feelings were hurt was you. Talk about putting feels before reals... sheesh.
  • Honestly sounds like you're the offended one here.
  • presumption of innocence until proven guilty is not politically correct BS.
    right to privacy is not politically correct BS.
    deterring a nanny surveillance state is not politically correct BS.
  • Gee I'd like some of what you're taking. The war software is for one reason only. To more efficiently kill someone in an overseas country.
  • Microsoft double standard!
    I bet it was LAPD and London...