
Microsoft won't sell facial recognition tech to police, echoing Amazon and IBM

Microsoft logo (Image credit: Daniel Rubino / Windows Central)

Microsoft President Brad Smith emphasized today that Microsoft will not sell facial recognition software to police departments in the United States. Smith shared his remarks during a Washington Post Live session, in which he spoke on the role of tech companies and the United States government in regard to facial recognition software. Microsoft has not sold facial recognition software to police departments previously, and Smith explained that the company will not do so until national laws governing the use of the technology are in place.

Smith stated that Microsoft will not sell facial recognition software to police departments in the United States until a national law that is "grounded in human rights" is in place to govern the use of facial recognition technology.

The company president explained that Microsoft has a set of principles in place that led to its decision to not sell facial recognition technology to police departments, stating, "As a result of the principles that we've put in place, we do not sell facial recognition technology to police departments in the United States today."

Smith's statements come at a time when protesters throughout the United States and around the globe are calling for an end to police brutality and racially related violence. Smith believes the current climate should lead people to do more, stating, "I do think this is a moment in time that really calls on us to listen more, to learn more, and most importantly, to do more." He continued, "We need to use this moment to pursue a strong national law to govern facial recognition that is grounded in the protection of human rights."


Microsoft is not the first tech giant to speak out on the use of facial recognition technology; IBM and Amazon have made similar moves. Smith said that the steps taken by Amazon, IBM, and Microsoft in this sphere won't be as effective if other companies don't follow suit. Later, Smith spoke on the role of government in regard to the technology, stating, "We need Congress to act, not just tech companies alone. That is the only way that we will guarantee that we will protect the lives of people."

Sean Endicott

Sean Endicott is the news writer for Windows Central. If it runs Windows, is made by Microsoft, or has anything to do with either, he's on it. Sean's been with Windows Central since 2017 and is also our resident app expert. If you have a news tip or an app to review, hit him up at sean.endicott@futurenet.com.

38 Comments
  • I don't think this has anything to do with racial tensions. This is just a human rights thing.
  • You might want to educate yourself a bit more sir.
  • Facial recognition tech being sold to the government has been a worry for years. This is nothing new. This could easily be a timing thing, or they just finally came out and said it because of the situation. I'm sure a large number of people would have been incredibly angry at them whether they said it before or after everything that happened.
  • Last year I wrote an article here at Windows Central, "Microsoft's moral stance on facial recognition is good for everyone (especially Microsoft)." In the piece I share how Microsoft, in part due to the immaturity of the tech and its high rate of inaccuracies in recognizing dark-skinned individuals and women, was holding back from selling it to potential buyers until widespread regulation was put in place. https://www.windowscentral.com/how-microsofts-ethical-leadership-facial-...
  • One of the many big problems with facial recognition is that technology companies are inherently biased based on the demographics of their employees. Evidently, this tech does a really poor job of recognizing subjects with non-white skin. As an example of the bias in tech, do a Google or Bing search for "unprofessional hair". Let us know if you detect a bias. Now apply that bias to an increasingly militarized police force who themselves have proven to be biased, and the tech goes off the rails at warp speed.
  • Facial recognition isn't going to be used to predict who is going to commit a crime. It would be used to see who has committed crimes. Now with that, yes, the tech is bad when it comes to minorities, but what does that have to do with racial biases if it were in the hands of police or military? It's used to find a singular person, not stereotype. I agree that the government shouldn't have facial recognition, period. But it has nothing to do with race and has everything to do with personal privacy. Giving up rights is easy. Getting them back is near impossible.
  • > Now with that, yes, the tech is bad when it comes to minorities, but what does that have to do with racial biases if it were in the hands of police or military? It's used to find a singular person, not stereotype. That's exactly it though. Because it struggles with racial bias, facial recognition is more likely to match a face to the wrong person. There have been many stories lately where police have raided the wrong houses with tragic consequences, and if they're relying on a faulty/incorrect facial identification, that will only get worse.
  • Or you Americans could just give up your guns! Both citizens and police. I saw the Daniel Shaver video this week. If anything is an advert for you all to disarm, I don't know what is.
  • Oh yes, let's defund police and give up any means of protection. That sounds great! /s
  • As a cop myself, I don't want the tech until it's very reliable. If I had a computer that constantly gave me false returns on information I input into the system, it would do nothing for me. When it comes to taking someone's rights away, even temporarily, we have to be right. I enforce DWI in my state and I know I've let people over the legal limit go because I just didn't have enough probable cause for the arrest. I err on the side of rights, so if the facial tech isn't accurate on dark-skinned people then it's not ready anyhow. We'll keep doing investigations the way we do now.
  • I didn't even know that was a thing..
    Thank you for sharing - that kind of bias in the industry is disturbing.
  • It's both. One does not exclude the other and they have long struggled with racial bias in their AI and CV services as have Amazon, IBM and others. The use of these products in policing when combined with the systems themselves struggling with bias serves to magnify existing systemic oppression by police and other agencies. Stopping service for the police is a first step, but they need to stop selling to governments at all, especially ICE.
  • Oh, good grief. Such overreaction.
  • Unbelievable. This is political, don't you understand what is going on? If you're on the other side they censor you. If you fit their narrative everything is great. Look outside the box and stop being so blind.
  • Does "look outside the box" mean watch Fox News and listen to Alex Jones to you? That's more like stuffing yourself in a box and closing the flaps.
  • No. It means watching Fox AND CNN. Not just one or the other. If all you watch is CNN then you are just as “inside the box” as someone who only watches Fox. As for Alex Jones, he is a moron.
  • Um, how about watching neither because they are both highly problematic? For mainstream news, the BBC, NPR, CBC and others do a much better job than CNN as straight news sources, and are leagues ahead of propaganda like Fox.
  • The BBC is a liberal organisation. It spouts its own brand of left wing propaganda. In fact we don't call them journalists, we call their news people activists. There is even a book about it. A big push will be made in the coming years to reform the BBC.
  • In other words you don't like reality so you assault those who speak to it. Got it.
  • Lmao, all news outlets have their biases, the BBC do not report good news stories about Trump or Brexit, yet both were extremely popular at the time it mattered. There is no balance in the media these days, you have to watch a lot of news stations and try and get a balanced view of what's really happening, watching FOX + BBC might give you a chance, not easy though. Fox may be more biased than the Beeb, but neither are telling the truth.
  • At least Fox News has liberal people... Donna Brazile, Juan Williams and many others are constant contributors, on almost every episode. MSNBC and CNN have nobody, even hyping people like Joe Scarborough as a Republican - what a joke. MSNBC and CNN do more to create havoc and chaos than anybody. People like Al Sharpton do nothing but stoke the black community, acting like every black person is a victim. It's sick really how biased the left media is.
  • The problem I have with most of these articles is that facial recognition software being biased towards certain demographics is not the fault of the institution that is going to use it for possibly identifying people of interest, but of the companies that developed the software with those biases. They act all high and mighty by not letting them use the technology, but it was they themselves that created the tech with those flaws in the first place. How about developing the tech to the point where it does not tend to lean towards certain demographics, at least before going to market with it? I am also concerned, like everyone else, about being big-brothered with this too if it is used in the wrong way or abused.
  • I don't think police forces in democratic countries having facial recognition software is a bad thing, but it needs to have ground rules and maybe some well-thought-out SOPs. I think this is what Microsoft is saying, and so their stated policy makes sense to me.
  • Agreed. Facial recognition/Computer Vision are coming, and already are being used in a number of applications. But we need to carefully consider where they are deployed and what the weaknesses are, and until those weaknesses are mitigated not deploy them in ways that can infringe on civil liberties.
  • Agree. All MSFT and others are asking is for the US Congress to update laws to provide clear controls around new technologies like facial recognition. There have to be bounds established to protect individuals against flawed technology and how government uses this technology to deny someone their constitutional rights. For instance, DNA analysis has come a long way over the last decade. We now use the technology to free people from wrongful prosecution. Hopefully going forward, DNA evidence protects individuals from wrongful prosecution. Will facial recognition technology improve to more accurately identify a specific individual regardless of race? Can it distinguish one twin from another? Will the cameras improve enough to not only identify the shape of a person's head but also the patterns of blemishes (freckles, scars, etc.)? Hopefully. At this point facial recognition may only be able to provide a 95% confidence of positive identification, which is simply not good enough. It is no different than the reliability of eyewitness testimony. Or government may pass a law saying that facial recognition technology cannot prove guilt alone. But I really don't know how evidence is deemed allowable in a court proceeding.
  • MS may not have sold facial recognition to police departments across the US, but that does not mean they have not sold any to the FBI or the US military!
  • Yup, IMO gov sales shouldn't be a thing until this works well and there are clear terms of use (and ways to validate that is all it is used for).
  • Or sell it to the CCP or any other tyrannical government.
  • Not all Microsoft employees are for this... far from it. There is a silent majority of people who are afraid to speak out for fear of the mob calling us racists, bigots, or other names because we don't believe in what they believe in. The fact that facial recognition is being used by criminal organizations and they won't even let the police work on equal ground is disturbing.
  • Sorry, but your comment is off the mark. The constitution protects individual rights against the power of the state by limiting the power of government to imprison an individual. A criminal organization is not governed by law. They will use any tool to advance their illegal activities. MSFT is asking the government to provide the legal basis for how facial recognition software is used by government to convict an individual of a crime. The police have power to impose their will on individuals, as the murder of George Floyd has demonstrated. But many other people in far weaker encounters with the police are still abused. Does facial recognition grant too much power to the police to wrongfully deny individuals their rights? Do we want police to wear a heads-up display that uses faulty facial recognition software to identify a suspect and arrest them when the technology is flawed? Many police cars have license plate readers which can quickly and accurately identify a car used during a crime. For example, your car is stolen. But that does not mean the driver of the car committed a crime. The person driving the stolen car is probably guilty of a crime, but they may not have actually stolen the car. Maybe a friend of the thief is driving the car because the friend thought the thief bought the car. But at least the cops know quite accurately whether they found the right car.
  • I spent enough years at Microsoft to know there is no such 'silent majority'. There are, however, some bigots there afraid of being called what they are.
  • Like all liberals, David, if someone doesn't agree with you on a subject then they're racist? No wonder Trump will get in again. People who are afraid of the left making their lives uncomfortable for speaking the truth are the silent majority. We saw the same in the U.K. over Brexit. Big mouthed left wing politicians. People moaning online. The vote came, and the right got its biggest majority for two decades. Nobody foresaw it. People are quietly fed up of people telling them the sky isn't blue. Bad people do bad things - they are criminals, not victims, for example. People in the U.K. protested after the George Floyd incident. But the truth is our police don't regularly shoot civilians. Our murder rate is high due to black-on-black male murder. But if you say this then you are racist. My black wife would disagree with you.
  • There has never been such a thing as a 'silent majority'. That is a myth. It's what people refer to when all evidence says otherwise so they can stay secure in their own carefully constructed illusion.
  • Tell that to LinkedIn ;).
  • Surely the polls for both Donald Trump and British Brexit show there is a silent majority; that's how they got it so wrong (and with a whole load of faulty statistical analysis). I know lots of people who have opinions on Trump, gun control, Brexit, LGBT treatment and all the issues around Black Lives Matter. I personally have opinions on all, but would never tell anyone or post anything, and I'm probably a moderate, but quite clearly social media is full of nasty people who are happiest when berating others. If you frequented groups on Brexit leading up to the vote, it was shocking how nasty some people were, and it was similar for Trump. Then there is the number of people getting called out for things said in the past that, with today's thinking, would be considered racist or bigoted or whatever. There is no such thing as freedom of speech; leave any trails on the internet and you could potentially lose or fail to get a job. Anyone with any sense would keep quiet, and I'm sure this is what we would call the silent majority.
  • Police in America absolutely kill civilians at an abnormally high rate.
  • They specifically mention police departments in the United States. I would be interested in seeing which police departments in other countries are considered worthy.
  • These gestures are meaningless. Facial recognition has nothing to do with why blacks in this country commit an outsized proportion of crimes, and thus have continuous negative interactions with police. Nor is facial recognition the reason that they have the lowest education success rates. Every one of these virtue signaling opportunities ignores the real issues facing black people in this country. Which is why, since the Civil Rights Era, blacks in this country have actually gone backwards. More children born to single mothers than ever before, fewer father figures in homes, etc. Virtue signaling does nothing that would possibly help the situation.