The news comes in the same week Microsoft received criticism for working with the Chinese military on facial recognition technology. So why does Microsoft allow its software to be used by one organization and not another?

Speaking at Stanford University this week, Microsoft President Brad Smith said the company is worried police will use the technology for inaccurate profiling. He says evidence suggests facial recognition used by police leads to more women and minorities being stopped. “Anytime they pulled anyone over, they wanted to run a face scan,” Smith said of the unnamed law enforcement agency. “We said this technology is not your answer.”

Police departments in California had requested to use Microsoft’s technology in body cameras and on vehicles. Smith said the company also turned down a contract to install facial recognition in cameras throughout an unnamed capital city.
Contradictions
In a way, it is good that Microsoft seems to be having an open debate about the merits and limits of this technology. However, the company also seems unsure how that debate can be resolved. On the one hand, Smith admits the technology can cause problems in some areas; on the other, Microsoft is helping to develop it with the National University of Defense Technology, which is run by the Chinese military. That partnership prompted U.S. senators to urge caution about Microsoft’s involvement this week.

While Smith discussed the moral implications of denying the technology to law enforcement, are those implications not equally apparent in the contracts Microsoft has accepted? Smith said earlier this year that providing facial recognition to governments is not in itself a problem, suggesting Microsoft will continue to do just that.

“I do not understand an argument that companies should avoid all licensing to any government agency for any purpose whatsoever. A sweeping ban on all government use clearly goes too far and risks being cruel in its humanitarian effect.”

“There are certain uses of facial recognition that should cause concern, and should cause everyone to proceed slowly and with caution. That’s certainly what we’re doing, and we’re very worried about situations where facial recognition technology could be used in a manner that would cause bias or discrimination.”