Microsoft CEO: Selling HoloLens to Military Is a ‘Principled Decision’

Last week, Microsoft employees sent a letter of protest to their own company, claiming that Microsoft had engaged in war profiteering when it signed a deal with the US military to provide HoloLens technology to soldiers in the field. While close cooperation between the US military and Microsoft is nothing new, HoloLens will be directly incorporated and used in combat scenarios. When the US military announced the deal, it stated the device was intended to “increase lethality by enhancing the ability to detect, decide and engage before the enemy.” The Army calls this system IVAS, for Integrated Visual Augmentation System.

This appears to be the first time that Microsoft has moved from developing products that are used by the military to developing products that can be directly used to kill. In an interview with CNN Business, Microsoft CEO Satya Nadella dismissed the issue, saying:

We made a principled decision that we’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy. We were very transparent about that decision and we’ll continue to have that dialogue.

The 250+ employees who have signed the letter disagree with this characterization of the process. The letter states:

Although a review process exists for ethics in AI, AETHER, it is opaque to Microsoft workers and clearly not robust enough to prevent weapons development, as the IVAS contract demonstrates. Without such a policy, Microsoft fails to inform its engineers on the intent of the software they are building…

Brad Smith’s suggestion that employees concerned about working on unethical projects ‘would be allowed to move to other work within the company’ ignores the problem that workers are not properly informed of the use of their work. There are many engineers who contributed to HoloLens before this contract existed, believing it would be used to help architects and engineers build buildings and cars, to help teach people how to perform surgery, or play the piano, to push the boundaries of gaming… These engineers have now lost their ability to make decisions about what they work on, instead finding themselves implicated as war profiteers.

“AETHER,” in this context, refers to the group of experts Microsoft created to evaluate ethics in AI and to ensure it was “used responsibly.”

A Matter of Ethics

I’ll acknowledge that I’m sympathetic to the ethical argument the Microsoft programmers and engineers are making here. Many of the IT-related tasks undertaken by the military are not conceptually different from those you’d find in a commercial or even a consumer context. Radio signals need to penetrate buildings. Networked computers need to communicate efficiently. The specific needs and requirements differ, but the underlying principles are much the same.

Image by Microsoft

But there’s a difference between building a product that people you disagree with will use and building a product to accomplish a goal you abhor. Many of the scientists who worked on the Manhattan Project, including J. Robert Oppenheimer, became active in the nuclear non-proliferation movement precisely because they were troubled by the ethical implications of their own participation in the development of atomic weapons and the fundamental threat of extinction a nuclear war presented. Two years after the bombs fell on Hiroshima and Nagasaki, Oppenheimer stated:

Despite the vision and farseeing wisdom of our wartime heads of state, the physicists have felt the peculiarly intimate responsibility for suggesting, for supporting, and in the end, in large measure, for achieving the realization of atomic weapons. Nor can we forget that these weapons, as they were in fact used, dramatized so mercilessly the inhumanity and evil of modern war. In some sort of crude sense which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin; and this is a knowledge which they cannot lose.

One can quibble over the degree to which scientists, engineers, and inventors should feel responsible for the uses of the weapons or technologies they create. That many of them felt, and still feel, some responsibility is not in question.

Nadella’s response to his employees reflects either a failure to consider these issues or an insulting dismissal of them. To repeat: “We made a principled decision that we’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy.”

First, Nadella’s response conflates withholding technology from the government with withholding it from the military. The military is not an elected institution in any sense of the word. Furthermore, one of the major criticisms of Congress across the 20th century, under both Democratic and Republican leadership, has been its unwillingness to intervene in various military excursions and activities authorized by the President. Post-Vietnam War reforms intended to rein in this tendency of the so-called ‘Imperial Presidency’ largely failed to do so. Meanwhile, the intelligence failures used to justify the entire Iraq War should stand as a reason to be dubious of any handwaved appeal to the supposedly perfect decision-making capabilities of democratically elected governments.

Nadella’s response assumes that all such actions are always justified and that the end justifies the means. The actual links between various US military actions over the past 70 years and the degree of protection they offered to “the freedoms we enjoy” are vastly more nuanced and particular to the conflicts in question.

Nadella’s answer further assumes a degree of uniform agreement that simply does not exist among the US electorate or, I daresay, among Microsoft employees. His statement implies that leaders always make the right decisions and that every military action can be justified by an appeal to “the freedoms we enjoy,” simply because said political leaders were elected. There’s no indication that the company or its CEO has even attempted to engage with the ethical ramifications of creating something only to see your creation used to kill or harm people in a manner you never intended. This is not a principled decision. It’s an unquestioned assumption. It’s the kind of unquestioned assumption someone might make if you start with the idea that the Army paying $479M for HoloLens is a convenient moral good and work backward from there.

There are at least 250 Microsoft employees who do not feel the company was forthcoming about its intent to sell this technology directly to the military. That’s not many relative to the size of the entire firm, but it could still be a significant share of the people working on HoloLens. People have the right to make a decision about what kind of projects they want to devote their lives to creating. Not every nuclear scientist joined the non-proliferation movement. There are, I’m certain, HoloLens employees who have no problem with the idea that their work might be used to kill people. The ones who do care have every right to be upset about the way their work is being used.

While we can’t speak to how HoloLens was communicated internally, nothing in Microsoft’s 2015-2016 public guidance and demonstrations of HoloLens reflected the goal of using it to directly improve combat lethality. The DoD contract raised enough eyebrows that we noted this issue in our initial coverage of the announcement. Allowing people to move to new projects after you’ve decided to sell their work to the military to improve lethality does nothing to address the ethical burden they may feel they have incurred by contributing to such a project in the first place.

Now Read:

  • Microsoft Launches $3,500 HoloLens 2 Headset
  • Microsoft Wins $480M Contract to Provide HoloLens to US Military
  • Microsoft wants to bring HoloLens to the consumer market once the technology matures

