
Microsoft has confirmed providing AI tools and cloud services to the Israeli military during its war in Gaza, including advanced tools via the Azure platform to support hostage recovery and intelligence operations. The company claims its technologies were not used to harm civilians, though it admitted limited visibility into how the tools were deployed. The acknowledgement follows scrutiny from employees and media over Microsoft’s defence contracts and the growing involvement of tech companies in war zones. With the conflict having killed more than 50,000 people in Gaza and Lebanon, Microsoft’s statement raises fresh concerns about AI’s role in military operations and ethical oversight.
Microsoft’s Role in the Gaza War Under Review
Microsoft said it supplied Israel’s military with AI tools, cloud infrastructure, language translation services, and “limited emergency support” after Hamas’ October 7, 2023, attacks. These services were used to locate and rescue hostages and to defend Israeli cyberspace. The company’s internal and external reviews were triggered by staff backlash and an Associated Press investigation, which revealed that Microsoft’s Azure platform was central to processing intelligence that could be cross-referenced with targeting data. Despite providing “special access” beyond its commercial agreements, Microsoft insisted it found no evidence its products were used to inflict unlawful harm.
The firm stated it carefully vetted all assistance and that Israeli authorities were bound by Microsoft’s Acceptable Use Policy and AI Code of Conduct. However, the company admitted it had no control over software use on customer-operated servers or other third-party infrastructure. Critics say this lack of oversight makes Microsoft’s assurances difficult to verify. Meanwhile, the company declined to disclose details on who conducted the review, what the findings were, or whether the Israeli military participated in the investigation. With mounting civilian casualties, Microsoft’s involvement continues to draw global scrutiny and internal dissent.
Employee Pushback and Ethical Concerns Grow
Microsoft’s partnership with Israel has sparked intense backlash from within. The group “No Azure for Apartheid,” made up of current and former Microsoft employees, has demanded that the company release its full investigation report. Hossam Nasr, a former employee fired after organizing a Gaza vigil, said Microsoft’s statement was a PR attempt rather than real accountability. Critics argue that Microsoft is obscuring the nature of its assistance and sidestepping the ethical implications of supporting a military campaign that has killed tens of thousands. Civil liberties groups, including the Electronic Frontier Foundation, applauded the rare transparency but said it falls short.
Executive Director Cindy Cohn noted that Microsoft is still withholding specifics on how its AI tools are being used in targeting decisions. The military use of commercial AI, which is prone to errors and bias, has raised alarms globally. The Israeli military also contracts with Amazon, Palantir, and Google, all of which face similar ethical scrutiny. Microsoft’s assertion that it has seen “no violations” of its policies is difficult to verify without independent oversight. As U.S. tech firms become more deeply embedded in global conflict zones, critics warn that commercial AI is increasingly shaping who lives and who dies.
AI Tools: Transparency Amid Escalating Accountability Demands
Microsoft’s admission sheds rare light on how powerful AI tools are being used in modern warfare, yet serious questions remain about oversight, accountability, and ethical boundaries. While Microsoft claims it followed internal principles and did not directly enable harm, the company’s limited visibility raises doubts about how meaningful its safeguards really are. As AI becomes central to military operations around the world, transparency alone is not enough. Employees, rights groups, and the public are demanding more: clear rules, independent audits, and firm limits on how AI can be weaponised. The future of responsible AI may hinge on how companies respond.