Microsoft AI Chief Leaks Walmart Plans During Palestine Protest

Confidential AI plans accidentally revealed as Microsoft faces growing backlash over Israeli contracts.
In a surreal moment at Microsoft’s Build conference, a protest against the company’s involvement in Israel’s war on Gaza led to an unexpected data leak. During a disrupted session on AI security, Microsoft’s own head of AI security, Neta Haiby, accidentally shared internal Teams messages detailing Walmart’s confidential AI strategy.
🔍 What Was Leaked?
Haiby was presenting a session on best practices for AI security alongside Microsoft’s head of responsible AI, Sarah Bird, when protesters interrupted the event. In the aftermath, Haiby mistakenly shared her screen, displaying Microsoft Teams chats revealing:
- 🛒 Walmart is preparing to use Entra Web and Microsoft’s AI Gateway.
- 💬 A Walmart engineer praised Microsoft’s AI edge over Google.
Walmart is one of Microsoft Azure's largest enterprise clients, and the leak suggests the retailer is deepening its AI integration, raising questions about how these technologies may be used.
🔥 Why Were There Protests?
The leak came moments after a protest by two former Microsoft employees, Hossam Nasr and Vaniya Agrawal. Nasr accused Microsoft of fueling genocide in Palestine through cloud contracts with Israel's Ministry of Defense.
"How dare you talk about responsible AI when Microsoft is fueling the genocide in Palestine," Nasr said during the session.
This marks the third protest at the Build conference, signaling rising discontent among tech workers over military ties.
🧱 Microsoft’s Response: Denial and Deflection
Microsoft says internal and external reviews found:
- 🤖 "No evidence" that Azure or AI tools were used to harm civilians.
- 🧾 The contract with Israel’s defense ministry is a "standard commercial relationship."
Critics argue this is corporate gaslighting.
🛡️ Ethics vs. Profits
The leak and protest highlight growing concerns about:
- 🧬 How AI might be used in surveillance or warfare.
- 🧼 The ethics of selling tech to governments accused of war crimes.
Companies like Microsoft and Walmart are under pressure to balance innovation with responsibility. But critics say they’re choosing profits over people.
🧨 What This Means for AI and Activism
- 🚨 Transparency in corporate AI use is more urgent than ever.
- 📢 Tech worker dissent is rising—and becoming harder to silence.
- 🧭 AI ethics can’t be a side project. It must be core to business.
🐾 What You Can Do: Join the Boycat Movement
Boycotting unethical tech use isn’t a trend—it’s a duty.
📲 The Boycat app helps you:
- Identify companies complicit in occupation and apartheid
- Track protest wins
- Support ethical innovation
This story isn’t just about AI. It’s about accountability.