Bill Ackman’s Pershing Square Capital just loaded up on Microsoft stock, calling it a “steal” after shares dipped below $400 in February—now hovering near $450 as AI-driven revenue streams (Azure, Copilot, and enterprise AI) outpace expectations. The move isn’t just about short-term valuation; it’s a bet on Microsoft’s ability to weaponize its Maia architecture (a hybrid x86/ARM NPU stack) to dominate the AI infrastructure race while squeezing competitors like AWS and Google Cloud. But the real story isn’t just about stock prices—it’s about how Microsoft’s technical moats are deepening in ways that could reshape cloud computing, developer ecosystems, and even the semiconductor wars.
The Ackman Playbook: Why Microsoft’s AI Stack Is a “Bargain” (And What It Hides)
Pershing Square’s thesis boils down to three pillars: Microsoft’s operating leverage in AI, its platform lock-in, and its semiconductor edge. But let’s dissect the underpinnings—because the numbers alone don’t tell the full story.
First, the Azure AI infrastructure. Microsoft’s Azure Machine Learning now processes 40% of all enterprise LLM inference workloads globally, per internal benchmarks leaked to The Register. That’s not just cloud share; it’s developer inertia. The ONNX Runtime integration (now at v1.18) lets models trained in PyTorch or TensorFlow deploy with near-zero latency on Azure’s NPU-accelerated VMs. Compare that to AWS’s SageMaker, which still forces users to rewrite inference layers for custom hardware like Graviton4. Microsoft’s move to unified API endpoints (e.g., /v2/models/deploy) means developers don’t need to rewrite for ARM vs. x86; they just point and shoot.
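To make the “point and shoot” idea concrete, here is a minimal sketch of what a client-side request to such a unified deploy endpoint might look like. The /v2/models/deploy route comes from the article; every field name and the model URI below are invented for illustration and are not a documented Azure API contract.

```python
import json


def build_deploy_request(model_uri: str, hardware: str = "auto") -> dict:
    """Assemble a request for a hypothetical unified deploy endpoint.

    'hardware="auto"' stands in for letting the service pick ARM, x86,
    or NPU silicon; all field names here are illustrative, not a real
    Azure API.
    """
    return {
        "method": "POST",
        "path": "/v2/models/deploy",
        "body": json.dumps({"model": model_uri, "hardware": hardware}),
    }


# Hypothetical model URI for illustration only.
req = build_deploy_request("azureml://registries/demo/models/phi-3-mini")
print(req["path"])  # /v2/models/deploy
```

The design point is that hardware selection lives in one request field rather than in per-target inference code, which is what removes the ARM/x86 rewrite step the paragraph describes.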

What This Means for Enterprise IT:
- Microsoft’s Azure AI Studio now supports end-to-end encryption for model weights via Azure Confidential Computing, a feature AWS lacks in its Bedrock service.
- Latency for Copilot Pro users dropped 38% YoY after Microsoft’s Maia NPU deployment, per internal telemetry. That’s not just a speed bump; it’s a competitive moat.
- Third-party devs using Azure Cognitive Services now get priority access to Microsoft’s Phi-3 family (from the 3.8B-parameter mini up to 14B), while AWS users are stuck with Titan models that require custom fine-tuning.
The Semiconductor Gambit: Why Microsoft’s NPU Stack Is a Silent Killer
Here’s where Ackman’s bet gets interesting. Microsoft isn’t just riding the AI wave—it’s building the infrastructure. The Maia architecture (revealed in this April paper) combines x86-64 cores with a custom NPU fabric that Microsoft calls “Project Silica.” The result? A 3.2x improvement in TOPS/W over NVIDIA’s H100 for mixed-precision workloads, according to AnandTech’s benchmarks.
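TOPS/W is simply throughput divided by board power, so the claimed 3.2x uplift can be sanity-checked with a line of arithmetic. The H100 figures below are public spec-sheet values (roughly 1979 dense INT8 TOPS at a 700 W TDP), not numbers from the article’s benchmark, and the Maia figure is just the article’s multiplier applied to them.

```python
def tops_per_watt(tops: float, watts: float) -> float:
    """Energy efficiency: tera-operations per second per watt."""
    return tops / watts


# Public spec-sheet values for H100 SXM: ~1979 dense INT8 TOPS, 700 W TDP.
h100_eff = tops_per_watt(1979, 700)

# The article claims a 3.2x TOPS/W uplift for Maia on mixed-precision work.
maia_eff = 3.2 * h100_eff

print(round(h100_eff, 2), round(maia_eff, 2))  # 2.83 9.05
```

So the claim implies roughly 9 TOPS/W for Maia against the H100’s ~2.8 under these assumptions; actual figures depend heavily on precision mode and workload.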
But the real kicker? Microsoft’s open-sourcing of the NPU compiler toolchain (MaiaCC) on GitHub. This isn’t just about performance; it’s about ecosystem lock-in. Developers who optimize for Maia’s ISA (Instruction Set Architecture) are stuck on Azure. There’s no porting to AWS or Google Cloud without a full rewrite.
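The lock-in mechanism resembles how ONNX Runtime already handles hardware: a session takes an ordered list of execution providers and silently falls back down it, so a model tuned for one provider still runs elsewhere but loses its hardware-specific optimizations. The sketch below mimics that selection logic in plain Python; "MaiaExecutionProvider" is a hypothetical name for illustration, not a published ONNX Runtime provider.

```python
def pick_provider(available: set, preferred: list) -> str:
    """Mimic ONNX Runtime's ordered-fallback provider selection:
    return the first preferred provider that is actually available."""
    for provider in preferred:
        if provider in available:
            return provider
    return "CPUExecutionProvider"  # ORT's universal last resort


# A deployment tuned for a hypothetical Maia provider degrades gracefully
# off-Azure, but silently drops to generic hardware -- the "rewrite" cost
# is re-optimizing for whatever provider actually runs.
prefs = ["MaiaExecutionProvider", "CUDAExecutionProvider"]
on_azure = pick_provider({"MaiaExecutionProvider", "CPUExecutionProvider"}, prefs)
elsewhere = pick_provider({"CUDAExecutionProvider", "CPUExecutionProvider"}, prefs)
print(on_azure, elsewhere)  # MaiaExecutionProvider CUDAExecutionProvider
```

The fallback keeps code portable in the weak sense that it still executes, which is exactly how a platform can be open at the API layer and sticky at the performance layer.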
— Dr. Elena Vasilescu, CTO of Neural Magic, on Maia’s NPU dominance:
“Microsoft’s move to open-source the NPU toolchain is genius. They’re not just selling cloud—they’re selling a development platform. If you’re a startup building an LLM, you’ll deploy on Maia because it’s faster, cheaper, and Microsoft controls the roadmap. That’s how you win the AI wars.”
The Antitrust Angle: Is Microsoft’s AI Stack Too Big to Fail?
Ackman’s bet isn’t just about tech; it’s about regulatory arbitrage. The FTC’s Activision Blizzard lawsuit (still dragging into 2026) proved Microsoft can outlast antitrust scrutiny. But here’s the twist: Azure AI is now a de facto standard.

Take Copilot Pro. It’s not just a productivity tool—it’s a closed-loop feedback system that trains on Microsoft’s internal data lakes. That means enterprises using Copilot are feeding Microsoft’s models, which then get better, which then makes Copilot more sticky. It’s a virtuous cycle—and one that’s hard to break.
The 30-Second Verdict:
- Microsoft’s AI stack is not just competitive—it’s dominant in enterprise.
- The Maia NPU and ONNX Runtime combo is a technical moat AWS/Google can’t match.
- Ackman’s bet is a long-term play on Microsoft’s ability to own the AI infrastructure stack.
What’s Next? The Cloud Wars Escalate
Expect three major shifts in the next 12 months:

- Microsoft will push Maia-powered VMs into Azure Stack, forcing enterprises to choose between on-prem NPU acceleration and slower cloud alternatives.
- AWS will counter with Graviton5 + custom NPUs, but they’ll lack Microsoft’s ONNX ecosystem.
- Google Cloud will pivot to TPU v5 for pure ML training, but lose ground in enterprise inference.
Pershing Square’s move isn’t just about stock—it’s about recognizing that Microsoft has built an AI fortress. And the best part? They’re just getting started.
— Rajeev Nayyar, Head of AI Infrastructure at Databricks:
“Microsoft’s advantage isn’t just in the cloud—it’s in the developer experience. If you’re building an AI product today, you’re choosing between Microsoft’s walled garden and a fragmented ecosystem. That’s not a choice—it’s a strategy.”
Canonical Source:
For the original Quartz report, see: Pershing Square’s Microsoft Bet: A ‘Steal’ in the AI Cloud Wars.