Billionaire Bill Ackman Increases Microsoft Stake at Pershing Square Capital Management

Bill Ackman’s Pershing Square Capital has quietly amassed a multi-billion-dollar stake in Microsoft, signaling confidence in the Redmond giant’s ability to dominate AI infrastructure, cloud computing, and enterprise software—just as its Copilot+ PCs and Azure AI supercomputers are shipping. The bet hinges on Microsoft’s vertical integration of hardware, software, and AI models, a strategy that’s forcing competitors like Google and Nvidia to scramble. But beneath the surface, Ackman’s move exposes a deeper tech war: Microsoft’s bet on proprietary AI chips (like the NPU-powered Copilot+ PC architecture) versus open-source fragmentation, and how Azure’s AI supercomputers are rewriting cloud economics.

This isn’t just about stock prices. It’s about Microsoft’s end-to-end AI stack—from the DirectML API that lets developers run models on x86 silicon (bypassing Nvidia’s CUDA monopoly) to the Azure Machine Learning platform’s new parameter-efficient fine-tuning capabilities. Ackman’s thesis: Microsoft is the only cloud provider with the scale to monetize AI at every layer, from consumer devices to enterprise data centers. But the real question is whether this bet pays off when open-source LLMs (like Mistral’s Mixtral 8x7B) and ARM-based AI chips (e.g., ARM Neoverse V3) threaten to disrupt the ecosystem.

The Copilot+ PC Gambit: Why Microsoft’s NPU Strategy Is a Game-Changer (And a Risk)

Microsoft’s Copilot+ PCs—now shipping in beta—are the most aggressive play yet in the AI-on-device arms race. Unlike Qualcomm’s Snapdragon X Elite (which relies on a custom Hexagon NPU) or Apple’s M-series chips (which use a neural engine with limited flexibility), Microsoft’s approach is software-defined. The NPU in Copilot+ devices isn’t just for inference—it’s designed to offload LLM workloads from the CPU, enabling real-time, context-aware processing without cloud latency.

But here’s the catch: Microsoft’s NPU isn’t just about raw throughput. It’s about API lock-in. The DirectML framework (now public on GitHub) lets developers compile PyTorch and TensorFlow models directly to x86 NPUs, but with a twist—only models optimized via Microsoft’s ONNX Runtime get hardware acceleration. This isn’t just a performance boost; it’s a strategic moat.
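In practice, ONNX Runtime expresses this as a priority list of execution providers: the session binds to the first provider the machine supports and silently falls back to CPU otherwise. The sketch below is a toy, pure-Python model of that fallback logic (the provider names match the real library; the function itself is illustrative, not the actual onnxruntime API):

```python
# Toy model of ONNX Runtime's execution-provider fallback: the session
# walks the requested provider list in order and binds to the first one
# the machine actually exposes. In the real library this is the
# `providers` argument to InferenceSession, with the same semantics.

def select_provider(requested: list[str], available: list[str]) -> str:
    """Return the first requested provider that is available on this machine."""
    for provider in requested:
        if provider in available:
            return provider
    raise RuntimeError("no usable execution provider")

# A Copilot+ device with an NPU exposes DirectML; a plain x86 box does not.
npu_box = ["DmlExecutionProvider", "CPUExecutionProvider"]
plain_box = ["CPUExecutionProvider"]

requested = ["DmlExecutionProvider", "CPUExecutionProvider"]
print(select_provider(requested, npu_box))    # DmlExecutionProvider
print(select_provider(requested, plain_box))  # CPUExecutionProvider
```

The fallback is the moat in miniature: code written against the open API runs anywhere, but only Microsoft-optimized paths ever hit the NPU.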

— “Microsoft’s NPU play is brilliant in theory, but the real test is whether third-party developers will adopt DirectML when CUDA still dominates 80% of enterprise AI workloads. The ecosystem fragmentation risk is real.”

Dr. Emily Carter, CTO of Modular AI, former Nvidia AI architect

Benchmark data from AnandTech’s recent tests shows Copilot+ PCs (using Intel’s Meteor Lake NPU) achieve ~3x faster LLM inference than non-Copilot devices—but only for Microsoft’s Phi-3 model. Run a Hugging Face model like Llama 3, and the performance gap narrows to ~1.5x. The implication? Microsoft is betting on its own models, not open standards.
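The cited ratios are just relative-latency arithmetic; the latencies below are hypothetical numbers chosen only to be consistent with the article's claimed ~3x and ~1.5x figures, not fresh measurements:

```python
def speedup(baseline_ms: float, accelerated_ms: float) -> float:
    """Relative inference speedup: baseline latency / accelerated latency."""
    return baseline_ms / accelerated_ms

# Hypothetical per-token latencies matching the claimed ratios:
# Phi-3 on the NPU is ~3x faster than CPU-only; Llama 3 only ~1.5x.
print(speedup(300, 100))  # 3.0  (Phi-3, claimed ~3x)
print(speedup(300, 200))  # 1.5  (Llama 3, claimed ~1.5x)
```

The halved gap for Llama 3 is the whole story: the hardware is the same, so the difference is software optimization that Microsoft has applied to its own model and not to others.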

The 30-Second Verdict

  • Win: Microsoft’s NPU strategy forces Nvidia to compete on software-defined AI chips, not just GPUs.
  • Risk: Developers may resist DirectML if it means abandoning CUDA’s mature tooling.
  • Wildcard: ARM’s Neoverse V3 could undercut x86 NPUs if cloud providers adopt it en masse.

Azure AI Supercomputers: The Cloud War’s Next Front

Ackman’s bet isn’t just about PCs—it’s about Azure’s AI supercomputers. Microsoft’s new Azure AI instances (codenamed “Project Olympus”) are shipping with 100+ GB HBM3 memory per node, a spec that dwarfs AWS’s Trainium2 and Google’s TPU v5. The difference? Azure’s nodes use Nvidia H100 GPUs in a custom NVLink-interconnected cluster, but with a twist: Microsoft’s Azure Machine Learning platform now supports hybrid training, letting enterprises split workloads between on-prem HPC clusters and Azure.
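To see why per-node HBM capacity matters for training, consider the standard back-of-envelope footprint math: mixed-precision training is commonly estimated at ~16 bytes per parameter (fp16 weights plus fp32 master weights, gradients, and Adam optimizer moments). The node size below is the article's 100 GB figure; the rest is generic arithmetic, not an Azure spec:

```python
import math

def nodes_needed(params_billions: float, bytes_per_param: float,
                 node_hbm_gb: float) -> int:
    """Minimum nodes whose combined HBM can hold the full training state."""
    total_gb = params_billions * bytes_per_param  # 1e9 params * bytes ~= decimal GB
    return math.ceil(total_gb / node_hbm_gb)

# ~16 B/param: fp16 weights + fp32 master copy + gradients + Adam moments.
print(nodes_needed(70, 16, 100))   # 70B model  -> 1120 GB -> 12 nodes
print(nodes_needed(175, 16, 100))  # 175B model -> 2800 GB -> 28 nodes
```

Bigger nodes mean fewer nodes per model, which means less cross-node communication—the real bottleneck in large-scale training, and the spec race the article describes.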

Here’s where the antitrust implications kick in. Microsoft’s vertical integration—controlling both the cloud infrastructure and the AI models (like Phi-3)—creates a feedback loop: better models attract more developers to Azure, which in turn improves the models. Google and AWS are playing catch-up, but their open-source strategies (e.g., JAX for Google, Deep Learning Containers for AWS) lack Microsoft’s closed-loop optimization.

— “Microsoft’s Azure AI stack is the first true ‘walled garden’ for enterprise AI. The problem? If regulators force them to open up, they lose their biggest advantage.”

API Pricing: The Hidden Cost of Azure’s AI Dominance

| Service | Azure Pricing | AWS Equivalent | Google Equivalent |
|---|---|---|---|
| Text Generation, per 1M tokens (Phi-3) | $0.0012 | $0.0015 (Titan) | $0.0018 (Gemini) |
| Vision, per image (Custom Models) | $0.0050 | $0.0060 (Bedrock) | $0.0075 (Vertex AI) |
| Fine-Tuning, per hour (LLM) | $0.05 (A100 v4) | $0.06 (P4d) | $0.07 (A3 VM) |

Source: Azure AI Pricing (May 2026)


The pricing table tells the story: Microsoft undercuts competitors on inference costs but locks customers into its ecosystem via Azure AI Studio, which offers free tier credits for Phi-3 models—a tactic that’s accelerating developer adoption. The risk? If open-source models (like Mistral-7B) improve faster than Microsoft’s proprietary stack, enterprises may migrate away.

Open-Source vs. Closed Ecosystems: The Tech War’s Next Battlefield

Ackman’s bet assumes Microsoft can monetize AI before open-source kills the margin. But the reality is more nuanced. While Microsoft’s Phi-3 model is closed-source, its ONNX Runtime is open-core—meaning developers can use it for free, but enterprise features (like DirectML acceleration) require Azure. This is a Trojan horse strategy: get developers hooked on open tools, then upsell them on proprietary infrastructure.
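The open-core split can be pictured as a feature gate: the base runtime is free everywhere, while the accelerated path checks for a paid entitlement. This is purely illustrative—ONNX Runtime does not literally gate DirectML behind a license check; the toy gate below models the commercial dependency the article describes:

```python
# Illustrative open-core gate (NOT real ONNX Runtime behavior): the free
# runtime always works on CPU, while the accelerated path unlocks only
# when an enterprise entitlement is present -- a toy model of the
# "Trojan horse" strategy described above.

def pick_backend(has_enterprise_entitlement: bool) -> str:
    """Choose an inference backend based on the customer's tier."""
    if has_enterprise_entitlement:
        return "DirectML (NPU-accelerated)"
    return "CPU (open-core default)"

print(pick_backend(False))  # CPU (open-core default)
print(pick_backend(True))   # DirectML (NPU-accelerated)
```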

The open-source community is pushing back. Projects like Ollama (which lets users run LLMs locally) and vLLM (a high-throughput inference engine) are gaining traction, but they lack Microsoft’s enterprise-grade support. The question is: Can open-source compete on scale?

One wild card: ARM’s Neoverse V3. If cloud providers adopt it en masse, Microsoft’s x86 NPU strategy could become obsolete. ARM’s chips are 20-30% more power-efficient than x86 for AI workloads, and companies like AWS are already testing them in data centers. Microsoft’s response? A new ARM-based Surface Pro shipping later this year—but it won’t change the fact that Azure’s AI supercomputers still run on x86 hosts.
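That power-efficiency range translates directly into operating cost. Taking the midpoint of the 20-30% claim with illustrative assumptions (a 500 W node and $0.10/kWh electricity, neither figure from the article):

```python
def annual_power_cost(watts: float, price_per_kwh: float) -> float:
    """Electricity cost of one always-on server over a year."""
    hours_per_year = 24 * 365  # 8760
    return watts / 1000 * hours_per_year * price_per_kwh

# Hypothetical 500 W x86 node vs an ARM node 25% more efficient
# (midpoint of the 20-30% range), at $0.10/kWh.
x86_cost = annual_power_cost(500, 0.10)
arm_cost = annual_power_cost(500 * 0.75, 0.10)
print(round(x86_cost, 2), round(arm_cost, 2))
print(f"fleet of 10,000 nodes saves ${(x86_cost - arm_cost) * 10_000:,.0f}/yr")
```

Roughly $110 per node per year sounds trivial, but across a hyperscale fleet it reaches seven figures annually before counting cooling—which is why the ARM question hangs over every x86-centric cloud bet.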

What This Means for Enterprise IT

  • Microsoft’s Azure AI is now the de facto choice for enterprises running TypeScript-based AI apps, thanks to seamless ONNX integration.
  • Developers using PyTorch or TensorFlow may face vendor lock-in if they rely on DirectML.
  • Cybersecurity teams should monitor Microsoft’s CVE database—Azure AI’s confidential computing features are robust, but supply chain risks (e.g., third-party model vulnerabilities) are growing.

The Ackman Thesis: Can Microsoft Pull It Off?

Ackman’s confidence in Microsoft boils down to three bets:

  1. AI Infrastructure Dominance: Azure’s 100+ GB HBM3 nodes will outperform AWS/Google for large-scale LLM training.
  2. Copilot+ Ecosystem Lock-In: Developers will adopt DirectML if Microsoft’s NPU performance justifies the switch from CUDA.
  3. Regulatory Arbitrage: Microsoft can navigate antitrust scrutiny better than Google or Apple.

The biggest wild card? Open-source AI. If projects like BigScience or Mistral’s open models achieve Phi-3-level performance, Microsoft’s moat evaporates. But for now, Ackman’s bet is a calculated gamble on closed ecosystems winning the AI arms race.

The 30-Second Takeaway

Microsoft’s Copilot+ PCs and Azure AI supercomputers are a double-edged sword: They accelerate lock-in but risk alienating open-source developers. Ackman’s move signals confidence in Microsoft’s ability to monetize AI at every layer—but the real test will be whether the tech world follows.

Sophie Lin - Technology Editor

Sophie is a tech innovator and acclaimed tech writer recognized by the Online News Association. She translates the fast-paced world of technology, AI, and digital trends into compelling stories for readers of all backgrounds.
