AI came to Davos. No one actually talked about the models
The AI race looked physical and political in Davos: grid capacity, chip rules, and the fight to control enterprise access and outcomes
Davos Puts AI on a Meter
Davos has always preferred futures you can point at. Lanyards multiply, black cars idle, and the world’s most influential people practice the art of describing inevitability with the calm confidence of those who expect to profit from it.
This year, AI arrived in the Alps with something new: a utility bill.
At the World Economic Forum, the AI conversation didn’t hover around benchmarks, moonshots, or science-fiction timelines. It landed—repeatedly—on constraints. Power. Land. Data access. Security. Governance. And, shadowing every trillion-dollar ambition, the most practical question of all: who actually gets to scale?
Davos treated AI less like magic software and more like a supply chain. The result was an AI week grounded in scarcity, hierarchy, and control.
The New AI Stack: Meter, Keys, Proof
A clear structure emerged in the snow.
At the bottom sat the meter: electricity, grid capacity, cooling, and the data centers that convert capital into compute.
In the middle sat the keys: orchestration layers, permissions, identity, compliance, and the enterprise systems that decide which agents can touch which workflows.
At the top sat the proof: ROI, measurable outcomes, audit trails, and the kill switch that turns an AI initiative from a moonshot into a budget line that survives the next quarter.
Once you see AI this way, Davos feels more zero-sum than last year—not because belief has faded, but because belief has been itemized.
The Meter: Power Is the Constraint
Walk the Davos promenade and you might think AI is still a software story. Talk to executives for five minutes and you realize everyone is mentally staring at a utility bill.
Elon Musk’s most on-brand Davos move was to pivot from AI to solar power, arguing that a small corner of the American Southwest could generate all the electricity the U.S. uses—if policy friction didn’t get in the way. The point wasn’t geography. It was instinct. When AI gets discussed seriously, it turns into an energy and permitting debate.
Amazon CEO Andy Jassy was even more direct: “There is a power shortage.” He described AI labs consuming “gobs and gobs and gobs of power” and framed Amazon’s response as industrial scrambling—investing in small modular nuclear reactors to stay ahead of the constraint. No futurism. No sermon. Just a binding limit.
Satya Nadella put the same reality in Davos-friendly terms. Microsoft, he warned, risks losing “social permission” to consume scarce energy unless AI demonstrably improves real outcomes—health, education, public services, competitiveness. AI now has to earn its place on the grid.
Jensen Huang completed the reframing by calling AI “the largest infrastructure buildout in human history.” He spoke of “AI factories,” deliberately invoking jobs, national capacity, and strategic assets. Infrastructure gets funded. Infrastructure gets defended. Infrastructure gets regulated.
Once electricity becomes the choke point, platform power shifts down the stack. Control starts with who gets to build—and who gets to plug in.
The Keys: Permissions Are Power
Scarcity changes governance. Once AI is physical, it becomes permissioned.
Davos was full of enterprise vendors competing to define the layer that matters most, because whoever controls orchestration and access controls the ecosystem.
Workday’s CEO described his company as “the front door to work.” The logic is straightforward: Workday already manages identity, access, performance, and pay for humans. AI agents are simply a new class of worker. Whoever controls the front door controls accountability.
Salesforce emphasized its army of forward-deployed engineers embedded with large customers, converting bespoke lessons into scalable products. Microsoft positioned its orchestration and data-access tooling as the glue that lets companies unify systems without centralizing everything. Snowflake’s CEO admitted his biggest fear is speed—whether incumbents can move fast enough before model providers move down the stack and displace them.
Once AI touches payroll, procurement, compliance, or customer records, permissions become power. Orchestration becomes a moat.
Geopolitics reinforced the same point. Anthropic CEO Dario Amodei criticized recent policy decisions allowing advanced chips into China, arguing that the AI stack now has a passport. Compute is strategic material. Governments are gatekeepers. A company can lose on intelligence and still win on permissioning.
Even inside firms, power dynamics surfaced. Siemens chairman Jim Hagemann Snabe argued CEOs must act like “dictators” about where AI gets deployed. Tools don’t transform organizations. Decisions do. Scale creates winners and losers—and leadership prefers that process to be controlled.
The Proof: ROI as Enforcement
Once the meter constrains and the keys govern, proof becomes the filter.
EY’s Julie Teigland put it bluntly: “There is no ROI if you’re not willing to change the job descriptions.” Training, role redesign, and organizational change are the real work—and endless pilots are a “death trap.” That’s Davos shorthand for board-level impatience.
ROI is now the enforcement mechanism. It determines which vendors survive procurement cycles, which initiatives keep headcount, and which get quietly labeled “interesting learnings.”
Even AGI talk carried discipline. Demis Hassabis’s 5–10 year timeline invites investment—but also forces companies to prove value in the long middle. When buildouts are expensive and patience is conditional, outcomes matter.
The Bubble Question, Carefully Avoided
No one wanted to say “bubble” out loud.
Larry Fink insisted there is no AI bubble—though he expects big failures alongside huge winners. Meta’s CTO compared the moment to the railroad and fiber buildouts, a reassuring analogy, even as he conceded the scale is a “tremendous land grab” for power and GPUs.
Davos did what Davos always does: it kept existential dread offstage and talked execution onstage. That’s what happens when the people speaking are signing checks for concrete, transformers, and reactors.
AI, Governed
AI didn’t get quieter in the Alps. It got governed. It got physical. It got competitive in the specific way markets get competitive when inputs are scarce and scoreboards are financial.
The platform war now looks less like a beauty contest and more like a fight for choke points. The winners won’t just have the best models. They’ll control the meter, hold the keys, and produce the receipts.
AI’s future still showed up in Davos with confidence. It just also arrived with a compliance checklist—and a demand to see the bill.
