The Expectation Escalator: AI Was Supposed to Save Us Time. So, Why Are We More Burned Out?
Remember when AI was going to give us our evenings back?
The vision was genuinely appealing: let the machine handle the repetitive stuff, free yourself up for the work that actually matters, maybe even close your laptop at a reasonable hour. A lot of us bought into that story. And honestly, why wouldn't we? It sounded great.
Here's what happened instead. A founder I know mentioned recently — almost in passing — that his team's AI engagement had dropped. He said it the way someone mentions a missed sales target. Flat. Disappointed. A little pointed. And right there, in that casual remark, was everything that had gone sideways with the promise.
Companies aren't just encouraging AI use anymore. They're tracking it. Interactions per day. Engagement scores. Usage dashboards are pulled up in front of the whole team. The tool that was supposed to lighten the load had become another thing to keep up with.
Sound familiar? There's a name for this.
A UC Berkeley study found that people are actually working longer hours since AI tools became widespread, not shorter. The pressure didn't go away. It just shifted.
I call this the expectation escalator — and once you see it, you can't unsee it. Here's how it works: a new tool makes something faster or easier, and instead of banking that time, organizations quietly raise the baseline. What used to be impressive is now just... expected. The floor moves up, and nobody announces it. It just happens.
It's workplace lifestyle creep. Except instead of upgrading your apartment, you're absorbing more work at the same salary.
But isn't the work at least more interesting now?
This is the part where people push back, and it's a fair objection, up to a point. Yes, a lot of the grunt work has genuinely gotten easier. But there's a difference between work that's fun and work that's restful, and AI has blurred that line in ways that quietly wear people down.
Think about what "reviewing AI output" actually involves. You're catching confident-sounding errors. You're making the 80%-done draft actually good. You're deciding what to trust and what to question. That's not easy background work — that's judgment-heavy, high-stakes thinking, running more or less constantly. Your calendar might look manageable. Your nervous system knows otherwise.
Meanwhile, the bar for what individuals are expected to produce keeps rising. Product managers are prototyping their own features. Marketing teams are rebuilding websites in-house. CEOs are demoing weekend projects at Monday's all-hands meetings. When one person can do in an afternoon what used to take a specialist a week, everyone in the building quietly starts doing the math on their own value.
The escalator in its purest form
If you want a vivid example of where this all leads, meet Patty — Burger King's AI-powered headset assistant. (Yes, really. Patty.) She flags inventory issues, updates menus when items sell out, and — here's the part that made me do a double-take — monitors how often employees say "please" and "thank you."
We have optimized the burger-making process so completely that the next frontier is measuring human warmth in real time. Efficiency always finds a new ceiling. It just keeps moving.
One CEO made headlines celebrating when his team's AI spend jumped tenfold. He called it a "come to Jesus moment" — proof his people had finally gotten on board. Spending more on the tool that was supposed to save money had become the success metric. The efficiency gain didn't reduce demand. It fed it.
So what do you actually do with this?
First, the honest part: you can't opt out of these tools, and you probably don't want to. They genuinely do make a lot of things easier, and pretending otherwise won't help you.
But you can name what's happening. That's more powerful than it sounds. When you can see the escalator, you can start making intentional choices about it — rather than just being carried along by it.
That might look like auditing where "efficiency gains" on your team have quietly become new expectations, and consciously deciding which ones are worth pushing back on. It might look like protecting your people from the next round of scope creep disguised as innovation.
The expectation escalator isn't a personal failing. It's not a sign that you're using AI wrong. It's an organizational pattern as old as the assembly line, just wearing a sleeker interface now. It has always moved in one direction. It has always moved on its own — until someone with a bit of authority decides it doesn't have to.
That part isn't a technology problem. It never was. It's a people one.