There's a particular kind of task that defines early careers in white-collar work. Drafting meeting minutes at 5pm. Reconciling a spreadsheet where two columns stubbornly refuse to agree. Chasing down a citation that somehow links to the wrong source. Proofreading a slide deck until every comma is in its right place.
Nobody loves this work. But it was never just busywork.
When a junior analyst painstakingly formats a dataset, or a new consultant irons out the logic in a proposal deck, something quieter is happening beneath the surface: they're absorbing standards. Internalizing what "good" looks like. Training an eye for the kind of subtle inconsistency that no checklist can fully capture. The drudgery, unglamorous as it was, doubled as an apprenticeship.
AI is now absorbing much of that drudgery — and that's where things get complicated.
When Autopilot Weakens the Pilot
Aviation researchers have long documented a troubling pattern: pilots who rely heavily on autopilot systems gradually lose the manual flying instincts that make them effective in emergencies. The skill doesn't disappear overnight. It quietly atrophies from disuse.
White-collar work is facing a version of the same problem.
When people delegate unfamiliar tasks to AI rather than wrestling with them directly, research suggests they don't build the conceptual framework needed to supervise, troubleshoot, or improve on that work. In controlled studies, learners who offloaded tasks to AI performed measurably worse on deeper comprehension measures than peers who engaged with the same tasks hands-on.
For professionals whose long-term value depends on judgment, pattern recognition, and strategic instinct, this isn't a footnote. It's a slow structural risk.
If AI drafts every client memo, the junior lawyer may never develop an intuitive feel for legal argument structure. If an analyst outsources her charts, she may never train the eye that catches an anomaly before it becomes an error that reaches a client. The output looks fine. The capability underneath it quietly hollows out.
De-Skilling at Scale
Economists have a name for this dynamic: de-skilling. It's the process by which automation doesn't just replace labor — it gradually degrades the expertise of the humans still nominally in charge.
In white-collar contexts, the mechanism is subtle. AI compresses complex, judgment-intensive tasks into polished outputs that require minimal interpretation to consume. That's the pitch, and it's often accurate. The problem is that consuming a polished output is not the same as understanding what makes it right — or catching what makes it wrong.
An AI-generated slide deck with a quietly misaligned argument. A financial model with a subtly flawed assumption baked into row seven. These are the kinds of errors that a seasoned professional catches because something feels off — a diagnostic instinct built from years of doing the boring work themselves. Without that foundation, the draft looks clean and the error ships.
The Case for Intentional Use
None of this is an argument against AI. Used well, it genuinely frees professionals from low-leverage repetition and creates space for the work that requires human judgment: strategy, relationship-building, ethical reasoning, creative problem-solving.
The issue isn't AI. It's unreflective dependence on it.
The professionals who will thrive in this era will be those who treat AI outputs as drafts to be interrogated, not conclusions to be accepted. They'll ask the follow-up questions AI can't answer without human context. They'll use AI to accelerate their thinking, not to replace it. They'll stay close enough to the detail that they still know when something is wrong, even if they can't immediately say why.
The future of white-collar work isn't about preserving every skill from the pre-AI era. Many of those skills will — and should — give way. But the skills that remain will matter more than ever: the ability to think strategically, reason ethically, navigate genuine ambiguity, and know when the clean output in front of you is hiding a problem.
Speed and output are rising across every profession. The quieter question, worth asking now, is what happens to depth and capability if nobody is paying attention.
That trade is worth debating before it's already made.
