The Cost of Seniority: When Your Best Engineers Stop Coding
Spotify's senior engineers haven't written code since December. They supervise AI instead. It sounds like efficiency—but what happens when the path to senior engineer no longer includes writing code?
On February 10, Spotify co-CEO Gustav Söderström told investors something that sent shockwaves through developer communities: the company's most experienced engineers "have not written a single line of code since December." They generate code through AI and supervise it. That's it.
The workflow is almost comically frictionless. An engineer opens Slack during their morning commute, tells Claude to fix a bug in the iOS app, gets a compiled version pushed back on Slack to test, and merges to production—all before arriving at the office. The tool handling this? Honk, an internal system built on Anthropic's Claude Code that now merges more than 650 agent-generated pull requests per month.
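Spotify hasn't published Honk's internals, so the loop described above can only be sketched. Here is a minimal toy model of that shape of pipeline — natural-language request in, agent-generated patch, CI build posted back, human approval gate before merge. Every class, method, and URL below is hypothetical; the real system is built on Anthropic's Claude Code and Slack.

```python
from dataclasses import dataclass


@dataclass
class AgentRun:
    """One request-to-merge cycle (hypothetical structure)."""
    prompt: str
    diff: str = ""
    build_url: str = ""
    approved: bool = False


class HonkLikePipeline:
    """Toy model of a Slack-driven coding-agent workflow. All names are
    invented for illustration; this is not Spotify's implementation."""

    def __init__(self) -> None:
        self.merged: list[AgentRun] = []

    def handle_slack_message(self, text: str) -> AgentRun:
        # 1. The engineer's natural-language request becomes an agent task.
        run = AgentRun(prompt=text)
        # 2. A coding agent (e.g. Claude Code) would produce a diff here;
        #    we stub it with a placeholder string.
        run.diff = f"# patch for: {text}"
        # 3. CI builds the patched app and posts an installable build
        #    back to the same Slack thread for the engineer to test.
        run.build_url = "https://builds.example.com/ios/12345"
        return run

    def approve_and_merge(self, run: AgentRun) -> None:
        # 4. The human appears only at the review gate: approve, merge.
        run.approved = True
        self.merged.append(run)


pipeline = HonkLikePipeline()
run = pipeline.handle_slack_message("fix the crash on the iOS now-playing screen")
pipeline.approve_and_merge(run)
print(len(pipeline.merged))  # 1
```

The point of the sketch is where the human sits: step 4 is the only place an engineer touches the cycle, which is exactly the "supervisor, not author" role the rest of this piece worries about.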
The market loved it. Spotify's Q4 numbers were stellar: 751 million monthly active users, 290 million premium subscribers, €701 million in operating income. Söderström framed this as the beginning of something enormous, predicting that software companies "will start producing enormously more amount of software."
Developers heard something different. Within 48 hours, the story had racked up over 14,000 upvotes on Reddit. The consensus? This isn't just about productivity—it's about what happens when the fastest path to shipping code isn't the same as the path that builds expertise.
The Assembly Line Problem
Here's what's nagging at me: if Spotify's senior engineers are supervising AI output, when did they stop being engineers and start being reviewers?
Software engineer Siddhant Khare, who builds AI agent infrastructure, described this shift as "AI fatigue." In an essay that resonated across the industry, he wrote about how developers are quietly moving "from creator to code reviewer on an assembly line." The AI doesn't get tired between problems. Humans do.
And the data backs up the skepticism. A study from Anthropic—the same company whose Claude Code powers Spotify's Honk—found that developers using AI coding assistants scored 17% lower on comprehension tests, even as they completed tasks faster. Let me say that again: faster and worse at understanding what they built.
Another analysis found that AI-heavy codebases see 39% more code churn—more code written, more code rewritten, more code thrown away. Research examining AI-generated pull requests found they produce 1.7x more issues overall: 10.83 issues per PR versus 6.45 for human-written code. According to a Sonar developer survey, 61% of developers agree that AI often produces code that looks correct but isn't reliable.
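The headline multiplier follows directly from the per-PR counts reported above, and it's worth sanity-checking — 10.83 issues per AI-generated PR against 6.45 for human-written code:

```python
# Per-PR issue counts as reported in the pull-request analysis cited above.
ai_issues_per_pr = 10.83
human_issues_per_pr = 6.45

# The ratio rounds to 1.68, which the analysis reports as "1.7x more issues."
ratio = ai_issues_per_pr / human_issues_per_pr
print(round(ratio, 2))  # 1.68
```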
The Missing Middle
There's a detail in the Spotify story that keeps bothering me. Between 2023 and 2025, the company cut roughly 20% of its workforce—about 1,500 people. CEO Daniel Ek later admitted the layoffs "disrupted operations more than anticipated."
So the timeline reads: fire a fifth of your company, then announce the remaining engineers don't write code anymore.
Is this because AI made those 1,500 people redundant? Or did the engineers who stayed become supervisors because there was no one left to do the hands-on work? Spotify isn't saying. But the sequence matters.
What strikes me from my cognitive science background is this: expertise isn't just knowledge—it's pattern recognition built through thousands of hours of hands-on problem-solving. When you remove the hands-on part, you don't just lose time spent coding. You lose the iterative feedback loop that creates senior engineers.
The Career Progression Paradox
Let's think through what this means for someone early in their career. You start as a junior engineer. You spend years writing code, debugging systems, learning how different architectural decisions play out at scale. That experience gradually transforms you into someone who can look at a problem and intuitively know what will work.
But now imagine that halfway through that journey, your company introduces an AI system that writes most of the code. You're told your new job is to review AI output. You're being asked to exercise judgment you haven't yet developed on code patterns you haven't yet internalized.
How do you become a senior engineer in that world?
This isn't a hypothetical. It's playing out right now at Spotify and, increasingly, at companies everywhere racing to adopt AI coding tools. According to industry surveys, four in five developers now use AI tools in their workflow. The shift is already here.
The Legal Grey Area Nobody's Talking About
Here's an angle that surprised me in my research: the U.S. Copyright Office maintains that only works created by humans can be copyrighted. An appellate court affirmed in 2025 that human authorship is "a bedrock requirement" for copyright registration.
If Spotify's best engineers genuinely "have not written a single line of code" since December—if the code was generated by Claude and humans only reviewed and approved it—what's the copyright status of that code? This isn't academic. It's 650-plus pull requests a month of code whose authorship an engineer could deny under oath.
No company has tested this in court yet. Someone will. The implications could reshape how AI-generated code is valued, licensed, and protected.
What Söderström Actually Said
Strip away the headlines and look at what Söderström described. He didn't say AI is better than his engineers. He said his engineers are doing a different job now—one where they issue natural language instructions and approve results.
The developer didn't disappear. The developer became a manager of something that writes code the way a junior developer might: quickly, confidently, and with an error rate that demands constant supervision.
That's not a breakthrough in how software is made. That's a change in who does the reviewing.
What This Means for Your Career
If you're reading this and trying to figure out what this means for your next five years, here's what I keep coming back to:
The skills that matter are shifting, not disappearing. Systems thinking, architecture, understanding user needs, debugging complex interactions—these all still require human expertise. But if your current role is becoming "AI supervisor," you need to actively create space to maintain hands-on technical skills.
Mentorship becomes even more critical. If junior engineers can't learn by osmosis from watching senior engineers code, companies need deliberate strategies for skill development. If your company doesn't have one, that's a red flag.
Ask about the learning environment. In interviews, don't just ask about tech stack. Ask: How much time do engineers spend reviewing AI output versus building systems hands-on? How are you ensuring junior engineers develop deep technical skills? What happens when the AI generates something that looks right but isn't?
Watch the churn. Companies optimizing purely for velocity might be accumulating technical debt faster than they realize. When code comprehension drops 17% while output increases, something has to give. Make sure you're not the one left holding the bag when it does.
The Velocity Trap
Spotify can report to investors that development velocity has never been higher while simultaneously facing the reality that they can't train new senior engineers by having them watch Claude. That gap between velocity and expertise development? That's where the risk lives.
The question isn't whether AI will change how we build software—it already has. The question is whether companies will recognize that optimizing for short-term velocity might sacrifice the long-term capacity to build complex, reliable systems.
Because here's what I learned from studying cognitive science: expertise isn't something you can supervise your way into. You have to build it, one problem at a time, with your hands on the keyboard.
And if the path to senior engineer no longer includes that journey? We're not just changing how software gets made. We're changing what "senior engineer" means—and whether that title will still represent the deep technical expertise it once did.