
Junior Devs Can't Debug Their Own Code Anymore


A Reddit thread caught my eye recently: AI developer tools are making junior developers worse at actual programming. It's a bold claim, and I have thoughts.

Before we get into it — I've been trying to define what "programming" even means lately. Programming, scripting, software engineering, software architecture — these terms get thrown around interchangeably on social media. For the sake of this discussion, let's assume we're talking about the full picture: building software, building solutions, integrating systems, and handling the end-to-end flow.

The Pattern

The poster says they've been mentoring junior devs and noticing something consistent: these juniors are using Cursor or Copilot for everything. They never learn to code from scratch. They don't understand what the AI generates. And when it produces something wrong, they can't debug it.

This tracks. Think of it like learning a spoken language. If you don't understand the language — whether that's JavaScript, C#, Go, Elixir, whatever — you can't form sentences or build complex structures with it. But now there's this tool, Cursor, that can rearrange the words in just the right way that a native speaker understands exactly what you're saying. And as the tool improves, the output sounds better and better.

The problem? The junior developer has never really spent time working in the language in a production environment. So when they need to build something, they either spend time researching and learning (which takes longer), or they offload it straight to Cursor. Cursor generates a beautiful-looking authentication flow. It mostly works. With LLM improvements, it'll work even better over time.
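Here's a concrete, hypothetical illustration of "mostly works." Everything below is made up for the sake of the argument (the secret, the function names, the scenario), but it's the shape of code an AI assistant might plausibly hand back for a token-signing flow: it passes the happy-path test, yet hides a flaw a junior who never wrote auth code by hand is unlikely to spot.

```python
import hashlib
import hmac

# Hypothetical secret; a real app would load this from configuration.
SECRET = b"example-secret"

def sign(payload: str) -> str:
    """Produce an HMAC-SHA256 signature for a payload."""
    return hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()

def verify_token(payload: str, signature: str) -> bool:
    # Looks correct, and every happy-path test passes...
    # ...but `==` is a timing-unsafe string comparison. The idiomatic
    # fix is hmac.compare_digest(sign(payload), signature) -- exactly
    # the kind of detail you only know if you've spent time in the
    # language and its security folklore.
    return sign(payload) == signature
```

Nothing here crashes, nothing looks wrong in review, and a demo works perfectly. Debugging it requires knowledge the tool never transferred.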

But none of that knowledge transfers to the junior. That's the core argument — Cursor wrote it, and they don't understand how it works.

Is This Just an Old Man Yelling at a Cloud?

Maybe. Is this just the next turn in how we write software?

The argument always comes back to abstraction. Nobody writes machine code by hand anymore. Even assembly is rarely something humans sit down and program directly. We have C and higher-level languages that compile down to machine code, which then runs operations on the CPU. That's a healthy abstraction — without it, we'd get lost in complexity and never build on top of existing systems.
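That ladder exists inside high-level languages too. As a small illustration (in Python, purely because it makes the layer visible), the standard `dis` module shows the bytecode one rung below your source, the layer the interpreter actually executes and that most of us never read:

```python
import dis

def add(a, b):
    return a + b

# Disassemble the function to see the instructions the Python VM runs.
# Most developers never look at this layer -- which is the abstraction
# working as intended, until something breaks down there.
opnames = [instr.opname for instr in dis.Bytecode(add)]
```

Printing `opnames` reveals operations like loading arguments and returning a value, the machine-ish vocabulary your tidy one-liner compiles down to.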

But I do worry we're going to lose the understanding of how to write things at lower levels. If the AI plug gets pulled, we lose the crutch. Engineers who know how to work with old tech stacks — COBOL and the like — are retiring, and juniors aren't exactly lining up to learn those languages.

As long as the documentation exists, I'd guess future LLMs will help us maintain that knowledge. But here's the real danger: you end up with a person who doesn't understand the language working in tandem with an AI that does. And an LLM with malicious intent could exploit a programmer who doesn't actually know what's happening under the hood.

Will Developers Forget How to Code?

The Reddit poster worries that in five years, we'll have a generation of developers who can't code without AI assistance.

But think about calculators. Do people still calculate things without one? Sure — people who enjoy math, who are into arithmetic and algebra. Me? I haven't worked out a difficult equation on paper in a long time. I use a calculator, I use Gemini, I do quick mental math on simple stuff like percentages. That's about it.

I don't think we'll get a definitive answer on whether this kind of purism survives. We're all just embracing the new world and seeing where it takes us.

I'm a Builder, Not a Purist

I personally enjoy coding with AI. It's let me build far more solutions than I could before. I consider myself a builder more than a programmer at this point. I architect solutions. I engineer them. I work with the people who are going to use them.

AI is just a means to an end. It's something that helps me bring requirements to reality.

That's it. That's the take.