AI As False Mastery
Claude wrote me Go tests that passed. They were beautiful, and they were worthless. They all collapsed to true == true.
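To make the failure mode concrete, here is a minimal sketch of what such a test collapses to. The function names (createUser, looksLikeATest) are hypothetical, invented for illustration; the tell is that the "expected" value is derived from the actual result, so the comparison can never fail no matter what the code does.

```go
package main

import "fmt"

// createUser is a hypothetical stand-in for the code under test.
func createUser(name string) string { return "user:" + name }

// looksLikeATest mirrors the shape of a generated test: it calls the
// code, builds an expectation, and compares. But the expectation is
// copied from the output itself, so the check is true == true.
func looksLikeATest() bool {
	got := createUser("alice")
	want := got // the tell: "expected" derived from the actual result
	return got == want // always true; asserts nothing about createUser
}

func main() {
	fmt.Println(looksLikeATest()) // prints true for any implementation
}
```

Swap createUser's body for anything at all and this "test" still passes, which is exactly why a green check mark here means nothing.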
That's the trap. AI gives us the illusion of mastery without the work. The shape of the code looks right, making it easy to skim the details.
I'm not being careless choosing to use AI. None of us are. Claude is a gift: it can write Go code with tests, comments, and SQL queries, and sketch the shape of a real feature. It can produce a Svelte frontend with components and fetch calls, and it even knows the backend API signatures. It feels like you're being handed quality work, like boilerplate has finally been solved.
Yet when I sat down to debug the Svelte code Claude had generated, it took forty minutes of rereading code and tutorials to see that Claude had casually mixed Svelte 4 and Svelte 5 syntax. I had seen the code, I had nodded along, skimming, until I forced myself to stop, trace through it, and uncover why it didn't behave as expected.
The AI had moved me forward, but it hadn't saved me the real work. I had thought I was mastering frontend development, quickly. But mastery still required the same things it always has: building the model, holding it in my head, doing the thinking.
False mastery is mistaking convincing syntax for real understanding.
The False Mastery Trap
There's a cultural conversation Software People are having about AI. We're thoughtful professionals, and we try hard to do good work. That's exactly why this false mastery is dangerous: AI makes us feel like we can finally relax and still get good results.
The problem isn't that the tool is bad. It's like fitness: skip the gym for a day and it's not too hard to get back on track, but skip a few weeks and restarting the habit feels...not harder, but less essential. You got this far without the gym; what harm's another day? The gym is still a good tool, still the right tool, but you're less focused.
The problem is that we recognise the shape of the implementation AI generates, so we think it must be the thing we want. When entire teams relax into false mastery, codebases turn into Rorschach blots: familiar shapes, no underlying model. It's organizational decay.
The interplay scares me, for myself, my teammates, and for you. AI starts as a relief that lifts the work from us. Within a couple of days, it's clear that AI does not carry the cognitive burden: we see the code start circling the feature without ever quite getting there, and it's now our responsibility to understand it well enough to carry it over the finish line. But I've put that burden of understanding down, and it feels so damn heavy to pick it back up.
This Work Requires Effort
This is hard work we've been doing for years: reading code carefully, building models in our heads, debugging when things don't make sense. That's our craft.
Mastery has always been the ability to carry the burden. Put it down for too long, and you won't want to pick it back up.