I just had what you might describe as the opposite experience. I was at a very important all-hands meeting, led by our senior tech leader with about 100 people or so attending, where he mandated an AI goal for every employee in Workday. He basically said that “if we all do not learn to adapt to AI, we will all get left behind,” and he presented how to utilise spec-driven development. He opened the room for Q&A at the end of the meeting. A lot of people had technical questions about the agentic framework itself, but I had a philosophical one. I felt uncomfortable asking him the question in the open, so I sent him a private note.
The note read something like this: I don’t exactly agree with the framing that we will all get left behind if we don’t learn to adapt to AI. More accurately, I see it this way. While the company definitely stands to gain from the huge increase in productivity from these AI tools, I stand to pay a personal price, and that personal price is this: I may very slowly stop exercising my critical thinking muscles because I have grown accustomed to passing everything to AI, and that will render me less employable. It is this personal price that I am reluctant to pay. There has always been a delicate balance between an employer and employee: we learn new technologies on the job, and we become more employable by transferring those skills to other companies. That equation is now unbalanced. The company captures more value, but there is skill erosion on my side. For instance, our team actually has to perform a Cassandra DB migration this year. Usually, I’d pick up a small textbook and read about the internals of CassandraDB, and maybe work through a guide on how to write Cassandra queries. What do I put on my resume now? That I vibe coded a Cassandra migration? How employable is that? I’m not sure if others felt the same way, but I definitely felt like the odd one out for asking that question, because everyone else in the meeting was on board with AI adoption.
The leader did respond to me, saying that learning agentic AI will actually make me more employable. So there is a fundamental disagreement as to what constitutes a skill. I think he just talked past me. Oh well, at least I tried.
You are definitely not alone, and it’s unfortunate when people pushing AI ignore that legitimate fear and talk past it.
You are right that there is something you lose, but for what it’s worth, I don’t think the loss is necessarily critical thinking. I think it’s possible to use AI and still hone your critical thinking skills.
The first thing you lose is touching the code directly, of course: making the constant stream of small decisions about syntax, formatting, naming, choosing container classes, and a large set of other things. And sometimes it’s doing battle with those small decisions that leads to deeper understanding. However, it is true, and AI agents are proving it, that a lot of us make the same small decisions over and over, frequently repeating designs that many other people have already thought through. So one positive tradeoff for this loss is better leverage of ground already covered.
Another way to think about AI is that it can let you spend your time thinking about software design, goals, and outcomes rather than spending the majority of it in the minutiae of writing the code. This is where you can continue to apply critical thinking, just perhaps at a higher level than before. AI can make you lazy, if you let it. It does take some diligence and effort to remain critical, but if you do, I personally think it can be a lot of fun and can help you spend more time thinking critically, not less.
Some possible analogies are calculators and photography. People fretted that we’d lose something if we stopped calculating divisions by hand, and we did, but by and large we still just use calculators. People also thought photography would ruin art and prevent people from being able to make or appreciate images.
Software in general is nearly always automating something that someone was doing by hand, and in a way, every time we write a program we’re making this same tradeoff: losing the close, hands-on connection to the thing we were doing in favor of something a touch more abstract and a lot faster.
Database migrations are hard and tedious and often fail in some aspect. Why would you want to spend time doing them when you could spend that time building the important thing that comes after the migration?
Secondly, AI helps with the happy-path tasks of a migration. But most database migrations are complex beyond what an LLM can just spit out. There is so much context outside the observable parts of the database that the AI has access to. So I don’t think you have to worry about vibe coding eating the entire migration project.
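To make the happy-path point concrete, here is a minimal sketch (mine, not from the thread) of the kind of batched, idempotent backfill an LLM will readily generate. The tables are plain in-memory dicts standing in for real database tables, and the function name and batch size are illustrative:

```python
def backfill(source, target, batch_size=100):
    """Copy rows from source to target in batches, skipping rows
    that were already migrated so re-runs are idempotent."""
    batch = []
    for key, row in source.items():
        if key in target:
            # Row already migrated in a previous (possibly failed) run.
            continue
        batch.append((key, row))
        if len(batch) >= batch_size:
            target.update(batch)  # flush one batch
            batch = []
    if batch:
        target.update(batch)  # flush the final partial batch
    return len(target)

# Example: migrate 250 rows, then re-run safely.
source = {i: {"v": i} for i in range(250)}
target = {}
backfill(source, target)  # copies everything
backfill(source, target)  # second run is a no-op
```

This is exactly the part that is easy. The hard parts the model can’t see from the schema alone are things like dual writes during the cutover window, verifying counts and checksums, and application read paths that still assume the old data model.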
I understand your sentiment. I personally would never use a textbook for anything code related, if there's no proper documentation online then I wouldn't touch it with a ten-foot pole, haha.
However, even though I've never worked with CassandraDB, I feel pretty confident that I could do it with Claude Code. Not just "do it for me", but more like "I have done a lot of database migrations in my time, but haven't worked with CassandraDB in particular. Can you explain to me the complexities of this migration, and come up with a plan for doing it, given the specifics of this project?"
That question alone is already a massive improvement over a few years ago. I don't feel like I was using my "critical thinking muscles" when I tried to figure out how the hell to get hadoop to run on windows, that was just an exercise in frustration as none of the documentation matched the actual experience I was getting. Doing it together with Claude Code would be so much easier, because it'll say something like "Oh yeah this is because you still need to install XYZ, you can do that by running this line here: ...".
Now I'm not saying that Claude Code, and agentic tools in general, aren't taking away some of my critical thinking: they really are. But they also allow me to learn new skills much more quickly. It feels more like pair programming with someone who is a better programmer than me, but a much worse architect. The trick, I think, is to keep challenging yourself to take an active role in the process and not just tell it to "do it".