Yep, the real strength of AI is less about replacing engineering skills and more about slashing all the time we spend not using those skills, doing low-level research and data-correlation tasks instead. Which isn't to say those tasks aren't valuable in their own way, but in terms of raw output...
I treat the low-level tasks as building blocks. You need a grasp of what is possible with them, but you don't need to remember the exact byte order and syntax. I think the idea is that you should structure your workflow in a deterministic way and just use Claude/LLMs as the interface. It's much easier and more enjoyable to work at a high level, where you point at building blocks, give direction, and say a hard no when you see things deviating.
If I had to write the code myself, it would take around 8 hours of constant writing to produce around 1k LoC. For tricky FUSE-level stuff, I might spend 3 weeks on 10 LoC. Very easy to burn out and build up pain.
I long for the day when they will supervise CI/CD systems.
Trying to fix syntax errors in string interpolation on a 5-minute-delay CI loop is hell.
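The kind of interpolation typo that burns a whole 5-minute CI round trip can usually be caught locally in milliseconds. A minimal sketch in Python (the hypothetical snippet below has a `)` where a `}` should close the f-string placeholder):

```python
import ast

# Hypothetical broken line: the placeholder is closed with ')' instead of '}'.
snippet = 'print(f"build {version)")'

try:
    # Parsing is enough to surface the syntax error -- no need to run CI.
    ast.parse(snippet)
    print("syntax ok")
except SyntaxError as e:
    print(f"syntax error caught locally: {e.msg}")
```

Wiring a check like this into a pre-commit hook turns the 5-minute feedback loop into a sub-second one for this whole class of errors.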