
ericol · yesterday at 3:33 PM

> The long-term effect is less clear. If we generate more code, faster, does that reduce cost or just increase the surface area we need to maintain, test, secure, and reason about later?

My take is that the focus is mostly oriented towards code, but in my experience everything around code got cheaper too. In my particular case, I do coding, I do DevOps, I do second-level support, I do data analysis. Every single one of those tasks is now seriously augmented by AI.

In my last performance review, my manager was actually surprised when I told him that I am now more a manager of my own work than actually doing the work.

This also means my productivity is now probably around 2.5x what it was a couple of years ago.


Replies

randito · yesterday at 4:51 PM

> In my last performance review, my manager was actually surprised when I told him that I am now more a manager of my own work than actually doing the work.

I think this is very telling. Unless you have a good manager who is paying attention, a lot of them are clueless: they just see the hype about 10x-ing your developers and don't care about the nuance of (as they say) all the surrounding bits of writing code. And unfortunately, they repeat this to the people above them, who have also read the hype and just see the $$ of reducing headcount. (Sorry, venting a little.)

kldg · yesterday at 5:27 PM

This has been my experience, too. Dealing with hardware, I'm particularly pleased with how vision models are shaping up: they can identify what I've photographed, put it in a simple text list, and link me to the appropriate datasheets. Yesterday, one even figured out how I wanted to reverse engineer a remote display board for a just-released inverter and correctly identified which pin of which unfamiliar Chinese chip was spitting out the serial data I was interested in; all I had actually asked for was chip IDs, with a quick, vague note on what I was doing. It doesn't help me solder faster, but it gets me to soldering faster.

A bit OT, but I would love to see some different methods of calculating economic productivity. After looking into how BLS calculates software productivity, I quit giving the number any weight at all, and it left me feeling a bit blue. They apply a deflator based in part on the value of features, which they claim to be able to estimate by comparing feature sets and prices across a select basket of items in a category and applying coefficients based on the differences. That will likely never capture what's going on in AI unless Adobe decides to add a hundred new buttons "because it's so quick and easy to do." Their methodology also requires ignoring FOSS (except for certain corporate own-account cases); if everyone switched from Microsoft 365 to LibreOffice, US productivity as measured by BLS would crash.
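To make the deflator mechanics concrete, here's a toy sketch in Python (invented numbers, and a deliberately simplified stand-in for the actual BLS procedure) of how valuing new features can register as a price decline, and therefore as extra real output, even when the sticker price never moves:

    # Toy quality adjustment (invented numbers, not the BLS formula):
    # the estimated value of new features is netted out of the new
    # price before the two periods are compared.
    old_price = 100.0      # last period's price of a software package
    new_price = 100.0      # this period's price: unchanged on paper
    feature_value = 10.0   # hedonic estimate of the new features' value

    adjusted_new = new_price - feature_value
    price_relative = adjusted_new / old_price

    print(f"measured price change: {(price_relative - 1) * 100:+.1f}%")
    # -> measured price change: -10.0%, though the consumer pays the same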

BLS lays its methodology out in an FAQ page on "Hedonic Quality Adjustment" [1]. It covers hardware rather than software, but software leans even more heavily on these "what does the consumer pay" guesses at value (what is the value of an S-Video input on your TV? Significantly more than support for picture-in-picture, at least in 2020).
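For a sense of where those coefficients come from, here's a minimal hedonic-regression sketch in Python (all prices and feature choices invented for illustration): regress prices on feature dummies across a basket of models, and read the fitted coefficients as the implicit prices of the features.

    # Minimal hedonic regression sketch (invented data): regress price
    # on feature dummies across a basket of TV models; the fitted
    # coefficients are the implicit prices of the features.
    import numpy as np

    # Columns: [has_s_video, has_picture_in_picture]; one row per model.
    features = np.array([
        [1, 0],
        [0, 1],
        [1, 1],
        [0, 0],
        [1, 1],
    ], dtype=float)
    prices = np.array([520.0, 430.0, 560.0, 400.0, 555.0])

    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(len(prices)), features])
    coef, *_ = np.linalg.lstsq(X, prices, rcond=None)

    base, s_video, pip = coef
    print(f"base = ${base:.0f}, S-Video = ${s_video:.0f}, PiP = ${pip:.0f}")
    # -> roughly: base $398, S-Video $124, PiP $34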

[1] https://www.bls.gov/cpi/quality-adjustment/questions-and-ans...