Hacker News

tosser12344321 · yesterday at 7:13 PM

I've thought about that as well - what derails this, what invalidates the unstoppable forward march? That is often how the world works. City real estate costs flew up year after year while other cities rust-belted, until Covid and remote work changed the picture, for example.

So, what can derail AI out of left field? Maybe building DCs for it in Arizona and EMEA, for one... choosing those famously "water-rich" locations for water-cooled systems.

So, how could this land long-term, assuming AI works sort of well, sort of badly against the use cases? The real questions here for industry people, though, are these:

1) How does this play out over the 5-10 yrs we have to see it occur - trying it, redoing it, trying a new version, going back to the old version - all while it unfolds over my career, and all while I have bills to pay and relationships to maintain?

Ans: I think that's a hell of a lot of financial and employment stress imposed on us by people who don't understand the tech they're rolling out or the state change that's occurring, and who don't have to deal with the consequences. All the while, I go from mid-career to late career, dealing with what AI can actually do in the background.

2) What is actually going to work wrt being relevant to my job?

Ans: I think what actually works is the vuln research side of AI, with feedback loops rapidly speeding it up.

And what is the most stressful, obnoxious, high-burnout part of the job? Sec arch and vuln remediation, or IR and vuln response. Both are about to go into overdrive, and already are if you're minding bug bounties and IR these days.

3) Has this happened to other industries, how did it go?

Ans: trading, trading, trading, trading. Check it out.


Replies

01100011 · yesterday at 9:39 PM

I don't know what derails it, I just know that the line on the chart, going up or down, rarely goes straight. AI might finally be the thing that produces permanent exponential growth instead of a sigmoid, or maybe it hits some limits. Maybe those limits are on the human side (our ability to use it, regulation, social backlash, etc.). Maybe management tries to cut out the tech folks, only to end up with a tangled mess of crap that only we can help them untangle? Maybe the folks with background knowledge will suddenly be needed en masse to control and leverage AI?
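The exponential-vs-sigmoid point is worth spelling out, because early on the two curves are indistinguishable: a logistic (sigmoid) curve matches an exponential almost exactly until the limits start to bite. A toy sketch (growth rate and carrying capacity are made-up illustrative numbers, not a claim about AI):

```python
import math

def exponential(t, r=1.0):
    # Pure exponential growth: no limits, ever.
    return math.exp(r * t)

def logistic(t, r=1.0, K=1000.0):
    # Logistic (sigmoid) growth: same early rate, but capped at K.
    return K / (1.0 + (K - 1.0) * math.exp(-r * t))

# Early on, the two curves are nearly identical...
for t in range(4):
    print(t, round(exponential(t), 2), round(logistic(t), 2))

# ...but once the limits bite, they diverge by orders of magnitude.
print(20, round(exponential(20)), round(logistic(20)))
```

Which is the whole problem with reading the chart today: the data we have so far can't tell you which curve you're on.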

We are, for example, about to extend the reach of tech even further thanks to AI. A large percentage of future warfare will be taken over by tech. If humanoid robots get gud, there's a whole 'nother world of applications that will probably need people to specify, test, improve, etc.

Sure, on the one hand I think the value of writing code will probably go to zero in ten years (although some applications, like critical infra or space stuff, explicitly forbid AI coding), but writing code is a small part of many SWEs' jobs. AI currently still needs to be told what to build and how to make a cohesive, sensible product. Maybe that changes, maybe it doesn't. But the path to eliminating human work is not short or clear-cut.