I'm 47 and excited to live in a time of the most important innovation since the printing press.
A bit younger, and the exact opposite: probably the most excited I've ever been about the state of development!
The deepest thing I've read on HN in months. Respect.
Abstractions can take things away, but many add tremendous value.
For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements; its thermal limits are well known, computed ahead of time, and baked into thermal management, so it doesn't melt but still runs as fast as we understand to be possible for its size. And we make billions of these every day, and have done for over 50 years.
Moving up the stack, you can move your mouse "just so" and click. No need to bit-twiddle the USB port (and we could talk about USB negotiation, or the many other things that happen along the way); your click gets translated into an action, and you can do this hundreds of times a day without disturbing your flow.
Or JavaScript JIT compilation, where the JS engine watches code run and emits faster versions of it that make assumptions about the types of variables, with escape hatches if the code stops behaving predictably, so you don't get confusing bugs that only happen when the browser has JIT-compiled some code. Python has something similar. Thanks to these JIT engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
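A minimal sketch of that speculation-plus-escape-hatch behavior, in TypeScript (the `any` types are deliberate so the runtime engine, not the type checker, is doing the work; the call counts and heuristics are illustrative assumptions, not any real engine's documented behavior):

    // A hot function the engine will speculate on. After enough
    // calls with numbers, a JIT like V8 can emit machine code
    // that assumes x and y are always numbers.
    function add(x: any, y: any): any {
      return x + y;
    }

    for (let i = 0; i < 100_000; i++) {
      add(i, 1); // monomorphic so far: always (number, number)
    }

    // This call breaks the speculation. The engine takes the
    // escape hatch (deoptimizes) and falls back to generic code:
    // slower, but still correct, so you never see a bug that
    // only exists in the JIT-compiled version.
    add("a", "b");

The point is that the speculation is invisible: correctness is preserved either way, and only the speed changes.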
Let's talk about the decades of research that went into autoregressive transformer models, instruction tuning, RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with "User: ", triggering latent capabilities in the model to hold its end of a conversation. Scale that up and call it a "low key research preview" and you have ChatGPT. Wildly simple idea, massive implications.
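Roughly the trick being described, as a toy sketch (the exact role labels and template vary by model; `Turn`, `buildPrompt`, and the "User:"/"Assistant:" prefixes are illustrative, not any particular vendor's format):

    // A toy chat harness: the "conversation" is just text
    // completion over a transcript with role prefixes.
    type Turn = { role: "user" | "assistant"; text: string };

    function buildPrompt(history: Turn[]): string {
      const lines = history.map(
        (t) => (t.role === "user" ? "User: " : "Assistant: ") + t.text
      );
      // Ending with a bare "Assistant:" invites the model to
      // complete the next turn of the dialogue.
      return lines.join("\n") + "\nAssistant:";
    }

    console.log(buildPrompt([{ role: "user", text: "Hello!" }]));
    // => "User: Hello!
    //     Assistant:"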
These abstractions take you further from the machine, and yet they were adopted en masse. You have to account for the ruthless competition out there: each one would've been eliminated if it hadn't proven to be worth something.
You'll never understand the whole machine, so just work at the level you're comfortable with and peer behind the curtain if and when you need to (e.g. when optimizing or debugging).
Or to take a moment to marvel.
Same as assembly programmers felt when C came along, I guess.
As someone who has always enjoyed designing things, but was never really into PUZZLES, I always felt like an outsider in the programming domain. People around me really enjoyed the "fun" of programming, whereas I was more interested in the Engineering of the thing - balancing tradeoffs until within acceptable margins and then actually calling it "DONE". People around me rarely called things "done", they rewrote it and rewrote it so that it kept satisfying their need for puzzle-solving (today, it's Ruby, tomorrow, it's rewritten in Scala, and the day after that, it's Golang or Zig!)
I feel that LLMs have finally put the ball in MY court. I feel sorry for the others, but you can always find puzzles in the toy section of the bookstore.
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
and they still call themselves 'full stack developers' :eyeroll:
'It’s not a “back in my day” piece.'
That's exactly what it is.
> …Not burnout…
Then maybe fade away? ;)

Late 30s here, I have seen:
* dial-up being replaced by DSL
* Cat5/6 copper being replaced with fiber for companies
* VoIP replacing bulky PBX systems
* Cloud replacing on-prem to an extent
* The cloud-services plague, now called SaaS
* License for life being replaced by subscription
* AI literally driving everything to shit
The technology is no longer helping anything; it is actually tearing our society apart. Up to the 2000s, things were indeed evolution and improvement, a better lifestyle both personal and professional. Since the 2000s, enshittification has set in, and everything gets worse, from services to workflows to processes to products to laws.
Gen Z does not realize how bad things are, and how we are no longer becoming smarter but dumber; kids can barely read but have every single social media account.
If they could spend one day back in the early 2000s, the current generation would start a civil war in every single city across the globe.
>"The abstraction tower
Here’s the part that makes me laugh, darkly.
I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.
They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack.
AI is just the layer that made the pretence impossible to maintain."
Absolutely brilliant writing!
Heck -- absolutely brilliant communicating! (Which is really what great writing is all about!)
You definitely get it!
Some other people here on HN do too, yours truly included in that bunch...
Anyway, stellar writing!
Related:
https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...
https://en.wikipedia.org/wiki/Tower_of_Babel
https://en.wikipedia.org/wiki/Abstraction_(computer_science)
https://en.wikipedia.org/wiki/Abstraction
https://ecommons.cornell.edu/entities/publication/3e2850f6-c...
>But sure. AI is the moment they lost track of what’s happening. The abstraction ship sailed decades ago.
Bullshit. While abstraction has increased over time, AI is no mere incremental change. Almost-natural-language interaction with an agent is not the same as TypeScript over assembly (not to mention you could very well write C or Rust and the like and know most of the details of the machine by heart; and no, microcode and low-level abstractions are not a real counter-argument to that). Even less so if agents turn autonomous and you just herd them to completion.
I have been around for a similar amount of time. Another change I have seen over the years is the shift from programming being an exercise in creative excellence at work to being a white-collar ditch-digging job.
I have the opposite take. There's nothing stopping you from jumping into any component to polish things up. You can code whatever you wish. And AI takes away nearly all of the drudgery: boilerplate, test cases, deciphering poor documentation, absurd tooling.
It also lets me focus more on improving things, since I feel more liberated to scrap low-quality components. I'm much braver about taking on large refactors now; things that would have taken days now take minutes.
In many ways AI has made up for my growing lack of patience and inability to stay on task until 3am.
I was happy riding my horse when this dude invented a car.
Programming has changed all along.
New concepts came out all along.
They became standardized and came down-market to smaller and smaller projects.
Source control.
Cloud.
Agile/Scrum.
Code completion IDEs.
Higher-level languages.
These were not LLMs but did represent a shift that had to be kept up with.
LLMs are no different, just a bigger jump.
There is just as much opportunity here.
Software development and software developers are not going away.
Software that could never have been built before will now be built.
For the foreseeable future there will always be software that needs to be overseen by a human.
Humans have a special knack for taking the humanity out of basically anything. It's a bizarre pattern.
It's not like it's changing by itself, you can always opt out of the slop race and scratch your itches instead.
I've written SSE2-optimized C, web apps, and probably everything in between (hw, datasci, etl, devops).
I like coding with AI, both vibe and assisted, since as soon as a question enters my head I can create a prototype or a test or an xyz to verify my thoughts. The whole time I'm writing in my notebook or at the whiteboard or doing whatever else I would have gotten up to. This is enabling tech; the trouble for me is that there is a small thread leading out of the room into the pockets of billion-dollar companies.
It is no longer you vs the machine.
I have spent tons of time debugging weird undocumented hardware with throwaway code, or sat in a debugger doing hex math.
I think one wire that is crossed right now in this world is that computing is more corporate than ever, with what seems like ever-growing platforms and wealth extraction at scale. Don't let them get you down; host your own shit and ignore them. YES IT WILL COST MORE -> YOUR FREEDOM HAS A PRICE.
Another observation is that people who got into the game for pure money are big mad right now. I didn't make money in the '00s, I did at the end of the '10s, and we're back to job desolation. In my groups, the most annoyed are code bootcampers who faked it until they made it and have just managed to survive this cycle with JavaScript.
Cycles come and go, the tech changes, but problem solving is always there.
“... when I was 7. I'm 50 now and the thing I loved has changed”
Welcome to the human condition, my friend. The good news is that a plurality of novels, TV shows, country songs, etc. can provide empathy for and insight into your experience.
The irony of these "My craft is dead" posts is that they consistently, heavily leverage AI for their writing. So you're crying about losing one craft to AI while using AI to kill another. It's disingenuous. And yes it is so damn obvious.
I'm 57 and wrote my first line of BASIC in 1980, so since I can still chime in on behalf of this specific demographic, I feel that I ought to. I'm like this guy, but like a lot of other people in my specific demographic, we aren't writing these long melancholy blog posts about AI, because it's not that big of a deal.

As an OSS maintainer, most of my work is a boring slog: adding features to libraries to suit new features in upstream dependencies, nitpicky things people point out, new docs, tons of tedium. Claude helps a ton with all of that. No way is Claude doing the real architectural puzzle stuff; that's still fully on me! I can just use Claude to help implement it. It's like the ultimate junior programmer assistant. It's certainly a new, different, and unique experience in one's programming career, but it really feels like another tool, like an autocomplete or code refactoring tool that is just a lot better, with similar caveats.

I mean, in my career I've had to battle the whole time with people who don't "get" source control (starting with me), who don't "get" IDEs (starting with me), people who don't "get" distributed version control (same), people who don't "get" ORMs (oh yes, same for me, though this one I took much more dramatic steps to appreciate), people who don't "get" code formatters. Now we're battling people who don't "get" LLMs used for coding. In that sense the whole thing doesn't feel like that novel a situation.
It's the LLMs spitting out fake photos and videos and generating lots of shitty graphics for local businesses; that's where I'm still wielding a pitchfork...
There are 3-4 of these posts a day. Why don't people spend more time hand-building things for fun in their free time? That's what led a lot of us to this career path in the first place. I have a solid mix of hand-coded and AI-assisted projects in my free time.
>>The machines I fell in love with became instruments of surveillance and extraction.
Surveillance and Extraction
"We were promised flying cars", and what we got was "investors" running the industry off the cliff into cheap ways to extract money from people instead of real innovation.
> I started programming when I was seven because a machine did exactly what I told it to
What a poetic ending. So beautiful! And true, in my experience.
This isn't new. It's the same feeling the first commercial programmers had working in assembly, or machine code, once compilers became available. Ultimately I think even Mel Kaye forsook being able to handpick memory locations for optimum drum access before his retirement, in favor of being able to build vastly more complex software than before.
AI has just vastly extended your reach. No sense crying about it. It is literally foolish to lament the evolution of our field into something more.
Programming is dead. In the last 4 days I've done 2 months of work. The future is finally here.
Bad times to be a programmer. Start learning business.
I'm 57. I was there when the ZX81 came out.
I had my first paid programming job when I was 11, writing a database for the guy that we rented our pirate VHS tapes from.
AI is great.
I don't program as a career, but I'm also 50 and have been programming since the TRS-80. AI has transformed this era, and I LOVE IT! I can focus on making things, not APIs or syntax or all of the bootstrapping.
Professional development is changing dramatically. Nothing stops anyone from coding "the old way," though. Your hobby project remains yours, exactly the way you want it. Your professional project, on the other hand, was never about you in the first place. It's always about the customer/audience/user, full stop.
Please stop upvoting these posts. We have gotten to the point where both the front page and the new page are polluted with these laments.
It's literally the same argument over and over, and the same comments over and over and over.
HN will either get back to interesting stuff or simply turn into a support group for aging "coders" who refuse to adapt.
I’m going to start flagging these as spam
same bud.
maybe that just means it's a maturing field and we gotta adapt?
yes, the promise has changed, but you still gotta do it for the love of the game. anything else doesn't work.
I'm 50 too, and I've also complained and yearned for the "old" days. A lot of this is nostalgia: we reminisce about periods in our youth when we had the exuberance and time to play and build with the technology of our own era.
Working in AI startups, strangely enough, I see a lot of the same spirit of play and creativity applied to LLM-based tools. I mean, what is OpenClaw but a fun experiment?
Those kids are going to reminisce about the early days of AI, when prompts were handwritten and LLMs hallucinated.
I'm not really sure 1983, 1993, or 2003 was that golden an age, but we look at it with rose-colored glasses.
Started at 11, now 45. I am still interested in it, but in my 20s I would get a dopamine rush when a row showed up in a database. In my 30s I would only get that if a message passed through a system and updated on-screen analytics within 10 seconds. Thank god for LLMs, because all of it became extremely boring; I can't stand having to chase these little milestones at each new company or each new product I work on. At least with LLMs the dopamine hit comes from being in awe of the generated code: realizing it found every model, every messaging-system interface, every API, figured out how to make it backwards compatible, and updated the UI. Something that would have taken half a day now takes 5 minutes or less.
> I’ve had that experience. And losing it — even acknowledging that it was lost
What are you talking about? You don't know how 99% of the systems in your own body work, yet they don't confront you the same way. As if this "knowledge" were a switch that can be on or off.
> I gave 42 years to this thing, and the thing changed into something I’m not sure I recognise anymore.
Stop doing it for a paycheck. You'll get your brain back.
So tired of this sort of complaint (and I'm 62).
The computing the author enjoyed/enjoys is still out there; they are just looking for it in all the wrong places. Forget about (typical) web development (with its front- and back-end stacks). Forget about Windows and macOS, and probably even mobile (though maybe not).
Hobby projects. C++/Rust/C/Go/some-current-Lisp. Maybe even Zig! Unix/Linux. Some sort of hardware interaction. GPL, so you can share and participate in a world of software created by people a lot more like you and a lot less like Gates and Jobs and Zuckerberg and ...
Sure, corporate programming generally tends to suck, but it always did. You can still easily do what you always loved, but probably not as a job.
At 62, as a native desktop C++ app developer doing realtime audio, my programming is as engrossing, cool, varied and awesome as it has ever been (probably even more so, since the GPL really has won in the world I live in). It hasn't been consumed by next-new-thing-ism, it hasn't been consumed by walled platforms, it hasn't been taken over by massive corporations, and it still very much involves Cool Stuff (TM).
Stop whining and start doing stuff you love.
This is at least partially AI-written, by the way