Hacker News

I started programming when I was 7. I'm 50 now and the thing I loved has changed

569 points | by jamesrandall | yesterday at 3:08 PM | 478 comments

Comments

adamtaylor_13 | yesterday at 10:30 PM

This essay begins by promising not to be a "back in my day" piece, but ends up dunking on 20-year-olds who are only a few years into their career, as if they have any choice about when they were born.

kwar13 | yesterday at 6:02 PM

> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

yup.

ilitirit | yesterday at 3:52 PM

I'm roughly the same (started at 9, currently 48), but programming hasn't really changed for me. What's changed is me having to have pointless arguments with people who obviously have no clue what they're talking about but feel qualified either because:

a) They asked an LLM

b) "This is what all our competitors are doing"

c) They saw a video on Youtube by some big influencer

d) [...insert any other absurd reason...]

True story:

In one of our recent Enterprise Architecture meetings, I was lamenting the lack of a plan to deal with our massive tech debt, and used an example of a 5000 line regulatory reporting stored procedure written 10 years ago that noone understood. I was told my complaint was irrelevant because I could just dump it into ChatGPT and it would explain it to me. These are words uttered by a so-called Senior Developer, in an Enterprise Architecture meeting.

dwoldrich | yesterday at 4:17 PM

I am in a very similar boat, age and experience-wise. I would like to work backward from the observation that there are no resource constraints and we're collectively hopelessly lost up the abstraction Jenga tower.

I observe that the way we taught math was not oriented around the idea that everyone would need to know trigonometric functions or how to do derivatives. I like to believe math curricula were centered around standardizing a system of thinking about maths, so that those of us who were serious about our educational development would all speak the same language. It was about learning a language and laying down processes that everyone else could understand. And that shaped us, and it's foolish to challenge or complain about that or, God forbid, radically change the way we teach math subjects, because it damages our ability to think alike. (I know the above is probably completely idealistic, verging on personal myth, but that's how I choose to look at it.)

In my opinion, we never approached software engineering the same way. We were so focused on the compiler and the type calculus that we never taught people what makes code valuable and robust. If I had FU money to burn today, I'd start a Mathnasium-style company focused on making kids into systems integrators with great soft skills and the ability to produce high-quality software. I would pitch this business under the assumption that the Jenga tower is going to be collapsing pretty much continuously for the next 25-50 years, and civilization needs absolute-unit super developers coming out of nowhere who will be able to make a small fortune helping companies dig their way out of 75 years of tech debt.

JohnMakin | yesterday at 4:57 PM

I'm ~40ish, mid-career, and not in management. I envy this author; whatever joy he found in solving little puzzles and systems was extinguished in me very early in my career, in an intense corporate environment. I was never one to love fussing much with code, but I do love solving system-scale problems, which also involve code. I don't feel I am losing anything; the most annoying parts of the code I deal with are now abstracted into human language and specs, and I can now architect/build more creatively than before. So I am happy. But I was one of those types that never had a true passion for "code," and I have met plenty of people that do have that, and I feel for them. I worry for people that carved out being really good at programming as a niche, because you reach a point in your career where that becomes much less important than being able to execute, define requirements, and understand business logic. And yea, that isn't very romantic or magical, but I find passion outside of what pays my bills, so I lost that ennui feeling a while ago.

vln | yesterday at 11:03 PM

The thing I loved has changed.

And I fell in love with it again. I'm learning how to work in this new world and it's fun as hell.

bentt | yesterday at 9:02 PM

Some farmers probably lamented the rise of machines because they feared their strength would no longer be needed in the fields. These farmers were no doubt more concerned with their own usefulness as laborers than in the goals of the farm: to produce food.

If you program as labor, consider what you might build with no boss. You’re better equipped to start your own farm than you think.

marginalia_nu | yesterday at 4:30 PM

You can still have fun programming. Just sit down and write some code. Ain't nobody holding a gun to your head forcing you to use AI in your projects.

And the part of programming that wasn't your projects, whether back in the days of TPS reports and test coverage meetings, or in the age of generative AI, that bit was always kinda soul draining.

kraig911 | yesterday at 5:01 PM

"Over four decades I’ve been through more technology transitions than I can count. New languages, new platforms, new paradigms. CLI to GUI. Desktop to web. Web to mobile. Monoliths to microservices. Tapes, floppy discs, hard drives, SSDs. JavaScript frameworks arriving and dying like mayflies."... made me think of

I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.

Where we came from and where we're going: this whole time in my career, those things have been kind of hard to pinpoint. Abstraction is killing us for sure. Time to market above all else. It's no wonder software in cars, appliances, and medical equipment is a factor that is killing people.

qsi | yesterday at 4:23 PM

Well-written and it expresses a mood, a feeling, a sense of both loss and awe. I was there too in the 8-bit era, fully understanding every byte of RAM and ROM.

The sense of nostalgia that can too easily turn into a lament is powerful and real. But for me this all came well before AI became all-consuming... It's just the latest manifestation of the process. I knew I didn't really understand computers anymore, not in the way I used to. I still love coding and building, but it's no longer central to my job or life. It's useful, I enjoy it, but at the same time I also marvel at the future I find myself living in. I've done things with AI that I wouldn't have dared to start for lack of time. It's amazing and transformative and I love that too.

But I will always miss the Olden Days. I think more than anything it's the nostalgia for the 8-bit era that made me enjoy Stranger Things so much. :)

yayitswei | yesterday at 4:35 PM

Is there some magic lost also when using AI to write your blog post?

Decabytes | yesterday at 5:24 PM

I too have felt these feelings (though I'm much younger than the author). I think as I've grown older I have to remind myself

1. I shouldn't be so tied to what other people think of me (craftsman, programmer, low level developer)

2. I shouldn't measure my satisfaction by comparing my work to others'. Quality still matters, especially in shared systems, but my responsibility is to the standards I choose to hold, not to whether others meet them. Plus there are still communities of people that care about this (Handmade Network, OpenBSD devs, languages like Odin) that I can be part of if I want to

3. If my values are not being met either in my work or personal life I need to take ownership of that myself. The magic is still there, I just have to go looking for it

DanielBMarkham | yesterday at 6:03 PM

This is quite the lament. Very well written.

I'm about ten years ahead of the author. I felt this a long time before AI arrived. I went from solving problems for people to having everything I tried end up in an endless grind of yak-shaving.

I worked my way through it, though. It made me both give up programming, at least in the commercial sense, and appreciate the journey he and I have gone through. It's truly an amazing time to be alive.

Now, however, I'm feeling sucked back into the vortex. I'm excited about solving problems in a way I haven't been in a long time. I was just telling somebody that I spent 4-6 hours last night watching Claude code. I watched TV. I scratched my butt. I played HexaCrush. All the time it was just chugging along, solving a problem in code that I have wanted to solve for a decade or more. I told him that it wasn't watching the code go by. That would be too easy to do. It was paying attention to what Claude was doing and _feeling that pain_. OMG, I would see it hit a wall, I would recognize the wall, and then it'd just keep chugging along until it fixed it. It was the kind of thing that didn't have a damned thing to do with the problem but would have held me up for hours. Instead, I watched Pitt with my wife. Every now and then I'd see a prompt pop up, and I'd guide/direct/orchestrate/consult/? with Claude.

It ain't coding. But, frankly, coding ain't coding. It hasn't been in a long, long time.

If a lot of your job seems like senseless bullshit, I'm sad to say you're on the way out. If it doesn't, stick around.

I view AI as an extinction level threat. That hasn't changed, mainly because of how humans are using it. It has nothing to do with the tech. But I'm a bit perplexed now as to what to do with my new-found superpowers. I feel like that kid on the first Spiderman movie. The world is amazing. I've got half-a-dozen projects I'm doing right now. I'm publishing my own daily newspaper, just for me to read, and dang if it's not pretty good! No matter how this plays out, it is truly an amazing time to be alive, and old codgers like us have had a hella ride.

Zaskoda | yesterday at 5:24 PM

I found that feeling again while building a game on the EVM. All of the constraints were new and different. Solidity feels somewhere between a high- and low-level language: not as abstracted as most popular languages today, but a solid step above writing assembly.

A lot of people started building projects like mine when the EVM was newer. Some managed to get a little bit of popularity, like Dark Forest. But most were never noticed. The crypto scene has distracted everyone from the work of tinkerers and artists who just wanted to play with a new paradigm. The whole thing became increasingly toxic.

It was like one last breath of fresh, cool air before the pollution of AI tools arrived on the scene. It's a bittersweet feeling.

sebringj | yesterday at 9:59 PM

idk, i'm loving the newness of all of it, I feel more empowered than ever before, like it's my time. Before, startups would take like a year to get going; now it's like a month or so. It's exciting and scary, we have no idea where it's going. Not boring at all. I was getting bored as shit and bam, now i can dream up shit quick and have it validated too. ya, i figured that out with an MCP, so ya, this is my jam. Program MCPs and speed it up!!!!!!

jppope | yesterday at 4:26 PM

Fantastic Article, well written, thoughtful. Here are a couple of my favorite quotes:

  * "Then it professionalised. Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible."

  * "The machines I fell in love with became instruments of surveillance and extraction. The platforms that promised to connect us were really built to monetise us. The tinkerer spirit didn’t die of natural causes — it was bought out and put to work optimising ad clicks."

  * "Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this."

  * "They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of... But sure. AI is the moment they lost track of what’s happening."

  * "Typing was never the hard part."

  * "I don’t have a neat conclusion. I’m not going to tell you that experienced developers just need to “push themselves up the stack” or “embrace the tools” or “focus on what AI can’t do.” All of that is probably right, and none of it addresses the feeling."

To relate to the author: with a lot of what's going on I feel the same as they do, but about other parts I feel differently. There appears to be a shallowness to all this... yes, we can build faster than ever, but for so much of what we are building we should really be asking ourselves why we have to build it at all. It's like sitting through the meeting that could have been an email, or using hand tools for 3 hours because the power tool purchase/rental is just obscenely expensive for the ~20 minutes you need it.

ookblah | yesterday at 5:03 PM

maybe we just change, honestly. i think when i was younger there was nothing to lose: time felt unlimited, no "career" to gamble with, no billion dollar idea, just learning and tinkering and playing with whatever was out there because it was cool and interesting to me. in some respects i miss that.

not sure how that relates to llms, but they do become an unblocker to regain some of that "magic". also i know a deep dive requires an investment i cannot shortcut.

the new generation of devs are already playing with things few dinosaurs will get to experience fully, having sunk decades into the systems they built and being afraid to let them go. some of that is good (to lean on experience) and some of it is holding us back.

paulmooreparks | yesterday at 4:03 PM

I'm 55 and I started at age 13 on a TI-99/4A, then progressed through Commodore 64, Amiga 2000, an Amiga XT Sidecar, then a real XT, and on and on. DOS, Windows, Unix, the first Linux. I ran a tiny BBS and felt so excited when I heard the modem singing from someone dialing in. The first time I "logged into the Internet" was to a Linux prompt. Gopher was still a bigger thing than the nascent World-Wide Web.

The author is right. The magic has faded. It's sad. I'm still excited about what's possible, but it'll never create that same sense of awe, that knowledge that you can own the entire system from the power coming from the wall to the pixels on your screen.

ge96 | yesterday at 3:47 PM

Yeah, I could use Cursor or whatever, but I don't; I like writing code. I guess that makes me a luddite or something, although I still develop agents. I enjoy architecting things (I don't consider myself an architect); I'm talking about my hobby hardware projects.

towndrunk | yesterday at 3:19 PM

I know exactly how you feel. I don't know how many hours I sat in front of this debugger (https://www.jasik.com) poking around and trying to learn everything at a lower level. Now it's so different.

My_Name | yesterday at 5:06 PM

The irony is that you could still code the way you always did, where you control every pixel. Nothing is stopping you.

But you would not be able to make anything anywhere near as complex as you can with modern tools.

karolist | yesterday at 5:44 PM

Was this text run through an LLM before posting? I recognize that writing style, honestly. Or did we simply speak to machines enough that we now speak like machines?

xg15 | yesterday at 8:04 PM

> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.

But sure. AI is the moment they lost track of what’s happening.

I feel this is conflating different things. Yes, the abstraction tower was massive before, but at least the abstractions were mostly well-defined and understandable through interfaces: even if you don't understand the intricacies of your storage device, driver, and kernel, you can usually form a quite reliable and predictable mental representation of how files work. The same goes for network protocols, higher-level programming languages, or the web platform.

Sure, there are edge cases where the abstraction breaks down and you have to get into the lower levels, but those situations are the exception, not the norm.

With AI, there is no clearly defined interface, and no one really knows what output a given (precise) input will produce. Or maybe to put it better, the interface is human language, and your mental representation is the one you have when talking to a human - which is far more vague than previous technical abstractions.

On the bright side, at least we (still) have the intermediate layer of generated code to reason about, which offsets the unpredictability a bit.

harel | yesterday at 7:49 PM

I've had the same journey, same age markers. The sentiment is the same, but at the same time this new world affords me super powers I'm currently drunk on. When that drunkenness becomes a hangover I hope I won't be disappointed.

dcanelhas | yesterday at 4:27 PM

> I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine

I think there may be a counterpoint hiding in plain sight here: back in 1983 the washing machine didn't have a chip in it. Now there are more low-level embedded CPUs and microcontrollers to develop for than ever before. But maybe it's all the same now: unfathomable levels of abstraction, uniformly applied by language models?

stronglikedan | yesterday at 3:44 PM

Same, but it changed when I was 17 and again when I was 27 and then 37 and so on. It has always been changing dramatically, but this latest leap is just so incredibly different that it seems unique.

KingOfCoders | yesterday at 3:58 PM

Cool, at 7? I started at 9 and I'm 53 now. And Claude does all the things. Need to get adjusted to that though. Still not there.

Last year I found out that I always was a creator, not a coder.

CrzyLngPwd | yesterday at 5:07 PM

Started coding when I was 14, sold my first bit of code at 17, which was written in 6502 assembler.

40+ years later, I've been through many BASICs, C, C++ (CFront onwards) and now NodeJS, and I still love writing code.

Tinkering with RPi, getting used to having a coding assistant, looking forward to having some time to work on other fun projects and getting back into C++ sooooon.

What's not to love?

sebnukem2 | yesterday at 8:17 PM

Oh boy this hits home.

At this point I've entered survival mode, and I'm curious to see where we will be 6 months, 2 years from now. I am pessimistic.

I want to tinker with my beloved Z80 again.

hmaxwell | yesterday at 9:17 PM

You can still write code yourself. Just like you can still walk to work, you do not need to use a car.

metalrain | yesterday at 5:54 PM

I think it's the loss of control.

Even if you can achieve awesome things with LLMs, you give up control over the tiny details; it's just faster to generate and regenerate until it fits the spec.

But you never quite know how long it takes or how much you have to shave that square peg.

sowbug | yesterday at 5:39 PM

Did hardware engineers back in the 1970s-80s* think that software took the joy out of their craft? What do those engineers now think in retrospect?

*I'm picking that era because it seems to be when most electronic machines' business logic moved from hardware to software.

aldousd666 | yesterday at 5:48 PM

I'm 46 but same. I'm not quite as melancholy about it, but I do feel a lot of this.

rraghur | yesterday at 4:26 PM

Are you me?

I'm 49.... Started at 12... In the same boat

My first 286 machine had a loose CMOS battery, so I had to figure that out to make it boot into MS-DOS.

This time it does feel different, and while I'm using AI more than ever, it feels soulless and empty even when I 'ship' something.

pfdietz | yesterday at 8:50 PM

I retired a few years ago and it's very clear that was a good thing.

kypro | yesterday at 10:48 PM

> I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic

I'm significantly younger than OP, but this was it for me too. I'm autistic and found the world around me confusing growing up. Computers were wonderful because they were the only thing that really made sense to me.

I was obsessed with computers since I was 5. I started programming probably around age 10. Then in my early teens I started creating Flash applications, writing PHP, Java, etc...

When I look back on my early career now, it was almost magical. This was in the mid-to-late 00s (late to some, I know), but it was before the era of package managers, before resources like Stack Overflow, before modern IDEs. You had some fairly basic frameworks to work with, but that was really about it. Everything else had to be done fully by hand.

This was also before agile was really a thing too. The places I worked at the time didn't have stand-ups or retrospectives. There were no product managers.

It was also before the iPhone and the mass adoption of the internet.

Back then no one went into software engineering as a profession. It was just some thing weird computer kids did, and sometimes businesses would pay us to build them things. I got along great with everyone who coded back then; now everyone is so normal it's hard for me to relate to them. The industry today is also so money-focused.

The thing that bothers me the most, though, is that computers increasingly act like humans that I need to talk to to get things done, and if that wasn't bad enough, I also have to talk with people constantly.

Even the stuff I build sucks. All the useful stuff has been built, so in the last decade or so the stuff I've built feels increasingly detached from reality. When I started I felt like I was solving real practical problems for companies; now I'm building chatbots and internal dashboards. It's all bollocks.

There was a post recently about builders vs coders (I can't remember it exactly). But I'm definitely a coder. I miss coding. There was something rewarding about pouring hours into an HTML design, getting things pixel-perfect. Sometimes it felt laborious, but that was part of the craft. Claude Code does a great job, and it does it 50x faster than I could, but it doesn't give me the same satisfaction.

I do hope this is my last job in tech. Unfortunately I'm not old enough to retire, but I think I need to find something better suited to my programmatic way of thinking. I quite like the idea of doing construction or some other manual labour job. It seems like they're still building things by hand and don't have so many stupid meetings all the time.

cs02rm0 | yesterday at 4:08 PM

I'm 43. Took a year or so off from contracting after being flat out for years without taking any breaks, just poked around with some personal projects, did some stuff for my wife's company, petitioned the NHS to fix some stuff. Used Claude Code for much of it. Travelled a bit too.

I feel like I turned around and there seem to be no jobs now (500+ applications deep is a lot when you've always been given the first role you'd applied to) unless you have 2+ years commercial AI experience, which I don't, or perhaps want to sit in a SOC, which I don't. It's like a whole industry just disappeared while I had my back turned.

I looked at Java in Google Trends the other day; it doesn't feel like it was that long ago that people were bemoaning how abstracted it was, but it was everywhere. It doesn't seem to be anymore. I've tried telling myself that maybe it's because people are using LLMs to code, so it's not being searched for, but I think the game's probably up; we're in a different era now.

Not sure what I'm going to do for the next 20 years. I'm looking at getting a motorbike licence just to keep busy, but that won't pay the bills.

28304283409234 | yesterday at 7:12 PM

> Cheaper. Faster. But hollowed out.

Given the bazillions poured into it I have yet to see this proven to be cheaper.

fabiensanglard | yesterday at 4:53 PM

> the VGA Mode X tricks in Doom

Doom does not use Mode X :P! It uses Mode Y.

That being said, as a 47-year-old having given 40 years to this thing as well, I can relate to the feeling.

onion2k | yesterday at 3:51 PM

It'd be stranger if the thing you learned 43 years ago were exactly the same today. We should expect change. When that change is positive, we call it progress.

dave_sid | yesterday at 6:14 PM

Great post. Good to see someone posting something positive for a change about the shift in development.

kwar13 | yesterday at 5:22 PM

I am younger than the author but damn this somehow hit me hard. I do remember growing up as a kid with a 486...

suprstarrd | yesterday at 8:35 PM

This is at least partially AI-written, by the way

TimPC | yesterday at 4:05 PM

I think more than ever programmers need jobs where performance matters and the naive way the AI does things doesn't cut it. When no one cares about anything other than correctness, your job turns into AI slop. The good news right now is that AI tends to produce code that AI itself struggles to work well with, so large-scale projects often descend into crap. You can write a C compiler for $20,000 with an explosive stack of agents, but that C compiler isn't anywhere close to efficient or performant.

As model costs come down, that $20,000 will become a viable number for entirely AI-generated coding. So more than ever you don't want to be doing work that the AI is good enough at. Either find jobs where performance matters, or be the one who can code the stack of agents needed to produce high-quality code in an application context.

enricotr | yesterday at 8:09 PM

The deepest thing I read from HN in months. Respect.

elzbardico | yesterday at 5:15 PM

I don't know what these people from our now-traditional daily lamentation session are coding, where Claude can do all the work for them with just a few prompts and minimal reviews.

Claude is a godsend to me, but fuck, it is sometimes dumb as a door, loves to create regressions, and is a fucking terrible designer. Small, tiny changes? Those are actually the worst: it is easy for claude, at the first setback, to decide to burn the whole world down and start from zero again. Not to mention when it gets stuck in an eternal loop where it increasingly degenerates the code.

If I care about what I deliver, I have to actively participate in coding.

franze | yesterday at 5:05 PM

I'm 47 and excited to live in the time of the most important innovation since the printing press.

josefrichter | yesterday at 4:19 PM

A bit younger, and exact opposite. Probably the most excited I've ever been about the state of development!

cadamsdotcom | yesterday at 5:19 PM

Abstractions can take away but many add tremendous value.

For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements; its thermal limits are well known and computed ahead of time, and those limits are baked into thermal management so it doesn't melt but still goes as fast as we understand to be possible for its size. And we make billions of these every day and have done so for over 50 years.

Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.

Or JavaScript JIT compilation, where the JS engine watches code run and emits faster versions of it that make assumptions about the types of variables - with escape hatches if the code stops behaving predictably, so you don't get confusing bugs that only happen when the browser has JITted some code. Python has something similar. Thanks to these JIT engines you can write ergonomic code that in the typical scenario is fast enough for your users, and it gets faster with each new language release, with no code changes.
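The type-specialization idea above can be sketched in a few lines. This is an illustrative example, not anything from the original comment: the function name, loop count, and values are mine. An engine like V8 observes that `add` is only ever called with numbers during the hot loop and can compile a specialized fast path for it; the later string call breaks that assumption and triggers the fallback ("escape hatch") to the generic path. None of this is observable from JavaScript itself; the program behaves identically either way, which is the point of the abstraction.

```javascript
// A function a JIT can type-specialize: while it only ever sees
// numbers, the engine may replace its generic bytecode with
// machine code that assumes numeric addition.
function add(a, b) {
  return a + b;
}

// Hot loop: enough calls with consistent types to make the
// function a candidate for optimization in a typical engine.
let total = 0;
for (let i = 0; i < 100000; i++) {
  total = add(total, 1);
}

// Type change: strings violate the "numbers only" assumption, so an
// optimizing engine falls back (deoptimizes) to the generic path.
// The result is still correct - only the speed characteristics change.
const greeting = add("hello, ", "world");

console.log(total);    // 100000
console.log(greeting); // "hello, world"
```

Whether and when the specialization happens is an engine implementation detail; the sketch only shows the kind of call pattern those heuristics key on.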

Let's talk about the decades of research that went into autoregressive transformer models, instruction tuning, RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with "User: ", triggering latent capabilities in the model to hold its end of a conversation. Scale that up, call it a "low key research preview", and you have ChatGPT. Wildly simple idea, massive implications.

These abstractions take you further from the machine and yet despite that they were adopted en masse. You have to account for the ruthless competition out there - each one would’ve been eliminated if they hadn’t proven to be worth something.

You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).

Or to take a moment to marvel.
