Hacker News

627467 · yesterday at 4:17 AM

> I was made redundant recently "due to AI" (questionable) and it feels like my works in some way contributed to my redundancy where my works contributed to the profits made by these AI megacorps while I am left a victim.

I think anyone here can understand and even share that feeling. And I agree with your "questionable" - it's just the lame HR excuse du jour.

My 2c:

- AI megacorps aren't the only ones gaining, we all are. The leverage you have to build and ship today is higher than it was five years ago.

- It feels like megacorps own the keys right now, but that's temporary. In a world of autonomous agents and open-weight models, control is decentralized. Inference costs continue to drop; you don't need to be running on megacorp stacks. Millions (billions?) of agents finding and sharing among themselves. How will megacorps stop that?

- I see the advent of LLMs like the spread of literacy. Scribes once held a monopoly on the written word, which felt like a "loss" to them when reading/writing became universal. But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."


Replies

latexr · yesterday at 8:17 AM

> AI megacorps aren't the only ones gaining, we all are.

No, no we are not.

> the leverage you have to build and ship today is higher than it was five years ago.

I don’t want more “leverage to build and ship”, I want to live in a world where people aren’t so disconnected from reality and so lonely that they have romantic relationships with a chat window; where they don’t turn off their brains and accept any wrong information because it comes from a machine; where propaganda, mass manipulation, and surveillance aren’t at the ready hands of any two-bit despot; where people aren’t so myopic, staring only at their own belly button and their own use case for a tool, that they are incapable of recognising all the societal harms around it.

> We aren't losing code; we are making the ability to code a universal human "literacy."

No, no we are not. What we are doing, however, is making increasingly bad comparisons.

Literacy implies understanding. To be able to read and write, you need to understand how to do both. LLMs just spit out text which you don’t need to understand at all, and increasingly people don’t even care to try to understand it. LLM-generated code in the hands of someone who doesn’t read it is the opposite of literacy.

j_bum · yesterday at 4:43 AM

I’m not sure if the analogy is yours, but the scribe note really struck a chord with me.

I’m not a professionally trained SWE (I’m a scientist who does engineering work). LLMs have really accelerated my ability to build, ideate, and understand systems in a way that I could only loosely gain from sometimes grumpy but mostly kind senior engineers in overcrowded chat rooms.

The legality of all of this is dubious, though, per the parent. I GPL-licensed my FOSS scientific software because I wanted it to help advance biomedical research, not because I wanted it to help a big corp get rich.

But then again, maybe code like mine is what is holding these models back lol.

ori_b · yesterday at 7:51 AM

> We aren't losing code; we are making the ability to code a universal human "literacy."

The same way that DoorDash makes kitchen skills universal.

matheusmoreira · yesterday at 8:44 AM

> It feels like megacorps own the keys right now, but that’s a temporary.

Remains to be seen. Hardware prices are increasing. Manufacturers are abandoning the consumer sector to serve all-consuming AI demand. Not to mention the constant attempts to lock down computers so that we don't own them.

What does the future hold for us? Unknown. It's not looking too good though. What good is hardware if we're priced out? What good are open models and free software if we're unable to run them?

psychoslave · yesterday at 9:26 AM

> But today, language belongs to everyone. We aren't losing code; we are making the ability to code a universal human "literacy."

Literacy requires training, though. It’s not the same thing to be able to produce a voice rendition of a text, to understand what the text is about, to have a critical-analysis toolbox for texts, and to have the habit of situating what you read within a broader inferred context.

Just throwing LLMs into people’s hands won’t automatically make them able to use them in a relevant manner, as far as global social benefit is concerned.

The literacy issue is actually quite independent of whether the LLMs used are distributed or centralised.

wiseowise · yesterday at 7:23 AM

> We aren't losing code; we are making the ability to code a universal human "literacy."

Saying LLMs make the ability to code a universal human “literacy” is like saying Markov chains make the ability to write a universal human “literacy”.

heyethan · yesterday at 9:15 AM

The literacy analogy makes sense in terms of access.

But the tools back then were cheap and local. Now most of the leverage sits behind large models and infra.

So more people can “write”, but not necessarily on their own terms.

Underqualified · yesterday at 8:52 AM

This response sounds an awful lot like what ChatGPT would say ...

wolvesechoes · yesterday at 8:19 AM

> the leverage you have to build and ship today is higher than it was five years ago

Wake me up when you do.

oblio · yesterday at 11:58 AM

Once creation is commoditized, controlling eyeballs is king. Look up aggregators: Apple, Facebook, Microsoft, Amazon, etc.

If anything, in Extremistan we're all useless. Platforms and whales are all that matters.
