I am a bit tired of such discussions.
I don't care whether LLMs are good at coding or bad at it (in my experience the answer is "it depends"). I don't care how good they are at anything else. What matters in the end is that this tech is not here to empower the common person (although it could). It is not here to make our lives better, more worthwhile, more satisfying (it could do all of these as well). It is here to reduce our agency, to make it easier to fire us, to put us in an even more precarious position, to suck even more wealth from those who have little to those who have a lot.
Yet what I see are pigs discussing the usefulness of a bacon-making machine just because it also happens to produce tasty soybean feed. They forget that it was not the soybean feed their owner bought the machine for, and that their owner expects a return on that investment.
This argument can be used, and has been used, about every innovation in automation since the dawn of the industrial revolution.
> It is here to reduce our agency, to make it easier to fire us, to put us in an even more precarious position
Could be. It could also end up freeing us from every commercial dependency we have. Write your own OS, your own mail app, design your own machinery to farm with.
It’s here, so I don’t know where you’re going with “I’m unhappy this is happening and someone should do something”
Demand full automation. Demand universal basic income. Notice how the latter is nearly absent from the conversation.
Another distraction is the claim that AGI is a danger to humanity; the only danger is people...
At some point, if most people lose their jobs, you have no market to sell your services to. So either new jobs have to be created to keep the capitalist machine running, or you have to provide for the needs of every human being from whatever you're doing with your AI. Otherwise, a lot of hungry people revolt and you get violence against these businesses.
I think new jobs will be created, because AI is always limited by hardware and by its current capabilities. Businesses, in order to compete, want to do things their competitors aren't currently doing. Those business needs always run ahead of current technological capabilities until the tech catches up, and then it's lather, rinse, repeat.
> It is here to reduce our agency
Complete bullshit.
The individual has never had as much ability to take on large projects as they do now. They’ve never been able to learn as easily as they can now.
> to make it easier to fire us
As of now, the technology increases the productivity of the average user. The companies that take advantage of that and expand their offerings will outperform the ones that simply replace workers and don't expand or improve their offerings.
More capable employees make companies more money in general. Productivity increases lead to richer societies and, yes, even more jobs, just as they always have.
> It is here to reduce our agency, to make it easier to fire us, to put us in an even more precarious position, to suck even more wealth from those who have little to those who have a lot.
You could say this is the story of society itself: it makes us dependent on each other, reduces our agency, and puts us in precarious positions (like WW2). But nobody would argue against society on those grounds.
What happens here is that we become empowered by AI and gain advantages which we immediately use and become dependent on, eventually becoming unable to function without them - like computers, and even thermostats.
Does anyone consider how the economy would operate without thermostats? No fridges, no data centers, no engines... they all need thermostats. We have lost some freedom by depending on them, but we have also gained.
> What matters in the end is that this tech is not here to empower the common person (although it could).
How do you figure? 20 dollars/month is insanely cheap for what OpenAI/Anthropic/Google offer. That absolutely qualifies as "empowering the common person". It lowers barriers!
A lot of the anti-AI sentiment on HN concerns people losing their jobs. I don't think this will happen: programmers who know what they're doing are going to be far more effective at using AIs to generate code than everyone else.
But even if it is true and we do see job losses in tech: are software devs really "in a precarious position"? Do they really qualify as "those that have little"? Seems like a fantasy to me. Computer programmers have done great over the past 30 years.
More broadly, anti-AI sentiment comes from people who dislike change. It's hard to argue someone out of that position. You're allowed to prefer stasis. But the world moves on and I think it's best to remain optimistic, keep an open mind, and make the most of it.