Hacker News

jfalcon · yesterday at 4:28 PM

>someone raised the question of “what would be the role of humans in an AI-first society”.

Norbert Wiener, considered the father of cybernetics, wrote a book back in the 1950s entitled "The Human Use of Human Beings" that raised these questions in the early days of digital electronics and control systems. In it, he brings up concerns such as:

- 'Robots enslaving humans to do jobs better suited to robots; a lack of humans in the feedback loop leads to fascist machines.'

- 'An economy without human interaction could lead to entropic decay, as machines lack the biological drive for anti-entropic organization.'

- 'Automation will lead to the immediate devaluation of routine human labor. Society needs to decouple a person's "worth" from their "utility as a tool".'

The human purpose is not to compete but to safeguard the teleology (purpose) of the system.


Replies

9wzYQbTYsAIc · yesterday at 4:49 PM

Seems like a good time to enshrine human rights and the social safety net by ratifying the ICESCR (https://en.wikipedia.org/wiki/International_Covenant_on_Econ...) and giving human rights the teeth they need.

I used Anthropic to analyze the situation; it did a halfway decent job:

https://unratified.org/why/

https://news.ycombinator.com/item?id=47263664

WarmWash · yesterday at 5:06 PM

> 'Automation will lead to immediate devaluation of human labor that is routine. Society needs to decouple a person's "worth" from their "utility as a tool".'

I have this vision that, absent the ability for people to form social hierarchies on the back of their economic value to society, there will be an AI-fueled class hierarchy based on people's general social ability. So rather than money determining your neighborhood, your ability not to be violent or crazy does.

argee · yesterday at 6:08 PM

> 'An economy without human interaction could lead to entropic decay as machines lack the biological drive for anti-entropic organization.'

Not quite the point the quote makes, but it reminded me of the short SF story "Exhalation".

https://www.lightspeedmagazine.com/fiction/exhalation/

jay_kyburz · yesterday at 8:01 PM

I think it's important to remember that humans are not far removed from the native animals we share the earth with. Civilization is just a thin layer of rules we use to try to keep the peace between us.

Just being born doesn't entitle somebody to food and shelter, you have to go out and find it. You have to work.

A magpie is not provided food and shelter, it has to hunt, fight for territory, and build its nest.

Humans don't have some inalienable "worth". But if you can work, you can choose to trade that work for food and shelter.

AI is not going to change that. We might think the AI owners have a moral obligation to feed people who can't find work, but there is no guarantee this will happen.

Also, for the short term at least, we need to stop talking about AI like it's a single thing, and talk instead about the companies that build and own the AI. Why would Google build an AI that can do everyone's job, then turn around and start building farms to feed us for free?

Do we perhaps imagine our governments are going to start building super-automated farms to feed us? How would they pay Google for the AI with no tax income?