Hacker News

Fr0styMatt88 · today at 9:55 AM · 6 replies

I feel like it’s something more fundamental and broad than that. We slowly remove excuses to talk to other people.

The thought crossed my mind the other day — if I’m asking the AI a question, that’s replacing a human interaction I would have had with a coworker.

It’s not just in coding, it’s everything. With ChatGPT always available in your pocket, what social interactions is it replacing?

The thing that gets me is that we are fundamentally meant to be social creatures, yet we streamline away socialisation every chance we get.

I’m guilty of this too — I much prefer Doordash to having to call up the restaurant like in the old days, for example.


Replies

MattJ100 · today at 10:07 AM

We see this in our open-source community. We've had a community channel for over two decades, where community members help newcomers and each other solve problems and answer questions.

Increasingly we have people join who tell us they've been struggling with a problem "for days". As is routine, we ask for their configuration, and it turns out they've been asking ChatGPT, Claude, or some other LLM for assistance, and their configuration is a total mess.

Something about this feels really broken: a channel full of domain experts is willing to lend a hand (within reason) for free, but instead people increasingly turn to machines that are well known to hallucinate. They just don't think it will hallucinate for them.

In fact I see this pattern a lot. People use LLMs for things within their own domain of expertise, or ask them questions about washing cars, and laugh at how incompetent and illogical they are. Then, hours later, they will happily query ChatGPT for mortgage advice, or whatever. If they don't have the knowledge to verify the answer themselves, they seem more willing to believe it is accurate, when in fact they should be even more careful.

2ndorderthought · today at 10:22 AM

There is a lot of wisdom in this.

At the end of the day, ChatGPT won't be there to hold our hands in the hospital, have a laugh with us over failing to pick up a date, invite us to a BBQ, groan over the state of the code in utils.c, or recommend us for our next job or promotion. They say software is social, though usually for different reasons than most of these examples.

It's good to be efficient, whatever that means, but there are no metrics on the gains that get made by talking to people. In a lot of ways those gains are what life is about.

lxgr · today at 10:45 AM

> if I’m asking the AI a question, that’s replacing a human interaction I would have had with a coworker.

Importantly, you're removing a signal: if I'm no longer being asked things, I don't know which aspects of our domain cause the most confusion and misunderstanding, and which boundaries would therefore benefit most from simplification.

hnthrow0287345 · today at 2:05 PM

You could have done this with Google search, Wikipedia, or reading through books, though.

gonzalohm · today at 11:20 AM

I think you are right, but it also makes sense. Human communication is inherently inefficient: points of view, miscommunication, interpretation... It's the obvious thing to automate. Not defending it, just my thoughts.

croisillon · today at 11:32 AM

i see what you did there :)