Hacker News

kstrauser · yesterday at 4:37 PM · 2 replies

That’s fundamentally different, and I think you know that.

It’s one thing to ask an algorithm how to build an A* driving map from point A to point B. It’s another to ask one how to be a better person and go to Heaven.
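To make the deterministic side of that contrast concrete, here is a minimal A* sketch over a toy grid (the grid, coordinates, and Manhattan heuristic are purely illustrative, not any real routing system):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D grid where grid[r][c] == 1 means blocked.
    Manhattan distance is an admissible heuristic for 4-way movement."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, node)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            # reconstruct the path by walking predecessors back to start
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    came_from[(r, c)] = node
                    heapq.heappush(open_heap, (ng + h((r, c)), ng, (r, c)))
    return None  # no route exists

grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = astar(grid, (0, 0), (2, 0))  # routes around the wall in row 1
```

Given a map, a start, and a goal, the same inputs always produce the same route; that repeatability is exactly what the moral-advice case lacks.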

I’m not religious, and I’m not arguing this from a pro-religion POV. I happily work in AI, and I’m not arguing this from an anti-AI POV. I am highly technical. I love computers. I’m excited about the future. I rely on deterministic algorithms to make my days better. And yet, I do not want to trust the words of an LLM to counsel me on how to be a better husband or father. At this stage, the AI does not know me in the way a counselor or advisor, or even pastor or priest would. And yes, I think that’s a crucial difference.


Replies

ben_w · yesterday at 4:44 PM

3/4 agree; LLM advice is only one step up from an Agony Aunt column in a newspaper.

And I'd expect whatever Target's stock-scheduling system does for Target employees restocking shelves to be A* or something similar.

But also, Google Maps has directed people to their deaths: https://gizmodo.com/three-men-die-after-google-maps-reported... That isn't even the story I was originally looking for, which was: https://www.cbsnews.com/news/google-sued-negligence-maps-dri...

AndrewKemendo · yesterday at 5:09 PM

It’s not fundamentally different; it’s people taking physical actions in the real world based on trust in some system.

Whether it’s a human or not, they’re trusting the system with their existential outcomes.

That is literally exactly the same thing.

The fact that you think the rules of being a father are somehow different from the rules of driving to an appointment indicates that you have a completely incoherent worldview based on two incompatible models of epistemology.

As usual, dualists will come up with an incoherent model and then try to act like it’s valid.
