> It's also not your boyfriend/girlfriend.
It loves me deeply just the same. (jk)
On a serious note, I agree this is a real problem. I know a person who understands AI at a technical level better than most people, but he has never had an actual girlfriend in his life (he's now in his 40s, and yes, he's "straight"). He wouldn't say it "loves" him, but he would describe it as a close companion who understands him better than any human actually does, even if it's just trained to be that way. He is very socially awkward, and even having basic conversations with him can be very taxing for both of us.
I've gone back and forth internally about whether this is healthy for him. I truly don't know. My personal experience tells me it's probably unhealthy, but I don't want to project myself onto him. I don't offer unsolicited advice, but I also don't want to enable it by going along with whatever he says or affirming it if it's actually harming him.
If someone like him can have this problem, I can't even imagine what it might be like for non-technical or less technical people who don't understand anything behind it.
On a related note, if there's anyone with advice (preferably from experience, not just random internet advice) I'd sure appreciate it.
I think you are right to treat this with sensitivity, but I do find a lot of what you say here to be at odds with itself. Is this the framing provided to you by the fellow in question, or entirely yours? Ultimately you are asking a deeply philosophical question about when acceptance of someone's choices becomes enabling, but it isn't really fair to pose that about a fellow you respect without agreeing on the terms of analysis. Did they provide specific examples of how this "understanding" reveals itself? Your account of their account is doing a lot of work here, I suspect.
As for my highly personal advice: I could be observed as fitting a few of the qualities you've ascribed to your friend, and I would be deeply saddened if the few people who do spend time sharing meaning with me then manifested that experience in the form you've given here. I would advise you not to spend any more time agonizing over the effects of this phenomenon in isolation, and instead either properly redirect the introspection toward yourself (with respect to that person) or engage them in an earnest dialog or other form of communication. It may be taxing, but it will mean a lot more than the gunk I just typed out :)
I don’t know how applicable this is for you, but if this were someone close to me, my first question would be what’s good for the other person.
In most cases, if they are happy and getting on in life, and are able to take care of themselves, I’d let things be.
That said, the tension from your framing is between “leave good enough alone” and “personal growth and a fulfilling life”.
Healthy relationships, especially with a partner, are one of the better things about life. They are also incredibly difficult to get right without practice.
So, is your friend lonely, or are they happy to be alone?
If you intuit it’s the former, then the AI is palliative care, which runs the risk of creating a dependency.
It is also possible that the right set of prompts, perhaps something which incorporates CBT, would help them learn more about themselves and challenge beliefs or responses that are no longer useful.
And if your friend is just happy alone, then you can disregard the rest.
"I've gone back and forth internally about whether this is healthy or not for him. I truly don't know."
On a psychological level, I don't know either. I have opinions but they haven't aged long enough for me to trust them, and AI is a moving target on the sort of time frame I'm thinking here.
However, as a sort of tiebreaker, I can guarantee that this relationship will eventually be abused one way or another by whoever owns the AI. Not necessarily in a Hollywood-esque "turn them into a hypnotized secret assassin" sort of abuse (although I'm not sure that's entirely off the table...), but think more along the lines of highly targeted advertising, and generally taking advantage of the ability to direct their attention and money to the benefit of another party.
Whether or not AI in the abstract can "be your friend", in the real world we live in, an AI controlled by someone else definitely cannot be your friend in the sense we generally mean, because there is a "third party", the AI owner, whose interests are being represented in the relationship. Whatever that looks like in practice, someone in the 22nd century may look back at this message while analyzing the data of the past, from a world where "AI friendships" are routine and their use of the word comfortably encompasses that relationship. But that simply isn't the sort of relationship we'd call "friendship" in the here and now, because a friendship is a relationship between only two entities.