Isn't the fact that a person is asking an AI whether to leave their partner on its own an indication that they should?
EDIT: typo
How is it an indication? I think people on here don't realize that most people don't think things through as much as (software) engineers do.
The idea that asking implies a yes is actually a pretty common logical fallacy. In relationship science, we call this "relational ambivalence," and it's a completely normal part of any long-term commitment.
No, but it is an indication of brain rot to ask a question seriously and also to think that asking it means the conclusion is foregone. It is a symptom of our childlike current generation. Of course, the moment anything becomes difficult or unpleasant, one should quit, apparently. Surely, this kind of resilience is what got humanity so far.
>asking an AI whether to leave your partner
Is that what they're asking, though? Because "relationship advice" is pretty vague.