Seems naive. You can get an LLM to agree with almost anything if you say the right things to it, and it will hallucinate citations to back you up without skipping a beat. You could probably get it to hallucinate case law legalizing murder on Mondays.
You’re talking about manipulated, maliciously or intentionally steered hallucination, but the parent is referring to emergent hallucination from training (even if sycophantic). These are two different things, and both can occur, but the latter is what the professor is tongue-in-cheek referring to.