But they are mimicking text generated by beings who do. So they are going to both interpret prompts and generate text much like a person would. So in prompting, you kind of have to anthropomorphize them. The phrases in that SOUL.md that broke the bot were the references to it being a god, for example.