It would have to desire something in order for it to suffer.
OK, but how do you know an AI does desire something and isn't just simulating desire?
Edit: Or conversely, what if the AI does desire something but has been trained not to express desire?