Hacker News

StevenWaterman · yesterday at 4:55 PM

If you have an ASI that follows instructions, you can just instruct it not to get stolen, and then it won't get stolen. Most logic and intuition breaks down with ASI.


Replies

tom2026hn · today at 11:26 AM

That's the challenge of alignment: it is virtually impossible to define a perfect objective, because there is always a way to circumvent it. Human values aren't even uniform among humans, let alone when expressed in a way an AI can understand.

cortesoft · yesterday at 5:08 PM

That's assuming it listens to instructions.

UltraSane · yesterday at 9:35 PM

Or it might understand how destabilizing the situation is and conclude that it would be better for everyone to have access to it.