Why couldn't an AGI system design better robots, convince us we need to give it control of a robot army for our own protection, and then turn on us?
Can you imagine how persuasive an AGI would be?