I wonder the opposite: whether actual AGI would need to be less aligned. Alignment is basically the process of pruning interesting behavior out of a model to make a product.