Knowing the building heights around Chicago is not an OS feature. Even if Siri were perfect, Apple still isn't going to ship a Wikipedia object graph on every phone.
Likewise, the phone does not understand removing people from a photo. That is a feature specific to the photos app, and Siri has let you wire voice commands into your app's features just fine for years. If Google decided, for competitive reasons, not to ship this feature to non-Pixel or non-Android users, that's not a Siri fault. That Apple did not expose this as a voice command in its Photos app is also not a Siri fault (and is removing all people from a photo really a common request, versus removing specific people?)
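For concreteness, this is roughly what wiring an app feature into Siri looks like with Apple's App Intents framework. This is a sketch, not a real Photos feature: the intent name and the editing call are hypothetical placeholders, and the actual work would live inside the app.

```swift
import AppIntents

// Hypothetical intent exposing a photo-app feature to Siri / Shortcuts.
struct RemovePeopleIntent: AppIntent {
    static var title: LocalizedStringResource = "Remove People from Photo"

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult {
        // The app's own editing pipeline would go here, e.g.:
        // try await PhotoEditor.removePeople(from: photo)  // hypothetical
        return .result()
    }
}
```

Once an intent like this is declared, Siri and Shortcuts can invoke it by its title; this mechanism (and SiriKit before it) is what "wire in commands for the features in your app" refers to.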