> When they want data about a school intersection in SF at a certain time of day, they just... synthetically generate it and simulate
I think it's more about detecting changes to the world. You need boots on the ground, so to speak, to see that new speed limit sign or the new lane paint. The Waymo vehicle can no doubt react to changes in the world when it encounters them, relaying them back to the mothership, but it's better to know about them in advance.
>You need boots on the ground, so to speak, to see that new speed limit sign or the new lane paint.
It'll shock you to know that you can simply get this data from governments; some even provide it in API form.
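To make that concrete, here's a minimal sketch of consuming such a feed. The field names and sample payload are invented for illustration (many city open-data portals do serve this kind of thing as GeoJSON, but schemas vary); the point is just that signage changes can be pulled and diffed without a survey vehicle.

```python
import json
from datetime import date

# Hypothetical sample of what a city open-data portal might return.
# GeoJSON is a common format for these feeds; the property names here
# ("sign", "updated") are made up for the example.
sample = json.loads("""
{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-122.41, 37.77]},
     "properties": {"sign": "SPEED LIMIT 25", "updated": "2024-05-01"}},
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [-122.43, 37.76]},
     "properties": {"sign": "SPEED LIMIT 35", "updated": "2021-11-12"}}
  ]
}
""")

def signs_updated_since(collection, cutoff):
    """Return the signs whose 'updated' date is on or after the cutoff."""
    return [
        f["properties"]["sign"]
        for f in collection["features"]
        if date.fromisoformat(f["properties"]["updated"]) >= cutoff
    ]

# Only the 25 mph sign changed recently.
print(signs_updated_since(sample, date(2024, 1, 1)))
```

In practice you'd fetch the collection over HTTP on a schedule and diff it against the previous pull, but the filtering logic is the same.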
That’s dumb then. It shows it’s just brute force rather than AI.
A human doesn’t need to be shown every single road that exists in order to drive.
Most AVs, definitely Waymo vehicles, are self-mapping. They can detect environment changes and relay them to the entire fleet. That's because they map using the same vehicles that make up the fleet.
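The relay idea can be sketched in a few lines. To be clear, this is my own toy model, not Waymo's actual pipeline: one vehicle reports an observed change into a shared map, and every other vehicle sees it immediately.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class MapChange:
    """A change one vehicle observed (e.g. a new speed limit sign)."""
    location: tuple  # (lat, lon)
    kind: str        # "speed_limit", "lane_paint", ...
    value: str

@dataclass
class FleetMap:
    """Shared map: any vehicle's report updates what the whole fleet sees."""
    changes: dict = field(default_factory=dict)

    def report(self, change: MapChange):
        # Last observation wins for a given location + kind.
        self.changes[(change.location, change.kind)] = change.value

    def lookup(self, location, kind):
        return self.changes.get((location, kind))

fleet = FleetMap()
# Vehicle A encounters the new sign and relays it...
fleet.report(MapChange((37.77, -122.41), "speed_limit", "25"))
# ...so vehicle B already knows about it before it gets there.
print(fleet.lookup((37.77, -122.41), "speed_limit"))
```

A real system would have to resolve conflicting observations and validate them before publishing, but that's the basic "every fleet vehicle is also a survey vehicle" loop.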