Robotaxi supervision is just an emergency brake switch.
Consumer supervision is having all the controls of the car right there in front of you. And if you're doing it right, you have your hands on the wheel and foot on the pedals, ready to jump in.
> Robotaxi supervision is just an emergency brake switch
That was the case when they first started the trial in Austin. The employee in the car was a safety monitor sitting in the front passenger seat with an emergency brake button.
Later, when they started expanding the service area to include highways, they moved them to the driver's seat on those trips so they could completely take over if something unsafe was happening.
> And if you are doing it right, you have your hands on the wheel and foot on the pedals ready to jump in.
Seems like there's zero benefit to this, then. Being required to pay attention while having nothing (i.e., driving) to keep me engaged seems like the worst of both worlds. Your attention would constantly drift.
Similarly, Tesla using teleoperators for their Optimus robots is a safety fake for robots that are not autonomous either. They are constantly trying to cover their inability to make anything autonomous. Cheap lidars or radar would likely have prevented those "hitting stationary objects" accidents. Just because the Führer says it doesn't make it so.
They had supervisors in the passenger seat for a while but moved them back to the driver's seat, then moved some out to chase cars. In the ones where they were in the driver's seat, they were able to take over the wheel, weren't they?
So the trillion-dollar company deployed one-ton robots in unconstrained public spaces with inadequate safety data, and chose dangerous testing protocols that objectively heightened risk to the public in order to meet marketing goals? That is worse, and would generally be considered utterly depraved self-enrichment.
Nah, the relevant factor, which has been obvious for years to anyone who cared to think about this stuff honestly, is that Tesla's safety claims for FSD are meaningless.
Accident rates under traditional cruise control are also well below average.
Why?
Because people use cruise control (and FSD) under specific conditions. Namely: good ones! Conditions where accidents already happen at a well below-average rate!
Tesla has always been able to publish the data required to really understand FSD's performance, normalized by vehicle age and driving conditions. But they have not, for reasons that have always been obvious and are now absolutely undeniable.