Hacker News

devolving-dev · today at 9:24 PM

I've always wondered whether Tesla's issues with FSD are a sensor problem or an intelligence problem. I think Tesla's claim is that when they look at accident footage, it is clear to a human how the car could have avoided the accident, and thus, if FSD were more intelligent, the accident could have been avoided. Is this reasoning wrong?

I personally find it convincing that the problem with self-driving is mostly that the models aren't intelligent enough, and that adding LiDAR wouldn't be enough to achieve the reliability required. But I don't know, I don't really work in that field so maybe engineers who have more experience with self driving might say otherwise.


Replies

dbcurtis · today at 9:47 PM

It is easy to underestimate how much one relies on senses other than vision. You hear many kinds of noises that indicate road surface, traffic, etc. You feel road surface imperfections telegraphed through the steering wheel. You feel accelerations in your butt, and infer loss of traction from the response of the accelerator and the motion of the vehicle. Secondly, the human eye has much more dynamic range than any camera, and it is mounted on an exquisite PTZ platform. Then turning to the model -- you are classifying obstacles and agents at a furious rate, and making predictions about the behavior of those agents. So I agree in part that the models need work, but the models need to be fed, and IMHO computer vision alone is not a sufficient sensor feed.

Consider an exhaust condensation cloud coming from a vehicle's tail pipe -- it can be opaque to a camera/computer-vision system. Can you model your way out of that? Or is it more useful to do sensor fusion of vision data with radar data (to which the cloud is transparent) and other modalities like lidar? A multi-modal sensor feed is going to simplify the model, which in the end translates into lower compute load.
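To make the fusion point concrete, here is a deliberately toy late-fusion sketch (all names, thresholds, and readings are hypothetical, not any real stack's API): when the camera's own confidence collapses -- say, because an exhaust cloud fills the frame -- the rule defers to radar, which sees through the cloud.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    obstacle_detected: bool
    confidence: float  # 0.0-1.0, the sensor's self-reported certainty

def fuse(camera: Reading, radar: Reading) -> bool:
    """Toy late-fusion rule: trust the camera when it is confident,
    otherwise fall back to radar.

    A condensation cloud looks like a solid object (or pure noise) to
    the camera, so its confidence drops; radar is unaffected by the
    cloud and can overrule it.
    """
    if camera.confidence >= 0.8:
        return camera.obstacle_detected
    # Camera is unreliable (fog, glare, exhaust cloud): defer to radar.
    return radar.obstacle_detected

# The exhaust-cloud case: camera "sees" something but is unsure,
# radar confidently reports clear road ahead.
cloud_camera = Reading(obstacle_detected=True, confidence=0.3)
cloud_radar = Reading(obstacle_detected=False, confidence=0.95)
print(fuse(cloud_camera, cloud_radar))  # False: no real obstacle
```

A real system would fuse probabilistically (e.g. at the track or occupancy-grid level) rather than with a hard threshold, but the shape of the argument is the same: a second modality resolves cases the first modality cannot.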

arijun · today at 9:47 PM

> I've always wondered if Tesla's issues with FSD were a sensor problem or an intelligence problem

Even if it’s an intelligence problem, it’s possible that machine intelligence won’t reach the point where it can resolve it anytime soon, whereas more sensors might circumvent the issue completely. It’s like Musk’s big claim (that humans drive using vision only): the question is not whether a good enough brain could drive vision-only, but whether Tesla can build that brain.

wombat-man · today at 9:44 PM

Maybe? But LiDAR also just gives a more complete picture of what is around the car. I think this is supported by how many miles Waymo cars run unsupervised vs. Tesla.

I am skeptical that Tesla has this solved, but I'm interested in seeing how it goes as they move to expand their robotaxi service.

janalsncm · today at 9:45 PM

Some problems are simply underdetermined: if the desired output varies wildly for identical inputs, you need more information. No algorithm will help you.
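A toy illustration of that point (hypothetical pixel values, no real sensor model): two physically different scenes can produce the exact same camera input, and any classifier is a function of that input alone, so it must answer identically for both -- i.e., wrongly for at least one.

```python
# Two different physical scenes that render as the *same* camera image:
# a harmless vapor cloud and a solid white obstacle both fill the frame
# with a uniform white blob. (Toy 4x4 grayscale "images".)
vapor_cloud_image = [[255] * 4 for _ in range(4)]
solid_wall_image = [[255] * 4 for _ in range(4)]

assert vapor_cloud_image == solid_wall_image  # identical inputs

# Any vision-only classifier f satisfies
#   f(vapor_cloud_image) == f(solid_wall_image)
# by definition of a function, so it gives the wrong answer for at
# least one of the two scenes. The only way out is more information,
# e.g. a radar return that distinguishes the cases.
```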

Sensors or intelligence, at the end of the day it’s an engineering problem which doesn’t require pure solutions. Sometimes sensors break and cameras get covered in mud.

The problem is maintaining an acceptable level of quality at the lowest possible price, and at some point you spend more money on clever algorithms and researchers than you would on a lidar.