
zelphirkalt · today at 10:56 AM

Since lidar has distance information and cameras do not, it was always a ridiculous idea by a certain company to use cameras only. Lidar-equipped cars are going to replace at least the ones that don't make use of this obvious answer to obstacle-detection challenges.


Replies

runjake · today at 3:42 PM

Karpathy provided additional context on the removal of LiDAR during his Lex Fridman Podcast appearance. This article condenses what he said:

https://archive.is/PPiVG

And here's one of Elon's mentions (he also has talked about it quite a bit in various spots).

https://xcancel.com/elonmusk/status/1959831831668228450?s=20

Edit: My personal view is that LiDAR and other sensors are extremely useful, but I worked on aircraft, not cars.

galangalalgol · today at 11:21 AM

The reasoning is cynical but sound. If the system uses only the sensing modes people have, it will make the mistakes people do. If a jury thinks "well, I could have done that too!", you win. It doesn't matter if your system has fewer accidents if some of the failure modes differ from human ones, because the jury will think "how could it not figure that out?"

Someone · today at 12:13 PM

> Since lidar has distance information and cameras do not, it was always a ridiculous idea by a certain company to use cameras only

Human eyes do not have distance information either, but they derive it well enough to drive a car, from spatial parallax (by ‘comparing’ inputs from two eyes) or temporal parallax (by ‘comparing’ inputs from one eye at different points in time).

One can also argue that detecting absolute distance isn’t necessary to drive a car. Time-to-contact may be more useful, and even detecting only “change in bearing” can be sufficient to avoid collision (https://eoceanic.com/sailing/tips/27/179/how_to_tell_if_you_...)

Having said that, LiDAR works better than vision in mild fog, and if it’s possible to add a decent absolute distance sensor for little extra cost, why wouldn’t you?
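The time-to-contact idea above can be sketched in a few lines. This is a toy illustration (function name and numbers are mine, not from the thread): an approaching object's angular size grows, and the ratio of angular size to its growth rate approximates time-to-contact, with no absolute distance needed.

```python
def time_to_contact(angular_size_prev: float, angular_size_curr: float, dt: float) -> float:
    """Estimate time-to-contact from how fast an object 'looms':
    tau ~= theta / (dtheta/dt). Assumes roughly constant closing speed
    along the line of sight; no absolute distance is required."""
    d_theta = (angular_size_curr - angular_size_prev) / dt
    if d_theta <= 0:
        return float("inf")  # not growing, so not approaching
    return angular_size_curr / d_theta

# An object whose image grows from 0.010 to 0.011 rad over 0.1 s
# is roughly 1.1 s from contact.
tau = time_to_contact(0.010, 0.011, 0.1)
```

This is exactly the monocular (temporal-parallax) cue the comment describes: a single camera can extract it frame-to-frame.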

nlitened · today at 12:26 PM

As I understand it, lidar doesn't work well in rain/snow/fog. So in the real world, where you have limited resources (research and production investment, engineering talent, AI training time and dataset breadth, power consumption) that you could redistribute between two systems (vision and lidar), and where one system would contradict the other in precisely the dangerous driving conditions, it's smarter to just max out vision and ignore lidar altogether.

radial_symmetry · today at 1:44 PM

I'm not an expert on ML vision, but I do have a Tesla, and it seems able to tell how far away things are just fine. I'm not sure what's wrong with the vision system that lidar needs to fix.

spyder · today at 12:04 PM

Yeah, even if they could match human-level stereo depth perception with AI, why would they say "no" to superhuman lidar capabilities? Cost could be a somewhat acceptable answer if there were no problems with the camera-only approach, but there are still examples of silly failures. And if I remember correctly, they also removed their other superhuman sensor, the radar, from newer models: the one that, in certain conditions, could sense multiple cars ahead by bouncing its signal underneath other cars.

uyzstvqs · today at 6:33 PM

It's not that simple. Cameras don't report 3D depth, but these AI models can and do pick up on pictorial depth cues. LiDAR is incredibly valuable for collecting training and validation data, but may also make only an insignificant difference in production inference.

wasmainiac · today at 11:18 AM

Just say Tesla, why censor yourself.

DonsDiscountGas · today at 9:31 PM

Humans don't have explicit distance sensors either. When LIDAR sensors were $20k+ I think it made a lot of sense to avoid them.

mgoetzke · today at 12:30 PM

Considering cameras can produce reliable enough distance measurements AND also handle all the color perception legally required for driving on roads, it was always a ridiculous idea by a certain set of people that lidar is necessary.

nova22033 · today at 3:22 PM

It's not complicated: LIDAR hardware was in short supply during COVID, and Elon obviously couldn't slow down production and sink the inflated stock price.

pbreit · today at 7:35 PM

All of driving is designed around vision.

MetaWhirledPeas · today at 3:01 PM

I find it comical that people keep going back to this rage well against "a certain company" for its vision-only approach, when the truth is it has the best automatic driving system an individual can buy, rivaling Waymo and beating the Chinese brands.

Why are the commenters not pissed at the dozens of other car companies who have done absolutely nothing in this space? Answer: because it's not nearly as fun to be pissed at Kia or Mercedes or whoever. Clearly they are just enjoying the shared anger, regardless of whether it is justified.

dzhiurgis · today at 9:23 PM

A certain company has 300k subscribers who rely on that "ridiculous" service.

My father lost vision in one eye and 50% in the other something like 20 years ago. He struggles with parking but is otherwise doing OK without lidar. It turns out motion-based vision is more accurate than stereoscopic vision beyond 10-20 meters.

foooorsyth · today at 2:47 PM

I wouldn’t take too much issue with the “cameras are enough” claim if cameras actually performed like eyes. Human eyes have high dynamic range and continuous autofocus performance that no camera can match. They also have lids with eyelashes that can dynamically block light and assist with aperture adjustment.

The appeal to human biology, and the argument against fusing disparate sensors, kinda falls flat when you're building a world model by fusing feeds from cameras all around the car. Humans don't have 8 eyes in a 360-degree array around their head. What they do have is two eyes (super cameras) on a ~180-degree swiveling, ~180-degree tilting gimbal. With mics attached that help sense other vehicles via road noise. And equilibrioception, vibration detection, and more, all in the same system, all fused. If someone were actually building that system to drive the car, the "how did you drive here today?" argument would get a lot stronger. One time I had some water blocking my ear and I drove myself to the hospital to get it fixed. That was a shockingly scary drive: your hearing does a lot of sensing while driving that you don't value until it's gone.

leptons · today at 6:10 PM

One camera can't really produce depth/distance information, but two cameras sure can. The eyes in your head don't capture distance information individually, but with two eyes you can infer distance.
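The two-camera geometry behind this is simple triangulation. A minimal sketch (numbers and the function name are illustrative; a real stereo rig also needs calibration and rectification, which this ignores):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic two-view triangulation: depth = f * B / d, where
    disparity_px is how far a feature shifts between the left and
    right images. One camera alone gives no disparity, hence no depth."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# With an 800 px focal length and a 0.12 m baseline, a 12 px disparity
# puts the feature at 8 m; halving the disparity doubles the distance.
depth = stereo_depth(800.0, 0.12, 12.0)
```

The inverse relationship is also why stereo depth degrades at range: far objects produce sub-pixel disparities, which is where the motion and pictorial cues mentioned elsewhere in the thread take over.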

SecretDreams · today at 1:07 PM

I'll preface by saying lidar should be used with autonomous vehicles.

Individual cameras don't have distance information, but you can easily calibrate a system of cameras to give you distance information. Your eyes do this already, albeit not quantitatively; the quantitative part requires math our brains aren't set up to do in real time.

FrustratedMonky · today at 1:07 PM

It was cost, wasn't it?

If this lowers lidar costs, and Tesla has spent all this time refining the camera technology, they can now have both.

Use both.

DoesntMatter22 · today at 6:17 PM

It was a great decision to drop LiDAR. The cars are running excellently without it.

NedF · today at 10:51 PM

[dead]

bko · today at 2:11 PM

Why make things more complicated than they need to be? Humans don't have lidar, and we are the only intelligence that can reliably drive. Lidar just seems like feature engineering, which has proven to be a dead end in most other AI applications (the bitter lesson).

https://www.cs.utexas.edu/~eunsol/courses/data/bitter_lesson...
