Hacker News

falcor84 · yesterday at 7:32 PM

But that's how ML works - as long as the difference between the output and the target is differentiable, we can use gradient descent to optimize it away. Eventually, the difference becomes imperceptible.
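To make that concrete, here's a toy sketch (hypothetical one-parameter model, made-up numbers): because the squared difference is differentiable in the parameter, repeatedly stepping against its gradient shrinks it toward zero.

    # Toy gradient descent: shrink a differentiable difference toward zero.
    def model(w, x):
        return w * x                        # hypothetical one-parameter model

    def grad(w, x, y_target):
        # d/dw of the squared difference (w*x - y_target)^2
        return 2 * (model(w, x) - y_target) * x

    w, lr = 0.0, 0.01                       # made-up initial guess and step size
    x, y_target = 3.0, 6.0                  # we want model(w, 3) == 6, i.e. w -> 2

    for _ in range(1000):
        w -= lr * grad(w, x, y_target)      # step downhill on the loss

    print(w, (model(w, x) - y_target) ** 2) # ~2.0, and a difference near zero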

And of course that brings me back to my favorite xkcd - https://xkcd.com/810/


Replies

emp17344 · yesterday at 8:07 PM

Gradient descent is not a magic wand that makes computers behave like anything you want. The difference is still quite perceptible after several years and trillions of dollars in R&D, and there’s no reason to believe it’ll get much better.
