
armcat, yesterday at 7:53 PM

That's a soft distinction (distilling vs. learning). If I read a chapter in a textbook, I am distilling the knowledge from that chapter into my own latent space; one would hope I learn something. Flipping it the other way, you could say that the model from Lab Y is ALSO learning from the model from Lab X, not just "distilling" it. Hence my original comment: how deep does this go?
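For readers unfamiliar with the term: "distillation" here refers to training a student model to match a teacher model's softened output distribution rather than hard labels. A minimal sketch of the usual distillation loss (all names and logit values are illustrative, not from any lab's actual setup):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) over temperature-softened distributions.
    # The student is trained to minimize this, i.e. to imitate the teacher.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [3.0, 1.0, 0.2]       # hypothetical teacher logits for 3 classes
student_near = [2.8, 1.1, 0.3]  # student that already imitates the teacher well
student_far = [0.2, 1.0, 3.0]   # student with a very different distribution

# A closer match to the teacher yields a smaller loss.
print(distillation_loss(teacher, student_near) < distillation_loss(teacher, student_far))
```

Whether optimizing this objective against another model's outputs counts as "learning" in the same sense as training on raw data is exactly the question being argued above.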


Replies

EnPissant, yesterday at 8:08 PM

And yet nearly every machine learning engineer would disagree with you, which is a giveaway that your argument is rooted in ideology.
