Hacker News

jmalicki · yesterday at 4:20 PM

The cross entropy loss function is softmax. They are one and the same.


Replies

canjobear · yesterday at 4:29 PM

They’re not. Cross entropy loss is E[-log q], where q is the probability the model assigns to the correct outcome. You could convert the model outputs x into probabilities using some other function, like q_i = x_i^2 / Z, and compute cross entropy loss just fine.
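A minimal sketch of this point: cross entropy -log q[target] is defined for any map from raw scores to a probability distribution, not only softmax. The function names below (`squared_normalize` etc.) are my own, illustrating the hypothetical q_i = x_i^2 / Z normalization mentioned above.

```python
import math

def softmax(x):
    """The usual choice: q_i = exp(x_i) / sum_j exp(x_j)."""
    m = max(x)  # subtract max for numerical stability
    exps = [math.exp(v - m) for v in x]
    z = sum(exps)
    return [e / z for e in exps]

def squared_normalize(x):
    """An alternative normalization: q_i = x_i^2 / Z, Z = sum_j x_j^2."""
    z = sum(v * v for v in x)
    return [v * v / z for v in x]

def cross_entropy(q, target):
    """Cross entropy loss for a one-hot target: -log q[target]."""
    return -math.log(q[target])

x = [2.0, 1.0, 0.5]
# Either normalization yields a valid distribution, so the loss is
# well-defined for both; softmax is a convention, not part of the loss.
print(cross_entropy(softmax(x), 0))
print(cross_entropy(squared_normalize(x), 0))
```

Both print a finite positive loss: the cross entropy formula never references softmax, only the probabilities q.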
