Hi, I read the source code for training and came across the cross-entropy loss implementation. Here's the code snippet:
elif self.costFunction == "cross_entropy":
    epsilon = 1e-12  # prevent a zero argument in the logarithm or division
    # element-wise binary cross-entropy between targets y and network outputs z
    error = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))
I find it fascinating that you're using binary cross-entropy loss for a multi-class classification problem. Is there a particular reason or insight behind using it instead of the usual categorical cross-entropy loss? Thank you!
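For reference, here's a minimal sketch of what categorical cross-entropy would look like in plain NumPy (not the repo's code; I'm assuming z holds softmax probabilities and y is one-hot encoded, following the names in the snippet above):

import numpy as np

def categorical_cross_entropy(y, z, epsilon=1e-12):
    # y: one-hot targets, z: softmax outputs, both shaped (batch, classes)
    z = np.clip(z, epsilon, 1.0)  # guard against log(0)
    # only the log-probability of the true class contributes per example
    return -np.mean(np.sum(y * np.log(z), axis=1))

The difference from the binary form above is that each example contributes a single log-probability term for its true class, rather than a separate Bernoulli term for every output unit.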