
About the cross-entropy loss. #13

Open

aa1234241 opened this issue Jun 18, 2023 · 0 comments

aa1234241 commented Jun 18, 2023

Hi, I read the source code for training and came across the cross-entropy loss implementation. Here's the code snippet:

    elif self.costFunction == "cross_entropy":
        epsilon = 1e-12  # prevent zero argument in logarithm or division
        error = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))

I find it fascinating that you're using binary cross-entropy loss for a multi-class classification problem. I'm curious if there's any particular reason or insight behind using it instead of the usual categorical cross-entropy loss. Thank you!
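For reference, here is a minimal sketch of the difference between the two losses in plain NumPy (`np` stands in for the `ncp` alias used in the snippet; I'm assuming `y` and `z` have shape (batch, n_classes), with `y` one-hot and `z` a probability output):

    import numpy as np

    epsilon = 1e-12  # same log guard as in the snippet above

    def binary_cross_entropy(y, z):
        # Treats each output unit as an independent binary problem, as in
        # the snippet: every class contributes both a log(z) term (for y=1)
        # and a log(1 - z) term (for y=0).
        per_unit = -(y * np.log(z + epsilon) + (1 - y) * np.log(1 - z + epsilon))
        return per_unit.mean()

    def categorical_cross_entropy(y, z):
        # The usual multi-class loss: only the log-probability of the true
        # class enters, and z is expected to be a softmax output that sums
        # to 1 across classes.
        per_sample = -(y * np.log(z + epsilon)).sum(axis=1)
        return per_sample.mean()

    # Tiny example: 2 samples, 3 classes.
    y = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    z = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
    print(binary_cross_entropy(y, z))       # penalizes all three units per sample
    print(categorical_cross_entropy(y, z))  # penalizes only the true class's probability

So with the binary form, a wrong class is pushed toward 0 explicitly via the log(1 - z) term, whereas the categorical form relies on softmax normalization to do that implicitly.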
