fix softmax flag behavior #639

Merged: 3 commits into deepjavalibrary:master on Feb 17, 2021

Conversation

@roywei (Contributor) commented on Feb 10, 2021

fix #520

@roywei requested a review from a team on February 10, 2021 at 19:04
@stu1130 (Contributor) left a comment

I saw that all RNN tests are failing due to wrong outputs. Maybe we need to re-calculate the expected output values.

@roywei (Contributor, Author) commented on Feb 12, 2021

Investigating the RNN test output. The loss is not supposed to be negative to begin with.
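
As a quick sanity check on why the loss should never go negative: assuming the usual dense-label softmax cross-entropy, with labels y_i >= 0 that sum to 1 over the class axis, the loss can be rewritten around the log-sum-exp term that #520 reports as missing:

```latex
L(x, y) = -\sum_i y_i \log \mathrm{softmax}(x)_i
        = \operatorname{logsumexp}(x) - \sum_i y_i x_i \;\ge\; 0,
\quad\text{because}\quad
\operatorname{logsumexp}(x) \ge \max_i x_i \ge \sum_i y_i x_i .
```

If the logsumexp term is dropped, the computed value reduces to -\sum_i y_i x_i, which has no sign constraint, which is consistent with the negative losses seen in the RNN tests.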

Change-Id: I18ee08116a7ca302a0542ef5d361a64c9e5e2227
Change-Id: I25f80a4f965e820d7d16aba515928b009d1a8b76
Change-Id: I855f71ec7f5ba30e11b3d3ca11c21937873dff6d
@roywei merged commit d879306 into deepjavalibrary:master on Feb 17, 2021
Development

Successfully merging this pull request may close these issues.

LogSumExp seems to be missing in SoftmaxCrossEntropyLoss (sparseLabel=false)
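
For reference, a minimal self-contained sketch of the log-sum-exp trick named in the linked issue, written in plain Java rather than against the DJL NDArray API; the class and method names here are illustrative only, not part of the library:

```java
// Sketch of the log-sum-exp trick and of dense-label softmax cross-entropy
// written in terms of it. Subtracting the max logit keeps exp() from overflowing.
public final class LogSumExpSketch {

    /** Numerically stable log(sum(exp(logits))). */
    static double logSumExp(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) {
            max = Math.max(max, v);
        }
        double sum = 0.0;
        for (double v : logits) {
            sum += Math.exp(v - max); // exponent is <= 0, so no overflow
        }
        return max + Math.log(sum);
    }

    /**
     * Dense (non-sparse) label softmax cross-entropy:
     * loss = logSumExp(logits) - sum_i labels[i] * logits[i].
     * Dropping the logSumExp term is what allows the value to go negative.
     */
    static double softmaxCrossEntropy(double[] logits, double[] labels) {
        double dot = 0.0;
        for (int i = 0; i < logits.length; i++) {
            dot += labels[i] * logits[i];
        }
        return logSumExp(logits) - dot;
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        double[] labels = {1.0, 0.0, 0.0}; // one-hot label expressed densely
        System.out.println(softmaxCrossEntropy(logits, labels)); // ~0.417, always >= 0
    }
}
```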