Incorrect Use of torch.no_grad() in fit_epoch Method in d2l/torch.py::Trainer::fit_epoch #2573
Hello,

I noticed a potential issue in the fit_epoch method in https://github.com/d2l-ai/d2l-en/blob/master/d2l/torch.py, where loss.backward() is called within a torch.no_grad() block.
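Here is the original code, reproduced as closely as I can from the file (the attribute and helper names below are my reconstruction of the Trainer class and may differ in minor details):

```python
def fit_epoch(self):
    # Reconstructed sketch of Trainer.fit_epoch from d2l/torch.py; minor details may differ.
    self.model.train()
    for batch in self.train_dataloader:
        loss = self.model.training_step(self.prepare_batch(batch))
        self.optim.zero_grad()
        with torch.no_grad():
            loss.backward()  # backward() is called inside the no_grad() context
            if self.gradient_clip_val > 0:
                self.clip_gradients(self.gradient_clip_val, self.model)
            self.optim.step()
        self.train_batch_idx += 1
    if self.val_dataloader is None:
        return
    self.model.eval()
    for batch in self.val_dataloader:
        with torch.no_grad():
            self.model.validation_step(self.prepare_batch(batch))
        self.val_batch_idx += 1
```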
This usage likely prevents the calculation of gradients, as loss.backward() should not be inside a torch.no_grad() block. The correct approach would be to move loss.backward() out of the torch.no_grad() context.
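For example, a sketch of the training step with the backward pass moved out of no_grad() (keeping the surrounding names from the file; gradient clipping and the optimizer step can stay under no_grad()):

```python
self.optim.zero_grad()
loss.backward()  # compute gradients with autograd enabled
with torch.no_grad():
    if self.gradient_clip_val > 0:
        self.clip_gradients(self.gradient_clip_val, self.model)
    self.optim.step()  # the parameter update does not need to be tracked by autograd
```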
Comments

I think it should be the other way around: call loss.backward() before entering the torch.no_grad() block.
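Roughly like this (a sketch only, using the same attribute names as in the file):

```python
for batch in self.train_dataloader:
    loss = self.model.training_step(self.prepare_batch(batch))
    self.optim.zero_grad()
    loss.backward()  # moved out of torch.no_grad()
    with torch.no_grad():
        if self.gradient_clip_val > 0:
            self.clip_gradients(self.gradient_clip_val, self.model)
        self.optim.step()
    self.train_batch_idx += 1
```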
Apologies for not being clear earlier. I'm uncertain about the correctness of a specific part of the code found at https://github.com/d2l-ai/d2l-en/blob/master/d2l/torch.py.
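Here is the original code for that part, as it appears in fit_epoch (reproduced as closely as I can; minor details may differ):

```python
self.optim.zero_grad()
with torch.no_grad():
    loss.backward()  # this is the call I am unsure about
    if self.gradient_clip_val > 0:
        self.clip_gradients(self.gradient_clip_val, self.model)
    self.optim.step()
```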