Motivation

Currently, `torchopt.optim` classes aren't compatible with Lightning's `configure_optimizers`. This is because Lightning doesn't consider them `Optimizable`:

```python
import torch
import torchopt
from lightning.fabric.utilities.types import Optimizable

model = torch.nn.Linear(2, 1)  # any torch model

optimizer = torchopt.Adam(model.parameters(), lr=1e-3)
isinstance(optimizer, Optimizable)  # False
```
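Lightning's `Optimizable` appears to be a runtime-checkable `Protocol`, so the `isinstance` check reduces to `hasattr` tests on the expected members. A simplified stand-in (an illustration of the mechanism, not Lightning's actual definition) is:

```python
from typing import Any, Dict, Protocol, runtime_checkable


@runtime_checkable
class OptimizableLike(Protocol):
    """Simplified stand-in for lightning.fabric.utilities.types.Optimizable."""

    defaults: Dict[Any, Any]
    state: Dict[Any, Any]

    def step(self, closure=None) -> None: ...
    def state_dict(self) -> Dict[str, Any]: ...
```

Under that reading, torchopt optimizers fail the check only because they never define `defaults` or `state`.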
For an optimizer to be `Optimizable`, it needs `defaults` and `state` attributes. If you simply do

```python
from collections import defaultdict

optimizer.defaults = {}
optimizer.state = defaultdict()
```

then `isinstance(optimizer, Optimizable)` passes and torchopt <> lightning works like a charm 😍
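To make the workaround concrete, here is a minimal sketch of returning a patched torchopt optimizer from `configure_optimizers` (the `LitModel` class and its layer sizes are made up for illustration):

```python
from collections import defaultdict

import lightning as L
import torch
import torchopt


class LitModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        optimizer = torchopt.Adam(self.parameters(), lr=1e-3)
        # Workaround from above: add the attributes Optimizable expects.
        optimizer.defaults = {}
        optimizer.state = defaultdict()
        return optimizer
```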
Solution

Can we add `defaults` and `state` attributes to the `torchopt.optim.Optimizer` class?
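Until then, the same effect could be packaged as a thin subclass. This is only a stopgap sketch, not the proposed library change; `LightningCompatibleAdam` is a made-up name, and the pass-through `**kwargs` signature is an assumption:

```python
from collections import defaultdict

import torchopt


class LightningCompatibleAdam(torchopt.Adam):
    """Hypothetical wrapper; torchopt could instead set these in Optimizer.__init__."""

    def __init__(self, params, **kwargs):
        super().__init__(params, **kwargs)
        # Empty placeholders are enough to satisfy Lightning's isinstance() check.
        self.defaults = {}
        self.state = defaultdict()
```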
Alternatives
No response
Additional context
No response