Mark some optimizer update arguments as Noneable (they were being called with Nones)
akx committed Mar 13, 2024
1 parent 3ec3dd2 commit 0c6dda0
Showing 1 changed file with 7 additions and 7 deletions.
14 changes: 7 additions & 7 deletions bitsandbytes/functional.py
@@ -1618,18 +1618,18 @@ def optimizer_update_8bit(
     g: Tensor,
     p: Tensor,
     state1: Tensor,
-    state2: Tensor,
+    state2: Optional[torch.Tensor],
     beta1: float,
     beta2: float,
     eps: float,
     step: int,
     lr: float,
     qmap1: Tensor,
-    qmap2: Tensor,
+    qmap2: Optional[torch.Tensor],
     max1: Tensor,
-    max2: Tensor,
+    max2: Optional[torch.Tensor],
     new_max1: Tensor,
-    new_max2: Tensor,
+    new_max2: Optional[torch.Tensor],
     weight_decay: float = 0.0,
     gnorm_scale: float = 1.0,
     unorm_vec: Optional[torch.Tensor] = None,
@@ -1751,16 +1751,16 @@ def optimizer_update_8bit_blockwise(
     g: Tensor,
     p: Tensor,
     state1: Tensor,
-    state2: Tensor,
+    state2: Optional[torch.Tensor],
     beta1: float,
     beta2: float,
     eps: float,
     step: int,
     lr: float,
     qmap1: Tensor,
-    qmap2: Tensor,
+    qmap2: Optional[torch.Tensor],
     absmax1: Tensor,
-    absmax2: Tensor,
+    absmax2: Optional[torch.Tensor],
     weight_decay: float = 0.0,
     gnorm_scale: float = 1.0,
     skip_zeros=False,
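For context on why the annotations were loosened: optimizers that keep only one quantized state buffer pass None for every second-state argument, so the old `Tensor`-only annotations did not match how the functions were actually called. The sketch below is illustrative only. It uses a stub with the parameter list from the blockwise hunk above rather than the real bitsandbytes kernel (which also takes the optimizer name and dispatches to CUDA code), and the tensor shapes and dtypes are made up for the example.

from typing import Optional

import torch
from torch import Tensor


def optimizer_update_8bit_blockwise_stub(
    g: Tensor,
    p: Tensor,
    state1: Tensor,
    state2: Optional[torch.Tensor],   # None for single-state optimizers
    beta1: float,
    beta2: float,
    eps: float,
    step: int,
    lr: float,
    qmap1: Tensor,
    qmap2: Optional[torch.Tensor],    # None for single-state optimizers
    absmax1: Tensor,
    absmax2: Optional[torch.Tensor],  # None for single-state optimizers
    weight_decay: float = 0.0,
    gnorm_scale: float = 1.0,
    skip_zeros=False,
) -> None:
    # Stub body: the real function hands these buffers to a CUDA kernel.
    # The only point here is that the second-state arguments may be None,
    # which is what the Optional[...] annotations in this commit express.
    assert (state2 is None) == (qmap2 is None) == (absmax2 is None)


# Hypothetical single-state call: only state1/qmap1/absmax1 buffers exist.
n = 256
optimizer_update_8bit_blockwise_stub(
    g=torch.zeros(n),
    p=torch.zeros(n),
    state1=torch.zeros(n, dtype=torch.uint8),
    state2=None,
    beta1=0.9,
    beta2=0.0,
    eps=1e-8,
    step=1,
    lr=1e-3,
    qmap1=torch.zeros(256),
    qmap2=None,
    absmax1=torch.zeros(1),
    absmax2=None,
)

Such calls already worked at runtime; the Optional annotations simply make them type-check cleanly.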
