FIX: Change check if past_key_values is empty #2106

Merged

Conversation

BenjaminBossan
Member

After transformers merged this PR:

huggingface/transformers#33703

The bool of past_key_values (a Cache instance) would change from False to True in one of our checks:

isinstance(model_kwargs["past_key_values"], transformers.Cache) and not model_kwargs["past_key_values"]

Use the get_seq_length() method instead, which behaves consistently before and after that commit.

I ran the tests with this change against transformers versions from both before and after that commit, and they passed, so the change should be backwards compatible.
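A minimal sketch of the change (the `cache_is_empty` helper and its surroundings are illustrative, not PEFT's actual code):

```python
import transformers

def cache_is_empty(model_kwargs):
    # Hypothetical helper sketching the fix described above.
    pkv = model_kwargs.get("past_key_values")
    if isinstance(pkv, transformers.Cache):
        # Old check: `not pkv` -- its result flipped once
        # huggingface/transformers#33703 changed the truthiness of Cache.
        # New check: an empty cache reports a sequence length of 0 on both
        # old and new transformers versions.
        return pkv.get_seq_length() == 0
    return pkv is None
```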

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Unrelated change: mark the X-LoRA scaling test as xfail-ing for now. This should be addressed in a separate PR; marking it xfail for now gets the original fix through CI.
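For reference, marking a test xfail with pytest looks roughly like this (the test name and reason string are illustrative, not the actual PEFT test):

```python
import pytest

@pytest.mark.xfail(reason="X-LoRA scaling issue, to be fixed in a separate PR")
def test_xlora_scaling():
    ...  # illustrative placeholder for the skipped X-LoRA scaling test
```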
@BenjaminBossan
Member Author

@zucchini-nlp Could you please review? You can ignore the test that's being skipped; it's unrelated and will be addressed in a separate PR.

Member

@zucchini-nlp zucchini-nlp left a comment


LGTM! We don't usually check the truthiness of a cache object, but I will check why it was False before. Unless it was None before, in which case that is expected when there is no cache at all.
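As an illustration of the behavior in question (assuming a `DynamicCache`; the exact bool result depends on the installed transformers version):

```python
from transformers import DynamicCache

cache = DynamicCache()
# Before huggingface/transformers#33703, an empty cache evaluated as falsy;
# after it, the same empty cache can evaluate as truthy, so `not cache` no
# longer reliably signals emptiness.
print(bool(cache))             # version-dependent: False before, True after
print(cache.get_seq_length())  # 0 for an empty cache on either version
```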

@BenjaminBossan
Member Author

Thanks for the review @zucchini-nlp. I gave you write access to the repo, as otherwise GitHub did not deem your review sufficient to allow me to merge :D

@BenjaminBossan BenjaminBossan merged commit c29810b into huggingface:main Sep 27, 2024
14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-check-past_key_values-empty branch September 27, 2024 14:17
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Oct 1, 2024