
Bring back curated_encoder prefix #34

Merged
danieldk merged 3 commits into explosion:v4 from bugfix/curated-encoder-prefix on Apr 12, 2024

Conversation

danieldk (Contributor)

Description

Curated Transformers for spaCy 3.x used to have a `curated_encoder` prefix, which we used in e.g. the discriminative learning rate schedule. Curated Transformers hasn't used such a prefix since 1.0. This PR adds a small wrapper that brings the prefix back, so that we can distinguish transformer parameters from other parameters.
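To illustrate why the prefix matters, here is a minimal sketch of how a parameter-name prefix lets a discriminative learning rate schedule tell encoder parameters apart from the rest. The function names, parameter names, and learning rate values below are assumptions for illustration, not the actual code in this PR:

```python
# Hypothetical sketch (names and values are assumptions, not the PR's
# actual implementation): a thin wrapper that re-exposes a model's
# parameters under a `curated_encoder.` prefix, so a discriminative
# learning rate schedule can match transformer parameters by name.

def with_prefix(params, prefix="curated_encoder"):
    """Prefix every parameter name, mirroring how nesting a module under
    an attribute called `curated_encoder` would prefix its names."""
    return {f"{prefix}.{name}": value for name, value in params.items()}

def learning_rate_for(name, base_lr=1e-3, encoder_lr=1e-5):
    """Assign a lower learning rate to transformer (encoder) parameters."""
    return encoder_lr if name.startswith("curated_encoder.") else base_lr

# Stand-in parameter dicts; values elided for brevity.
encoder_params = {"layer.0.weight": None, "layer.0.bias": None}
all_params = {**with_prefix(encoder_params), "classifier.weight": None}
lrs = {name: learning_rate_for(name) for name in all_params}
# lrs["curated_encoder.layer.0.weight"] == 1e-5
# lrs["classifier.weight"] == 1e-3
```

Without the prefix, the schedule would have no robust way to select transformer parameters by name alone, which is the regression this PR fixes.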

Types of change

Bugfix.

Checklist

  • I confirm that I have the right to submit this contribution under the project's MIT license.
  • I ran the tests, and all new and existing tests passed.
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

@danieldk danieldk added the bug Something isn't working label Apr 12, 2024
@danieldk danieldk requested a review from shadeMe April 12, 2024 09:25
@danieldk danieldk force-pushed the bugfix/curated-encoder-prefix branch from d4709ec to a1b36c1 on April 12, 2024 09:26
@danieldk danieldk merged commit 700d896 into explosion:v4 Apr 12, 2024
7 of 8 checks passed
@danieldk danieldk deleted the bugfix/curated-encoder-prefix branch April 12, 2024 13:45