Creates Patch #2720

Open

zachgk wants to merge 1 commit into master
Conversation

zachgk (Contributor) commented Jul 19, 2023

This PR creates the Patch concept along with some initial usages. There is a more specialized ParamPatch for the standard additive parameter patches, with Scaled, Basic, and LoRA implementations. The patches can be created directly, by comparing models, or from gradients.
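Conceptually, such an additive ParamPatch can be pictured as a map from parameter name to delta array, built by diffing two models and applied by adding the deltas back onto a base model. The sketch below uses only existing DJL API (Model, Parameter, NDArray); the class and method names (AdditivePatchSketch, diff, apply) are illustrative and not the API introduced in this PR.

```java
import ai.djl.Model;
import ai.djl.ndarray.NDArray;
import ai.djl.nn.Parameter;
import ai.djl.util.Pair;

import java.util.LinkedHashMap;
import java.util.Map;

/** Illustrative sketch of an additive parameter patch (not the actual ParamPatch API). */
public final class AdditivePatchSketch {

    /** Builds a name-to-delta map by comparing a tuned model against its base model. */
    static Map<String, NDArray> diff(Model base, Model tuned) {
        Map<String, NDArray> tunedArrays = new LinkedHashMap<>();
        for (Pair<String, Parameter> pair : tuned.getBlock().getParameters()) {
            tunedArrays.put(pair.getKey(), pair.getValue().getArray());
        }
        Map<String, NDArray> deltas = new LinkedHashMap<>();
        for (Pair<String, Parameter> pair : base.getBlock().getParameters()) {
            NDArray baseArray = pair.getValue().getArray();
            deltas.put(pair.getKey(), tunedArrays.get(pair.getKey()).sub(baseArray));
        }
        return deltas;
    }

    /** Applies the patch additively (in place), optionally scaled, onto a model. */
    static void apply(Model model, Map<String, NDArray> deltas, float scale) {
        for (Pair<String, Parameter> pair : model.getBlock().getParameters()) {
            NDArray delta = deltas.get(pair.getKey());
            if (delta != null) {
                pair.getValue().getArray().addi(delta.mul(scale));
            }
        }
    }
}
```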

This is an initial step. Following this, there are a few pieces of work that could be considered:

  1. DJL Serving Python engine specific patch implementation
  2. LoRA for full training (see the sketch after this list)
  3. Make BasicParamPatch from Optimizer (including gradients, momentum, and lr)
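For item 2, the LoRA form of such a patch is a low-rank additive delta: W' = W + (alpha / r) * B·A. Below is a minimal sketch using DJL's existing NDArray API; the shapes, rank, and scaling are illustrative defaults, not values taken from this PR.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;

public final class LoraDeltaSketch {
    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            int out = 512;
            int in = 512;
            int rank = 8; // low-rank dimension r
            float alpha = 16f; // LoRA scaling numerator

            NDArray w = manager.randomNormal(new Shape(out, in)); // frozen base weight
            NDArray a = manager.randomNormal(new Shape(rank, in)); // low-rank factor A
            NDArray b = manager.zeros(new Shape(out, rank)); // low-rank factor B, zero-initialized

            // The additive parameter patch for W: delta = (alpha / r) * (B x A)
            NDArray delta = b.matMul(a).mul(alpha / rank);
            NDArray patched = w.add(delta); // W' = W + delta

            System.out.println(patched.getShape()); // (512, 512)
        }
    }
}
```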

Additionally, I included some changes to the IntegrationTest. I ran into an annoying issue where I had made the tests private, which prevented them from being run through IntegrationTest. Worse, the resulting exceptions carried no cause, so they printed no message and gave no feedback through println or a logger. The tests still ran fine in IntelliJ, so the problem only showed up through Gradle. After this change, a clear exception message is provided, which makes this easy to debug in the future.
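For reference, a sketch of the failure mode: when a test method is invoked reflectively, the useful failure lives in the cause of the InvocationTargetException, and a private method fails earlier with IllegalAccessException. If those are rethrown without a cause or message, Gradle prints nothing actionable. The runner class and messages below are illustrative, not the actual IntegrationTest code.

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

/** Illustrative reflective test invocation that preserves the real failure as a cause. */
public final class ReflectiveTestRunnerSketch {

    static void runTest(Object testInstance, Method testMethod) {
        try {
            testMethod.invoke(testInstance); // fails if the test method is private
        } catch (IllegalAccessException e) {
            throw new IllegalStateException(
                    "Cannot invoke test method " + testMethod.getName() + "; is it public?", e);
        } catch (InvocationTargetException e) {
            // Rethrow with the real test failure attached so its message and stack
            // trace show up in the Gradle output instead of a silent failure.
            throw new IllegalStateException(
                    "Test " + testMethod.getName() + " failed", e.getCause());
        }
    }
}
```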

@codecov-commenter

Codecov Report

Patch coverage: 55.15% and project coverage change: +0.04% 🎉

Comparison is base (bb5073f) 72.08% compared to head (8fa5e0e) 72.13%.
Report is 880 commits behind head on master.


Additional details and impacted files
@@             Coverage Diff              @@
##             master    #2720      +/-   ##
============================================
+ Coverage     72.08%   72.13%   +0.04%     
- Complexity     5126     7107    +1981     
============================================
  Files           473      706     +233     
  Lines         21970    31599    +9629     
  Branches       2351     3265     +914     
============================================
+ Hits          15838    22795    +6957     
- Misses         4925     7252    +2327     
- Partials       1207     1552     +345     
Files Changed Coverage Δ
api/src/main/java/ai/djl/modality/cv/Image.java 69.23% <ø> (-4.11%) ⬇️
...rc/main/java/ai/djl/modality/cv/MultiBoxPrior.java 76.00% <ø> (ø)
.../main/java/ai/djl/modality/cv/output/Landmark.java 100.00% <ø> (ø)
...djl/modality/cv/transform/RandomFlipLeftRight.java 25.00% <0.00%> (-25.00%) ⬇️
...djl/modality/cv/transform/RandomFlipTopBottom.java 25.00% <0.00%> (-25.00%) ⬇️
...i/djl/modality/cv/translator/BigGANTranslator.java 21.42% <0.00%> (-5.24%) ⬇️
.../modality/cv/translator/ImageFeatureExtractor.java 0.00% <0.00%> (ø)
.../ai/djl/modality/cv/translator/YoloTranslator.java 27.77% <0.00%> (+18.95%) ⬆️
...ain/java/ai/djl/modality/cv/util/NDImageUtils.java 67.10% <0.00%> (+7.89%) ⬆️
api/src/main/java/ai/djl/modality/nlp/Decoder.java 63.63% <ø> (ø)
... and 225 more

... and 377 files with indirect coverage changes

