revert pyro training plan default optimizer #1101

Merged · 2 commits merged into master from pyro_optim on Jul 15, 2021
Conversation

adamgayoso (Member) commented on Jul 15, 2021

Reverts the changes to the default Pyro optimizer in the training plan.
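For context, a minimal sketch of the pattern at issue: a training plan that falls back to a default Pyro optimizer when the caller does not supply one. The class name, argument names, and learning-rate value below are illustrative assumptions, not the actual code in scvi/train/_trainingplans.py.

# Sketch only: defaulting to pyro.optim.Adam when no optimizer is passed in.
from pyro.optim import Adam


class PyroTrainingPlanSketch:
    """Illustrative wrapper; names and defaults are assumptions, not the scvi-tools API."""

    def __init__(self, pyro_module, optim=None, optim_kwargs=None):
        self.module = pyro_module
        # Hypothetical default hyperparameters; 1e-3 is an assumption.
        kwargs = dict(optim_kwargs) if optim_kwargs is not None else {}
        kwargs.setdefault("lr", 1e-3)
        # Fall back to pyro.optim.Adam if the caller did not provide an optimizer.
        self.optim = optim if optim is not None else Adam(kwargs)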

codecov bot commented on Jul 15, 2021

Codecov Report

Merging #1101 (eea41b3) into master (4bc2b81) will decrease coverage by 0.00%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #1101      +/-   ##
==========================================
- Coverage   90.65%   90.64%   -0.01%     
==========================================
  Files          91       91              
  Lines        6910     6908       -2     
==========================================
- Hits         6264     6262       -2     
  Misses        646      646              
Impacted Files                   Coverage Δ
scvi/train/_trainingplans.py     95.42% <ø> (-0.03%) ⬇️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 4bc2b81...eea41b3.

adamgayoso changed the title from "change pyro training plan default optimizer" to "revert pyro training plan default optimizer" on Jul 15, 2021
adamgayoso merged commit ba71d43 into master on Jul 15, 2021
adamgayoso deleted the pyro_optim branch on Jul 15, 2021 at 15:59