[Relay] [Pass] Add mixed precision (e.g. FP16) model conversion pass (apache#8069)

* Initial skeleton for fp16 pass:
  - initial green, gray, and red lists
  - move fp16 conversion to its own folder
  - second pass example; split the files up a bit more
  - initial transform pass
* Working Python version of fp16 pass:
  - fix topi conv2d not casting kernel to output type
  - working ResNet, but conv2d topi intrinsics need work
  - tests for ResNet; add more tests, extend coverage for converter
  - update tests, ensure red ops convert back to fp32
  - clean up code a bit; simplify fp16 output dtype examination
  - fix pass, update tests, initial coloring
* Rewrite Python passes in C++:
  - inspect arg fields
  - add propagate-colors pass
  - private -> public inheritance
  - rewrite draft of the full transformation in C++
  - remove prints; wrap the fp16 pass properly
  - insert extra cast to pass type checking
  - fix previously broken test by removing a cast in the wrong scenario
  - remove old Python files
* Extend support to things besides CallNodes, e.g. tuples and lets:
  - invalidate typing instead of casting to fp32
  - add basic tests; skeleton code
  - casting based on checked types; working let statements
  - add more ops; handle functions more generally
  - add multiply; fix broken case
  - support TupleNodes properly; move hash function for datatypes into data_type.h
  - update simple let test with structural expectation
  - cleanup; remove old file
* Rewrite how and when casting is done by checking types directly.
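The green/gray/red op lists above drive the core decision the pass makes at every call: which dtype an op's output should take. A minimal, self-contained sketch of that coloring idea (not the actual TVM implementation; op names, category values, and the `output_dtype` helper are illustrative assumptions) might look like:

```python
# Hypothetical sketch of the green/gray/red ("allow/follow/deny") coloring
# described above. Ops on the ALLOW list always compute in the mixed type,
# ops on the DENY list stay in fp32, and FOLLOW ops take the mixed type
# only when all of their inputs already carry it.

MIXED_PRECISION_ALLOW = "allow"    # e.g. conv2d, dense: compute in fp16
MIXED_PRECISION_FOLLOW = "follow"  # e.g. add, concatenate: follow inputs
MIXED_PRECISION_DENY = "deny"      # e.g. softmax, exp: keep in fp32

# Toy per-op category table (illustrative, not TVM's real lists).
OP_CATEGORY = {
    "conv2d": MIXED_PRECISION_ALLOW,
    "dense": MIXED_PRECISION_ALLOW,
    "add": MIXED_PRECISION_FOLLOW,
    "softmax": MIXED_PRECISION_DENY,
}

def output_dtype(op, input_dtypes, mixed_type="float16"):
    """Pick the output dtype for `op` given the dtypes of its inputs."""
    category = OP_CATEGORY.get(op, MIXED_PRECISION_FOLLOW)
    if category == MIXED_PRECISION_ALLOW:
        return mixed_type
    if category == MIXED_PRECISION_DENY:
        return "float32"
    # FOLLOW: use the mixed type only if every input already has it.
    if input_dtypes and all(d == mixed_type for d in input_dtypes):
        return mixed_type
    return "float32"
```

Once each call's output dtype is chosen this way, the pass only needs to insert casts where an argument's dtype disagrees with what the op expects.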
  - add support for GPT-2 and BERT
  - add more comments; new single-pass version
  - formatting; make many things const references
  - clean up tests; more cleanup and comments
* Linting and formatting; add ASF/AST header; remove resolved TODOs; fix lint errors
* Remove i386-incompatible features; set seed; trigger CI again
* Address initial review comments (Animesh, Matthew, Cody):
  - mutate attributes only if they were originally floats
  - add comment on hashing strategy; add missing `;`
  - handle edge case when mutating attrs
  - add test to show green-red casting works
  - remove np.random seed from each test
* Generalize from fp16 to arbitrary mixed types:
  - remove references to fp16 types in favor of generic mixed types
  - rename RED, GREEN, GRAY to MIXED_PRECISION_ALLOW, etc.
  - skeleton for supporting arbitrary mixed types; tests
  - add fp64 structural test
* Use MixedModeMutator; rename the pass to ToMixedPrecision (amp.cc); rename tests to match the transform; clean up typos
* Caching and op registration:
  - don't insert into the cache when dtypes are equal
  - new, cleaner Python interface for registering ops
* Cleanup and review follow-ups:
  - make a copy of attributes; pylint fixes
  - apply nits from code review (comaniac)
  - rename cast_node_cache -> cast_node_cache_; class fields end with `_`
  - add check for returned values; better error message; better error reporting via a single flag
  - set priority to 0; fix default behavior
  - docstring for the pass in Python; class docstring; correct docstring
  - add comment on the accumulation-dtype hack; ADT warnings; add TODO; fix linter

Co-authored-by: Cody Yu <[email protected]>
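The "don't insert into cache when dtypes equal" item refers to memoizing inserted cast nodes so the same expression is not cast twice to the same dtype. A minimal sketch of that idea under assumed names (a toy `Expr`/`Cast` node pair and a `CastCache` class, none of which are TVM's real types) could be:

```python
# Hypothetical sketch of the cast-node cache (cast_node_cache_) mentioned
# above. Casts are memoized per (expression identity, target dtype); when
# an expression already has the target dtype, it is returned unchanged and
# nothing is inserted into the cache.

class Expr:
    """Toy expression node carrying a name and a dtype."""
    def __init__(self, name, dtype):
        self.name, self.dtype = name, dtype

class Cast(Expr):
    """Toy cast node wrapping another expression."""
    def __init__(self, value, dtype):
        super().__init__(f"cast({value.name}, {dtype})", dtype)
        self.value = value

class CastCache:
    def __init__(self):
        self._cache = {}  # (id(expr), target dtype) -> Cast node

    def cast(self, expr, dtype):
        if expr.dtype == dtype:
            return expr  # already the right dtype: no cast, no cache entry
        key = (id(expr), dtype)
        if key not in self._cache:
            self._cache[key] = Cast(expr, dtype)
        return self._cache[key]
```

Reusing one cast node per (expression, dtype) pair keeps the rewritten graph small and lets later passes treat repeated uses of the cast as a shared subexpression.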