
Fix GPT2 UT for transformers==2.8.0 #478

Merged: 57 commits merged into onnx:master from gpt2_test on May 8, 2020
Conversation

@jiafatom (Collaborator) commented May 7, 2020

The GPT2 UT passes with transformers==2.5.0 on my local dev environment. The nightly build uses transformers 2.8.0, which causes an output shape mismatch against the Keras prediction (model.predict). This PR fixes it (and it still works with 2.5.0).

The ONNX inference result matches model.predict, i.e., a list. However, model(inputs) needs postprocessing, because Keras returns a single EagerTensor as the final result whose internals are a tuple of EagerTensors.
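
For reference, here is a minimal sketch of the comparison described above. It is not the PR's actual test code: the model path gpt2.onnx, the exact tokenizer call, and the assumption that the ONNX Runtime output order matches the flattened Keras outputs are all illustrative.

# Minimal sketch of comparing ONNX Runtime output with Keras GPT2 output.
# Assumes transformers 2.x-style APIs and a model already converted to
# gpt2.onnx (e.g. via keras2onnx); names and paths here are placeholders.
import numpy as np
import tensorflow as tf
import onnxruntime as ort
from transformers import GPT2Tokenizer, TFGPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2Model.from_pretrained("gpt2")
input_ids = tokenizer.encode("Hello, my dog is cute", return_tensors="tf")

# model.predict returns plain numpy arrays (a list), directly comparable
# with the ONNX Runtime results.
expected = model.predict(input_ids)

# model(inputs) returns EagerTensors, with the internals nested as a tuple
# of EagerTensors, so flatten and convert to numpy before comparing.
raw_outputs = model(input_ids)
expected_from_call = [t.numpy() for t in tf.nest.flatten(raw_outputs)]

sess = ort.InferenceSession("gpt2.onnx")
onnx_outputs = sess.run(None, {sess.get_inputs()[0].name: input_ids.numpy()})

# Compare, assuming the ONNX output order matches the flattened Keras outputs.
for keras_out, onnx_out in zip(tf.nest.flatten(expected), onnx_outputs):
    np.testing.assert_allclose(np.asarray(keras_out), onnx_out, rtol=1e-3, atol=1e-3)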

jiafatom and others added 30 commits, starting February 26, 2020 11:38
…nnx#394)

* Efficient-net test cases.

* Add DepthwiseConv2d to subclassed model.

* disable efficient net test cases.
@jiafatom jiafatom requested a review from wenbingl May 7, 2020 16:44
@jiafatom jiafatom merged commit 022fb2a into onnx:master May 8, 2020
@jiafatom jiafatom deleted the gpt2_test branch May 8, 2020 02:03