Max supported opset is very old #296

Open
addisonklinke opened this issue Jul 10, 2024 · 5 comments

@addisonklinke

DEFAULT_OPSET_NUMBER is currently 15 and was last updated in Nov 2021. Per the official versioning table, that corresponds to a maximum of onnx==1.10.2, which is 6 minor versions behind the latest 1.16.0. Additionally, 1.10.2 only has wheels for Python <=3.9, which reaches EOL in Oct 2025, so it is difficult to use in a more modern environment.
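
For context, a minimal sketch (not part of the original report) of how the gap can be checked locally; it assumes DEFAULT_OPSET_NUMBER is importable from onnxconverter_common.onnx_ex and uses onnx.defs.onnx_opset_version() for the installed package's maximum opset:

```python
# Compare the opset pinned by onnxconverter-common with the max opset of the
# installed onnx package. The import path of DEFAULT_OPSET_NUMBER is assumed.
import onnx
from onnxconverter_common.onnx_ex import DEFAULT_OPSET_NUMBER

latest_opset = onnx.defs.onnx_opset_version()  # max opset for the default domain
print(f"onnx {onnx.__version__} supports up to opset {latest_opset}")
print(f"onnxconverter-common default opset: {DEFAULT_OPSET_NUMBER}")
```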

What are the limitations in upgrading this, and why is it lagging so far behind the onnx releases?

@MaanavD
Collaborator

MaanavD commented Jul 11, 2024

Hey, I'm assuming you're referencing the torch.onnx.export() API. If you can, try torch.onnx.export(..., dynamo=True); you might have a better outcome!
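
A minimal sketch of that suggestion (assuming a recent torch release where torch.onnx.export accepts the dynamo flag; the model and input below are made up for illustration):

```python
import torch

class TinyModel(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x) + 1.0

model = TinyModel().eval()
example_input = torch.randn(1, 4)

# dynamo=True selects the newer dynamo-based exporter instead of the
# legacy TorchScript-based exporter.
torch.onnx.export(model, (example_input,), "tiny_model.onnx", dynamo=True)
```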

MaanavD self-assigned this Jul 11, 2024
@addisonklinke
Author

Actually, my use case is in PySpark, not torch. From my understanding, DEFAULT_OPSET_NUMBER is internal to onnxconverter-common, so any third-party library that uses it for conversion would be limited to that opset.
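
For illustration, a hedged sketch of what an explicit opset request could look like for a Spark ML model via onnxmltools; the tiny training data, the target_opset value, and whether a request above DEFAULT_OPSET_NUMBER is actually honored are all assumptions (the last point is exactly what this issue is about):

```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors
import onnx
from onnxmltools import convert_sparkml
from onnxmltools.convert.common.data_types import FloatTensorType

spark = SparkSession.builder.master("local[1]").getOrCreate()
df = spark.createDataFrame(
    [(Vectors.dense([0.0, 1.0]), 0), (Vectors.dense([1.0, 0.0]), 1),
     (Vectors.dense([0.5, 0.5]), 0), (Vectors.dense([2.0, 1.5]), 1)],
    ["features", "label"],
)
model = LogisticRegression(featuresCol="features", labelCol="label").fit(df)

onnx_model = convert_sparkml(
    model,
    name="spark_logreg",
    initial_types=[("features", FloatTensorType([None, 2]))],
    target_opset=17,  # request a newer opset than the library default
)
onnx.save_model(onnx_model, "spark_logreg.onnx")
```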

@addisonklinke
Author

@MaanavD any direction for how we could bump the opset version?

@MaanavD
Collaborator

MaanavD commented Jul 31, 2024

@addisonklinke going to add @gramalingam to this thread, I think he'd know best!

@gramalingam

@xadupre knows more about these converters.
