ERROR: Your GPU does not support Int8 Matmul! #97
Same problem with my Nvidia Tesla P40.
$ nvcc -V
I was under the impression the P40 supported all the int8 operations. It has CUDA Compute Capability 6.1, which is higher than the 3.7 offered by the K80. Is this a software bug, or does the Tesla P40 really not offer int8 matmul?
The Int8 Matmul feature relied on int8 tensor cores, so your GPUs were not supported. This changed with commit de53588, which has already been pushed to pip in the latest 0.37.0 release. If you update, it should work for you now. Please reopen this issue if you encounter any problems.
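Since the error above comes from gating int8 matmul on hardware support, a minimal sketch of such a capability check may clarify why the P40 was rejected. The function name and the 7.5 threshold are assumptions for illustration (int8 tensor cores arrived with the Turing generation); this is not bitsandbytes' actual code.

```python
# Hypothetical sketch of a compute-capability gate for int8 matmul.
# Assumption: int8 tensor cores require CUDA compute capability >= 7.5
# (Turing). This mirrors the kind of check the error message implies;
# it is not the real bitsandbytes implementation.

def supports_int8_tensor_cores(capability: tuple) -> bool:
    """Return True if a GPU with (major, minor) compute capability
    has int8 tensor cores (assumed threshold: 7.5)."""
    return capability >= (7, 5)

# Tesla P40 reports compute capability 6.1 -> below the assumed threshold,
# hence the "Your GPU does not support Int8 Matmul!" error.
print(supports_int8_tensor_cores((6, 1)))  # False
print(supports_int8_tensor_cores((7, 5)))  # True
print(supports_int8_tensor_cores((8, 0)))  # True
```

After the fix in de53588, older GPUs fall back to a supported code path instead of failing this kind of check outright.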
I have updated to version 0.38.1, but this problem still exists.
Having the same issue here.