NotImplementedError running HF model "mlfoundations/dclm-7b-it" for inference #303
Comments
This is usually an xformers issue. The main problem is that xformers doesn't run on CPU, so the quick short-term fix is to make sure you send all your models and tensors to the device/GPU. That should resolve the error. The long-term solution here would probably be to get rid of xformers entirely. You can do this locally by setting …
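A minimal sketch of that short-term workaround (sending both the model and the input tensors to the GPU); the `open_lm.hf` import and the generation settings are assumptions based on the DCLM model cards, not something confirmed in this thread:

```python
import torch
from open_lm.hf import *  # assumption: registers OpenLM's config/model classes with transformers
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda"  # xformers attention kernels only run on GPU, hence the NotImplementedError on CPU

tokenizer = AutoTokenizer.from_pretrained("mlfoundations/dclm-7b-it")
model = AutoModelForCausalLM.from_pretrained(
    "mlfoundations/dclm-7b-it", torch_dtype=torch.bfloat16
).to(device)  # send the model to the GPU ...

inputs = tokenizer("What is the capital of France?", return_tensors="pt").to(device)  # ... and the inputs
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```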
I am trying to use the HF model "mlfoundations/dclm-7b-it" for inference, simply using the code below:
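The snippet itself does not appear above; a plausible minimal version, assuming the plain transformers usage from the DCLM model card (loaded on CPU, which is what triggers the error described below):

```python
from open_lm.hf import *  # assumption: makes the OpenLM classes visible to transformers' Auto* factories
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("mlfoundations/dclm-7b-it")
model = AutoModelForCausalLM.from_pretrained("mlfoundations/dclm-7b-it")  # stays on CPU by default

inputs = tokenizer("What is the capital of France?", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)  # raises NotImplementedError without a GPU
print(tokenizer.decode(output[0], skip_special_tokens=True))
```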
I see this warning when loading the model:
Some weights of OpenLMForCausalLM were not initialized from the model checkpoint at mlfoundations/dclm-7b-it and are newly initialized: [...]
And I get NotImplementedError:
I have also tried model = AutoModel.from_pretrained("mlfoundations/dclm-7b-it"), but this model class also fails with ValueError: Unrecognized configuration class. Which model class should I use here?