ZeroDivisionError: integer division or modulo by zero #166
Hello,

I have encountered a "division by zero" error while attempting to run `generator.py` on my Windows machine. I have made the necessary modifications to bitsandbytes and added the required .dll files, but the error persists. I would appreciate any assistance in identifying the cause of this issue.
Error:
```
ZeroDivisionError                         Traceback (most recent call last)
Cell In[1], line 36
     29 if device == "cuda":
     30     model = LlamaForCausalLM.from_pretrained(
     31         BASE_MODEL,
     32         load_in_8bit=LOAD_8BIT,
     33         torch_dtype=torch.float16,
     34         device_map="auto",
     35     )
---> 36     model = PeftModel.from_pretrained(
     37         model,
     38         LORA_WEIGHTS,
     39         torch_dtype=torch.float16,
     40     )
     41 elif device == "mps":
     42     model = LlamaForCausalLM.from_pretrained(
     43         BASE_MODEL,
     44         device_map={"": device},
     45         torch_dtype=torch.float16,
     46     )

File c:\Users\walee\miniconda3\lib\site-packages\peft\peft_model.py:167, in PeftModel.from_pretrained(cls, model, model_id, **kwargs)
    165     no_split_module_classes = model._no_split_modules
    166     if device_map != "sequential":
...
    457     # - the size of no split block (if applicable)
    458     # - the mean of the layer sizes
    459     if no_split_module_classes is None:

ZeroDivisionError: integer division or modulo by zero
```
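For context, this kind of `ZeroDivisionError` typically surfaces when an automatic device-map planner divides the model's memory footprint by the number of visible accelerator devices, and that count is zero (e.g. CUDA is not actually detected on the Windows machine despite the `device == "cuda"` branch being taken). The sketch below is a hypothetical, minimal reproduction of that failure mode — the function name is invented and the real logic lives inside the PEFT/accelerate device-map code, not here:

```python
def plan_layers_per_device(num_layers: int, num_devices: int) -> int:
    """Hypothetical sketch: split model layers evenly across devices.

    A planner like this divides work by the device count; if no GPU is
    visible, num_devices is 0 and the integer division raises
    ZeroDivisionError — the same exception class seen in the traceback.
    """
    return num_layers // num_devices


# Normal case: two visible GPUs share the layers.
print(plan_layers_per_device(32, 2))  # → 16

# Failure case: zero visible devices reproduces the error class.
try:
    plan_layers_per_device(32, 0)
except ZeroDivisionError as exc:
    print("reproduced:", exc)
```

If this is the cause, one thing worth checking before loading the model is whether `torch.cuda.is_available()` actually returns `True` on the machine; if it does not, forcing CPU placement (for example `device_map={"": "cpu"}`) may avoid the automatic planner entirely. Both suggestions are guesses based on the traceback, not a confirmed fix.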