Commit: nits
kartikayk committed Mar 24, 2024
1 parent 917f926 commit 2a06b8f
Showing 4 changed files with 12 additions and 6 deletions.
7 changes: 5 additions & 2 deletions recipes/configs/mistral/7B_full.yaml
@@ -1,3 +1,6 @@
+# This config is currently a WIP. Use it with caution
+
+
 # Tokenizer
 tokenizer:
   _component_: torchtune.models.mistral.mistral_tokenizer
@@ -16,13 +19,13 @@ model:
 
 checkpointer:
   _component_: torchtune.utils.FullModelHFCheckpointer
-  checkpoint_dir: /data/users/kartikayk/cpts/Mistral-7B-v0.1
+  checkpoint_dir: /tmp/Mistral-7B-v0.1
   checkpoint_files: [
     pytorch_model-00001-of-00002.bin,
     pytorch_model-00002-of-00002.bin
   ]
   recipe_checkpoint: null
-  output_dir: /data/users/kartikayk/cpts/Mistral-7B-v0.1
+  output_dir: /tmp/Mistral-7B-v0.1
   model_type: LLAMA2
   resume_from_checkpoint: False
 
5 changes: 3 additions & 2 deletions recipes/full_finetune_distributed.py
@@ -59,7 +59,8 @@ class FullFinetuneRecipeDistributed(FTRecipeInterface):
     The following configs can be used to run this recipe:
         >>> tune ls
         RECIPE                          CONFIG
-        full_finetune_distributed       llama2/7B_full, llama2/13B_full
+        full_finetune_distributed       llama2/7B_full
+                                        llama2/13B_full
 
     Args:
         cfg (DictConfig): OmegaConf object parsed from yaml file
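For context (not part of this diff): each CONFIG entry is what gets passed to the recipe at launch. The exact launch syntax has varied across torchtune versions, so treat the following as a hedged example in the docstring's own style rather than the canonical command:

    >>> tune run full_finetune_distributed --config llama2/7B_full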
@@ -81,7 +82,7 @@ def __init__(self, cfg: DictConfig) -> None:
         # logging attributes
         self._output_dir = cfg.output_dir
         self._log_every_n_steps = cfg.log_every_n_steps if cfg.log_every_n_steps else 1
-        self._log_peak_memory_every_n_steps = 10
+        self._log_peak_memory_every_n_steps = 100
 
         # _is_rank_zero is used primarily for logging. In the future, the logger
         # should directly take care of this
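The change above makes peak-memory logging fire every 100 steps instead of every 10. An illustrative sketch of the kind of gating this attribute enables, written against plain PyTorch CUDA memory APIs rather than torchtune's actual logging helper (assumes a CUDA device and a per-rank step counter; the function name is hypothetical):

```python
import torch

def maybe_log_peak_memory(step: int, every_n: int = 100) -> None:
    # Log at most once every `every_n` steps so logging overhead stays negligible.
    if step % every_n == 0 and torch.cuda.is_available():
        peak_gib = torch.cuda.max_memory_allocated() / 1024**3
        print(f"step {step}: peak GPU memory {peak_gib:.2f} GiB")
        torch.cuda.reset_peak_memory_stats()  # start a fresh measurement window
```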
3 changes: 2 additions & 1 deletion recipes/lora_finetune_distributed.py
@@ -60,7 +60,8 @@ class LoRAFinetuneRecipeDistributed(FTRecipeInterface):
     The following configs can be used to run this recipe:
        >>> tune ls
        RECIPE                          CONFIG
-       lora_finetune_distributed       llama2/7B_lora, llama2/13B_lora
+       lora_finetune_distributed       llama2/7B_lora
+                                       llama2/13B_lora
 
     Args:
         cfg (DictConfig): OmegaConf object parsed from yaml file
3 changes: 2 additions & 1 deletion recipes/lora_finetune_single_device.py
@@ -53,7 +53,8 @@ class LoRAFinetuneRecipeSingleDevice(FTRecipeInterface):
     The following configs can be used to run this recipe:
        >>> tune ls
        RECIPE                          CONFIG
-       lora_finetune_single_device     llama2/7B_lora_single_device, llama2/7B_qlora_single_device
+       lora_finetune_single_device     llama2/7B_lora_single_device
+                                       llama2/7B_qlora_single_device
 
     Args:
         cfg (DictConfig): OmegaConf object parsed from yaml file
