Commit
added
shaheennabi committed Dec 2, 2024
1 parent b23b506 commit a9710d1
Showing 1 changed file with 16 additions and 0 deletions.
16 changes: 16 additions & 0 deletions src/finetuning/config/lora_params.yaml
@@ -0,0 +1,16 @@
lora_params:
  r: 16                                   # LoRA rank
  target_modules:
    - "q_proj"
    - "k_proj"
    - "v_proj"
    - "o_proj"
    - "gate_proj"
    - "up_proj"
    - "down_proj"
  lora_alpha: 16
  lora_dropout: 0                         # Optimized at 0
  bias: "none"                            # No additional bias terms
  use_gradient_checkpointing: "unsloth"   # Gradient checkpointing to save memory
  random_state: 3407
  use_rslora: false                       # Rank-stabilized LoRA, can be enabled for stability
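The config above is presumably consumed by Unsloth's `FastLanguageModel.get_peft_model` during fine-tuning setup. The sketch below mirrors the YAML as a plain Python dict (to stay dependency-free) and computes the effective LoRA scaling factor `lora_alpha / r`; the `get_peft_model` call is shown only as a commented assumption, since the model object and loading code live elsewhere in the repository.

```python
# Python mirror of src/finetuning/config/lora_params.yaml (illustrative only;
# in the real pipeline this would be parsed from the YAML file).
lora_params = {
    "r": 16,                                  # LoRA rank
    "target_modules": [
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
    "lora_alpha": 16,
    "lora_dropout": 0,                        # optimized at 0
    "bias": "none",                           # no additional bias terms
    "use_gradient_checkpointing": "unsloth",  # saves memory during training
    "random_state": 3407,
    "use_rslora": False,                      # rank-stabilized LoRA disabled
}

# With alpha == r, LoRA updates are applied at scale alpha / r = 1.0.
scaling = lora_params["lora_alpha"] / lora_params["r"]
print(scaling)  # 1.0

# Assumed usage with Unsloth (kept as a comment so the sketch runs standalone):
# model = FastLanguageModel.get_peft_model(model, **lora_params)
```

Targeting all seven projection matrices (attention plus the MLP gate/up/down projections) is the common full-coverage choice for LoRA on LLaMA-style architectures.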
