A LoRA adapter is tied to a specific base model architecture: a LoRA trained on Llama 3 8B will not work on Mistral 7B, because the adapter's low-rank weight matrices are shaped for the base model's specific layer dimensions and module names. Train on the exact model you plan to run.
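To illustrate, here is a minimal sketch using the Hugging Face `transformers` and `peft` libraries; the adapter path is a hypothetical placeholder:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the exact base model the adapter was trained against.
base = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B")

# Attach the LoRA adapter. Pointing this at a different architecture
# (e.g. Mistral 7B) would fail or produce garbage, since the adapter's
# weights only line up with Llama 3 8B's layers.
model = PeftModel.from_pretrained(base, "path/to/my-llama3-lora")  # hypothetical path
```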