All Tags

#lora

3 posts tagged with "lora"

Switching LoRA Adapters at Runtime

S-LoRA enables switching adapters in ~10ms without reloading the base model. One deployment serves hundreds of customizations.
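S-LoRA's batched serving stack is its own system, but the core idea, keeping one base model resident in memory and activating small per-request adapter weights, can be sketched with Hugging Face PEFT's multi-adapter API. The adapter names and paths below are placeholders:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model once; it stays resident across adapter switches.
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Attach one adapter, then register more by name.
# "customer-a" / "customer-b" and their paths are hypothetical.
model = PeftModel.from_pretrained(base, "adapters/customer-a",
                                  adapter_name="customer-a")
model.load_adapter("adapters/customer-b", adapter_name="customer-b")

# Switching swaps in a different set of small LoRA weights;
# the multi-gigabyte base weights are never reloaded.
model.set_adapter("customer-b")
```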

What Actually Works with LoRA

LoRA tutorials make it look easy. Production LoRA requires learning rate adjustments, layer selection, rank tuning, and careful validation. Here's what actually works.
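As a rough illustration of the knobs that post covers, here is what rank, layer selection, and learning-rate choices look like in a PEFT training config. The values are placeholders, not recommendations:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

config = LoraConfig(
    r=16,                                 # rank: capacity of the update matrices
    lora_alpha=32,                        # scaling; effective step size tracks alpha/r
    target_modules=["q_proj", "v_proj"],  # layer selection: which projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
peft_model = get_peft_model(base_model, config)

# LoRA typically tolerates (and often needs) a higher learning rate than
# full fine-tuning, e.g. 1e-4 rather than 1e-5 -- validate, don't assume.
```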

When LoRA Makes Sense

Full fine-tuning updates billions of parameters. LoRA updates millions. That 0.1% of the parameters can capture 80% of the adaptation. Know when that's enough.
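The 0.1% figure is easy to sanity-check with back-of-the-envelope arithmetic. A sketch for a hypothetical 7B-parameter model with rank-8 adapters on the attention projections (the shapes are illustrative):

```python
# Hypothetical Llama-7B-like shapes: 32 layers, hidden size 4096.
layers, hidden, rank = 32, 4096, 8
matrices_per_layer = 4  # q, k, v, o projections

# Each adapted d x d matrix adds B (d x r) plus A (r x d) parameters.
per_matrix = rank * (hidden + hidden)               # 65,536
lora_params = layers * matrices_per_layer * per_matrix

print(f"LoRA params: {lora_params:,}")              # 8,388,608
print(f"Fraction of 7B: {lora_params / 7e9:.2%}")   # ~0.12%
```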