
Extra lora fix #63

Merged: 4 commits merged into main from extra-lora-fix on Dec 19, 2024
Conversation

daanelson (Collaborator)

We currently have a bug in our hotswap loras: if a user passes a given extra_lora for a prediction and then removes that extra_lora, but otherwise doesn't change lora_scale or replicate_weights, the extra_lora persists.

Changing lora_scale or replicate_weights, or adding a new extra_lora, will reset the weights (assuming the request maps to the instance where the stale extra_lora is loaded), but this is still the very definition of a weird, annoying bug.

The fix below explicitly monitors for changes to extra_lora and handles them the same way changes to lora_weights are handled. It also covers the case where extra_lora_weights are passed in but lora_weights aren't; that's an edge case, but worth handling.
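
For illustration, here's a minimal sketch of the kind of bookkeeping described above. The class, attribute, and helper names (`Predictor`, `loaded_lora`, `loaded_extra_lora`, `load_loras`, `unload_loras`) are assumptions for this sketch, not the repo's actual API:

```python
# Minimal sketch of hot-swap lora tracking, assuming hypothetical
# load_loras/unload_loras helpers; not the repo's actual implementation.
class Predictor:
    def setup(self):
        self.loaded_lora = None
        self.loaded_extra_lora = None

    def unload_loras(self):
        # placeholder: drop any loras currently merged into the model
        self.loaded_lora = None
        self.loaded_extra_lora = None

    def load_loras(self, lora_weights, extra_lora, lora_scale):
        # placeholder: download and merge the requested loras
        self.loaded_lora = lora_weights
        self.loaded_extra_lora = extra_lora

    def handle_loras(self, lora_weights, extra_lora, lora_scale):
        # Track extra_lora alongside lora_weights so that removing the
        # extra lora (or passing extra_lora without lora_weights) also
        # triggers a reload instead of silently reusing stale weights.
        lora_changed = lora_weights != self.loaded_lora
        extra_changed = extra_lora != self.loaded_extra_lora

        if lora_weights is None and extra_lora is None:
            self.unload_loras()
        elif lora_changed or extra_changed:
            self.unload_loras()
            self.load_loras(lora_weights, extra_lora, lora_scale)
```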

@daanelson daanelson requested a review from a team December 19, 2024 00:51
@daanelson daanelson merged commit 7807dd3 into main Dec 19, 2024
2 of 3 checks passed
@fofr fofr deleted the extra-lora-fix branch December 19, 2024 16:14