fixed cr load lora loaded_lora deletion #140
Open
Deleting `self.loaded_lora` effectively removes the attribute from the instance. In most cases this is not an issue, since the following lines reassign it with the newly loaded LoRA.

But if `self.loaded_lora` was not `None`, different from `lora_path`, and `lora_path` is an invalid path, the deletion happens but no new value is assigned to `self.loaded_lora`, because `comfy.utils.load_torch_file()` fails and crashes. On the next execution, since `self.loaded_lora` no longer exists, the check `self.loaded_lora is not None` crashes.

In practice, this means that loading a valid LoRA and then trying to load an invalid one locks the node in an invalid state until the node is reinitialized (deleted and recreated) or the UI is restarted.
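A minimal sketch of the failure path, assuming a caching structure along these lines (class name, method signature and surrounding code are illustrative, not the exact node code):

```python
import comfy.utils

class CRLoadLoRA:
    def __init__(self):
        self.loaded_lora = None  # cache: (lora_path, lora) or None

    def load_lora(self, model, clip, lora_path, strength_model, strength_clip):
        lora = None
        if self.loaded_lora is not None:       # raises AttributeError once the attribute is gone
            if self.loaded_lora[0] == lora_path:
                lora = self.loaded_lora[1]     # reuse the cached LoRA
            else:
                del self.loaded_lora           # removes the attribute from the instance
        if lora is None:
            # If lora_path is invalid, this call raises and self.loaded_lora is never
            # reassigned, so the `is not None` check above crashes on the next execution.
            lora = comfy.utils.load_torch_file(lora_path)
            self.loaded_lora = (lora_path, lora)
        # ... apply the LoRA to model/clip and return the patched models
```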
The solution is to replace the deletion with an assignment to `None`.
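Applied to the sketch above, the change is a one-line swap in the cache-invalidation branch (illustrative, not the exact diff of this PR):

```python
            else:
                # Before: del self.loaded_lora  -> attribute disappears if the next load fails
                self.loaded_lora = None         # After: attribute stays defined, holding None
```

Even if `comfy.utils.load_torch_file()` then raises on an invalid path, `self.loaded_lora` still exists, so the `is not None` check keeps working on subsequent executions.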