[WIP] Add LORA_STACK input/output to Power Lora Loader #263
Closed
DrJKL wants to merge 2 commits into rgthree:main from
Conversation
This has the issue of re-applying the LoRAs to the model/clip for each PLL node...
This was referenced Jun 26, 2024
What?
Adds a secondary output to the Power Lora Loader in the LORA_STACK format. It also takes a lora_stack input to allow chaining.
Why?
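A minimal sketch of the chaining behavior described above (hypothetical code, not the actual Power Lora Loader implementation), assuming the common community LORA_STACK convention of (lora_name, model_strength, clip_strength) tuples:

```python
# Hypothetical sketch only -- not the actual Power Lora Loader code.
# Assumes LORA_STACK is a list of (lora_name, model_strength, clip_strength)
# tuples, as used by the Efficiency Nodes' LoRA Stacker.

def build_lora_stack(own_loras, incoming_stack=None):
    """Append this node's enabled LoRAs onto an optional upstream stack."""
    stack = list(incoming_stack) if incoming_stack else []
    for lora in own_loras:
        if not lora.get("on", True):
            continue  # skip rows toggled off in the UI
        strength = lora["strength"]
        # a separate clip strength may or may not be set on the row
        clip_strength = lora.get("strengthTwo", strength)
        stack.append((lora["lora"], strength, clip_strength))
    return stack
```

Chaining would then amount to feeding one node's stack output into the next node's lora_stack input.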
I have a multi-phase workflow in which I want to share some common LoRAs and then augment them with others, without having to clone nodes or keep them in sync. I prefer the Power Lora Loader to the Efficiency Nodes' LoRA Stacker.
@rgthree I could also break this into a separate Power Lora Stacker (rgthree) node, or set up a property to allow either/or (I figure just hiding the input/output nodes?).
Note
There is an issue with this implementation when all of the outputs and inputs are chained: the same LoRAs get applied by each node in the chain, leading to weird results.
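The double-application issue can be illustrated with a toy model (hypothetical numbers, not ComfyUI APIs): each node patches the incoming model with every entry in its stack, so a downstream node re-patches the LoRAs an upstream node already applied.

```python
# Toy illustration of the re-application problem; "applying" a LoRA here
# just scales a single weight by (1 + model_strength).

def apply_stack(weight, stack):
    for _name, model_strength, _clip_strength in stack:
        weight *= 1 + model_strength
    return weight

# Node A applies its LoRA and also forwards it in its stack output.
stack_a = [("lora_a", 0.5, 0.5)]
weight_after_a = apply_stack(1.0, stack_a)  # 1.0 * 1.5 = 1.5

# Node B receives the already-patched model AND the stack, adds its own
# LoRA, and applies the whole stack again -- lora_a is applied twice.
stack_b = stack_a + [("lora_b", 0.5, 0.5)]
weight_after_b = apply_stack(weight_after_a, stack_b)  # 1.5 * 1.5 * 1.5 = 3.375

# What the user actually wanted: each LoRA applied exactly once.
expected = apply_stack(1.0, stack_b)  # 1.5 * 1.5 = 2.25
```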
I put together a test workflow: Power_Lora_Loader_Stack_Test.json
(Open it in a text editor and Find/Replace <<<CHECKPOINT>>>, <<<LORA_1>>>, and <<<LORA_2>>> for convenience.)