import torch
from diffusers import FluxPipeline

# Load the FLUX.1-dev base pipeline (hub repo id; a local checkpoint path works as well)
pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
First step: load the unconditional custom LoRA weights (custom_lora_weight) into the base model and fuse them into it. The resulting fused model is then used as the base model.
pipe.load_lora_weights(custom_lora_weight, adapter_name="a_lora")
pipe.fuse_lora(lora_scale=1.0)
pipe.unload_lora_weights()
unload_lora_weights() is called here to discard the LoRA adapter, since its weights have already been fused into the underlying base model.
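As a quick sanity check (a sketch, assuming the diffusers PEFT backend is enabled), no adapters should remain registered after the fuse-and-unload step, while the fused deltas stay merged into the transformer weights:

# After fuse_lora() + unload_lora_weights(), the adapter list is empty,
# but the fused LoRA deltas remain baked into the base weights.
print(pipe.get_active_adapters())  # expected: []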
Second step: load the OminiControl LoRA, following the inference code.
pipe.load_lora_weights(ominicontrol_lora_weight, adapter_name=adapter_name)
pipe.set_adapters([adapter_name], adapter_weights=[1.0])
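A minimal end-to-end check after both steps might look like the sketch below. It is a plain FluxPipeline text-to-image call, only to confirm that the pipeline still runs with the OminiControl adapter active; actual OminiControl inference would go through its own conditioning code path, and the prompt and sampling settings here are placeholders:

# Sanity-check generation with the fused base model + OminiControl adapter active
pipe.to("cuda")
image = pipe(
    prompt="a photo of a cat on a wooden table",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("sanity_check.png")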
That's all.