Unable to use bf16 lora with black forest labs flux-dev · Issue #81 · replicate/cog-flux
Open

LokeshJonnakuti opened this issue Apr 24, 2025 · 0 comments

@LokeshJonnakuti
I am unable to use my bf16 LoRA with flux-dev, even though I disable the go_fast parameter in the UI; for the API call I set it to "false". Still no change. Why can't I do this? I get the error "Prediction failed. cannot access local variable 'weight_is_f8' where it is not associated with a value". Here are the error logs:
"Downloaded weights in 5.51s
2025-04-24 00:05:06.210 | INFO | fp8.lora_loading:convert_lora_weights:502 - Loading LoRA weights for /src/weights-cache/7d2104666de2bfa9
Warning - loading loras that fine-tune the text encoder is not supported at present, text encoder weights will be ignored
2025-04-24 00:05:06.711 | INFO | fp8.lora_loading:convert_lora_weights:523 - LoRA weights loaded
2025-04-24 00:05:06.712 | DEBUG | fp8.lora_loading:apply_lora_to_model_and_optionally_store_clones:610 - Extracting keys
2025-04-24 00:05:06.712 | DEBUG | fp8.lora_loading:apply_lora_to_model_and_optionally_store_clones:617 - Keys extracted
Applying LoRA: 0it [00:00, ?it/s]
Applying LoRA: 0it [00:00, ?it/s]
Traceback (most recent call last):
File "/root/.pyenv/versions/3.11.11/lib/python3.11/site-packages/cog/server/worker.py", line 352, in _predict
result = predict(**payload)
^^^^^^^^^^^^^^^^^^
File "/src/predict.py", line 539, in predict
model.handle_loras(lora_weights, lora_scale)
File "/src/bfl_predictor.py", line 108, in handle_loras
load_lora(model, lora_path, lora_scale, self.store_clones)
File "/root/.pyenv/versions/3.11.11/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/fp8/lora_loading.py", line 545, in load_lora
apply_lora_to_model_and_optionally_store_clones(model, lora_weights, lora_scale, store_clones)
File "/root/.pyenv/versions/3.11.11/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/src/fp8/lora_loading.py", line 668, in apply_lora_to_model_and_optionally_store_clones
if weight_is_f8:
^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'weight_is_f8' where it is not associated with a value"
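
For reference, the call I am making looks roughly like this. This is a minimal sketch using the replicate Python client; the prompt and LoRA URL are placeholders, and the input names go_fast, lora_weights, and lora_scale are my best guess based on the traceback (predict.py passes lora_weights and lora_scale into handle_loras):

```python
import replicate

# Minimal sketch of the failing call. The prompt and LoRA URL are
# placeholders; go_fast, lora_weights, and lora_scale are assumed
# input names based on the report and the traceback above.
output = replicate.run(
    "black-forest-labs/flux-dev",
    input={
        "prompt": "a photo of TOK",  # placeholder prompt
        "go_fast": False,            # disable the fp8 fast path
        "lora_weights": "https://example.com/my-bf16-lora.safetensors",  # placeholder URL
        "lora_scale": 1.0,
    },
)
print(output)
```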
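
From the traceback, this looks like a fall-through bug in apply_lora_to_model_and_optionally_store_clones: weight_is_f8 is apparently only assigned inside branches that match fp8 (and possibly fp16) weights, so a bf16 weight reaches the `if weight_is_f8:` check with the variable never bound. Here is a minimal, self-contained sketch of that pattern (not the actual cog-flux code) that reproduces the same UnboundLocalError:

```python
import torch

def apply_lora(weight: torch.Tensor, delta: torch.Tensor) -> torch.Tensor:
    # The flag is only assigned in the fp8 and fp16 branches,
    # mirroring the suspected structure of lora_loading.py.
    if weight.dtype == torch.float8_e4m3fn:
        weight_is_f8 = True
        weight = weight.to(delta.dtype)
    elif weight.dtype == torch.float16:
        weight_is_f8 = False
    # bf16 weights fall through: neither branch runs

    merged = weight.to(delta.dtype) + delta

    if weight_is_f8:  # UnboundLocalError when weight was bf16
        return merged.to(torch.float8_e4m3fn)
    return merged

w = torch.randn(4, 4, dtype=torch.bfloat16)
apply_lora(w, torch.zeros(4, 4, dtype=torch.bfloat16))
# UnboundLocalError: cannot access local variable 'weight_is_f8'
# where it is not associated with a value
```

Initializing weight_is_f8 = False before the branches (or raising a clear error for unsupported dtypes) would avoid the crash, though actually supporting bf16 LoRA weights presumably needs its own code path.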
