[Bug] FLA models fail to return logits when labels are not provided · Issue #237 · fla-org/flash-linear-attention
Checklist
I have checked the FAQs and existing issues for similar problems.
Please report this bug in English to ensure wider understanding and support.
Describe the Bug
The fuse_cross_entropy option is enabled by default. In the following code segment, modeling_transformer.py#L373C1-L374C1, there appears to be a logic issue that prevents logits from being returned when no labels are provided.
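For context, when the fused path is enabled the model skips materializing logits so that the output projection and the cross-entropy can be fused into one kernel. A minimal sketch of the suspect pattern, with attribute names assumed from the description rather than copied from the source:

```python
# Sketch of the suspect logic (names are assumptions, not the exact FLA source).
# Logits are skipped whenever the fused loss is enabled, regardless of labels:
logits = None if self.config.fuse_cross_entropy else self.lm_head(hidden_states)

loss = None
if labels is not None:
    # fused linear + cross-entropy path computes the loss without logits
    ...
```

Since fuse_cross_entropy defaults to True, a forward pass without labels returns neither a loss nor logits. A plausible fix is to gate the fused path on labels actually being present:

```python
fuse = self.config.fuse_cross_entropy and labels is not None
logits = None if fuse else self.lm_head(hidden_states)
```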
Steps to Reproduce the Bug
N/A
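The behavior can be reproduced roughly as follows; the checkpoint name is purely illustrative, and `import fla` is assumed to register the FLA model classes with transformers:

```python
import torch
import fla  # assumed to register FLA model classes with transformers
from transformers import AutoModelForCausalLM

# Illustrative checkpoint; any FLA causal LM with the default
# fuse_cross_entropy=True should behave the same way.
model = AutoModelForCausalLM.from_pretrained("fla-hub/transformer-1.3B-100B")

input_ids = torch.randint(0, model.config.vocab_size, (1, 16))

# No labels are passed, yet with the fused cross-entropy path enabled
# outputs.logits comes back as None instead of a tensor.
outputs = model(input_ids=input_ids)
print(outputs.logits)  # None (expected: a [1, 16, vocab_size] float tensor)
```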
Expected Behavior
When no labels are provided, the forward pass should still return the logits.
Environment Information
This applies to all versions.