8000 [Bug] FLA models fail to return logits when labels are not provided · Issue #237 · fla-org/flash-linear-attention · GitHub


Closed
2 tasks done
sustcsonglin opened this issue Mar 22, 2025 · 2 comments
Labels
bug Something isn't working stale

Comments

@sustcsonglin (Collaborator) commented Mar 22, 2025

Checklist

  • I have checked FAQs and existing issues for similar problems
  • Please report this bug in English to ensure wider understanding and support

Describe the Bug

The fuse_cross_entropy option is enabled by default. In the following code segment:
modeling_transformer.py#L373C1-L374C1, there is a logic issue that prevents logits from being returned when no labels are provided.
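The failure mode can be sketched as follows. This is a hypothetical, simplified stand-in for the upstream forward pass (the names `head_buggy`/`head_fixed` and the shapes are assumptions, and the fused kernel is replaced by a plain matmul + cross-entropy): when `fuse_cross_entropy=True`, the logits are skipped unconditionally, but the branch that would produce them only runs when labels are given, so inference-time calls get `logits=None`.

```python
import torch
import torch.nn.functional as F

def head_buggy(hidden, weight, labels=None, fuse_cross_entropy=True):
    # Buggy pattern: logits are skipped whenever the fused path is on,
    # but the loss branch (which materializes them) never runs without labels.
    logits = None if fuse_cross_entropy else hidden @ weight.T
    loss = None
    if labels is not None:
        if logits is None:
            logits = hidden @ weight.T  # stand-in for the fused kernel
        loss = F.cross_entropy(logits.view(-1, logits.size(-1)), labels.view(-1))
    return loss, logits

def head_fixed(hidden, weight, labels=None, fuse_cross_entropy=True):
    # Fix: only skip logits when the fused loss will actually be computed,
    # i.e. when labels are present.
    logits = None if (fuse_cross_entropy and labels is not None) else hidden @ weight.T
    loss = None
    if labels is not None:
        if logits is None:
            logits = hidden @ weight.T  # stand-in for the fused kernel
        loss = F.cross_entropy(logits.view(-1, logits.size(-1)), labels.view(-1))
    return loss, logits

hidden = torch.randn(2, 4, 8)   # (batch, seq, hidden)
weight = torch.randn(16, 8)     # (vocab, hidden) lm_head weight

_, logits_buggy = head_buggy(hidden, weight)  # labels=None: logits is None
_, logits_fixed = head_fixed(hidden, weight)  # labels=None: logits returned
```

With the fix, passing labels still takes the fused path, while label-free calls (generation, evaluation of raw logits) get the full logits tensor back.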

Steps to Reproduce the Bug

N/A

Expected Behavior

N/A

Environment Information

Applies to all versions.

@sustcsonglin sustcsonglin added the bug Something isn't working label Mar 22, 2025

This issue is stale because it has been open for 30 days with no activity.

@github-actions github-actions bot added the stale label Apr 27, 2025
github-actions bot commented May 4, 2025

This issue was closed because it has been inactive for 7 days since being marked as stale.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale May 4, 2025