Insights: fla-org/native-sparse-attention
Overview
- 0 Active pull requests
- 0 Merged pull requests
- 0 Open pull requests
- 5 Closed issues
- 1 New issue
There hasn’t been any commit activity on fla-org/native-sparse-attention in the last month.
5 Issues closed by 1 person
- [Bug + Fix + Discussion] NaNs due to chunking edge-case in FLA (#23, closed Jun 1, 2025)
- [Bug] which function you call in test/test_nsa_with_compression.py (#22, closed May 25, 2025)
- [Bug] nsa benchmark is slower than flash-attn, not match the ReadME result (#19, closed May 18, 2025)
- [Bug] results of 'naive_nsa_with_compression' are different compared to 'parallel_nsa' (#20, closed May 18, 2025)
1 Issue opened by 1 person
- [Bug] Not able to run the basic test (#25, opened May 22, 2025)
1 Unresolved conversation
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Parallel training error: different number of tensor for forward and backword pass. (#24, commented on Jun 1, 2025 • 0 new comments)