vector attention for multi-head attention variant
fix nearest neighbors w/ multihead
fix multihead point transformer layer
add multi-head point transformer layer
add ability to attend to k-nearest neighbors, to cut down on attention between faraway points
Merge pull request lucidrains#3 from lucidrains/modulate-feature-dimension: different attention map for each element of feature dimension
Create python-publish.yml
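The commits above describe a multi-head point transformer layer that uses vector attention (a separate attention weight for every feature channel) and restricts each point to attending over its k nearest neighbors. Below is a minimal sketch of that idea, not the repository's actual code: the class name, hyperparameters, and MLP shapes are illustrative assumptions.

```python
# Sketch of a multi-head point transformer layer with vector attention
# over k-nearest neighbors. Illustrative only; not the library's API.
import torch
import torch.nn as nn

class KNNPointTransformerLayer(nn.Module):
    def __init__(self, dim, heads=4, k=16, pos_mlp_hidden=64):
        super().__init__()
        assert dim % heads == 0
        self.heads = heads
        self.k = k
        self.dim_head = dim // heads

        self.to_qkv = nn.Linear(dim, dim * 3, bias=False)

        # relative positional encoding: maps (p_i - p_j) to a feature-sized bias
        self.pos_mlp = nn.Sequential(
            nn.Linear(3, pos_mlp_hidden),
            nn.ReLU(),
            nn.Linear(pos_mlp_hidden, dim),
        )

        # attention MLP emits one weight per feature channel (vector attention),
        # rather than a single scalar per query-key pair
        self.attn_mlp = nn.Sequential(
            nn.Linear(self.dim_head, self.dim_head * 4),
            nn.ReLU(),
            nn.Linear(self.dim_head * 4, self.dim_head),
        )

    def forward(self, feats, pos):
        # feats: (b, n, dim) point features, pos: (b, n, 3) point coordinates
        b, n, _, h, k = *feats.shape, self.heads, self.k

        q, kx, v = self.to_qkv(feats).chunk(3, dim=-1)

        # indices of the k nearest neighbors of every point (Euclidean distance),
        # so attention is only computed between nearby points
        dist = torch.cdist(pos, pos)                             # (b, n, n)
        knn_idx = dist.topk(k, dim=-1, largest=False).indices    # (b, n, k)

        # gather neighbor keys, values and positions
        batch_idx = torch.arange(b, device=feats.device).view(b, 1, 1)
        kx_n  = kx[batch_idx, knn_idx]                           # (b, n, k, dim)
        v_n   = v[batch_idx, knn_idx]                            # (b, n, k, dim)
        pos_n = pos[batch_idx, knn_idx]                          # (b, n, k, 3)

        rel_pos_emb = self.pos_mlp(pos.unsqueeze(2) - pos_n)     # (b, n, k, dim)

        # split into heads: (b, n, k, h, dim_head)
        split = lambda t: t.reshape(b, n, -1, h, self.dim_head)
        q_h = q.reshape(b, n, 1, h, self.dim_head)
        kx_h, v_h, rel_h = map(split, (kx_n, v_n, rel_pos_emb))

        # vector attention: a different attention map per feature dimension,
        # normalized over the k neighbors
        attn = self.attn_mlp(q_h - kx_h + rel_h).softmax(dim=2)  # (b, n, k, h, dim_head)

        out = (attn * (v_h + rel_h)).sum(dim=2)                  # (b, n, h, dim_head)
        return out.reshape(b, n, -1)
```

A quick usage check of the sketch: `KNNPointTransformerLayer(dim=128, heads=4, k=16)` applied to `feats` of shape `(1, 1024, 128)` and `pos` of shape `(1, 1024, 3)` returns a tensor of shape `(1, 1024, 128)`, with each point's output built only from its 16 nearest neighbors.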