Tags · lucidrains/PaLM-pytorch · GitHub

Tags: lucidrains/PaLM-pytorch



v0.2.1

fix alibi in palm lite
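For context, the lite variant uses ALiBi attention biases (as this commit message indicates). A minimal sketch of how such a bias is typically constructed, with function names and shapes that are illustrative assumptions rather than the repository's exact API:

```python
import torch

def alibi_slopes(heads: int) -> torch.Tensor:
    # Per-head slopes form a geometric sequence, as in the ALiBi paper
    # (this simple form assumes `heads` is a power of two).
    start = 2 ** (-8 / heads)
    return torch.tensor([start ** (i + 1) for i in range(heads)])

def alibi_bias(heads: int, seq_len: int) -> torch.Tensor:
    # Bias added to the pre-softmax attention logits; it penalizes each
    # query-key pair linearly in their distance, scaled by a per-head slope.
    pos = torch.arange(seq_len)
    rel = pos[None, :] - pos[:, None]                 # (seq, seq), key index minus query index
    return alibi_slopes(heads)[:, None, None] * rel   # (heads, seq, seq)
```

The resulting tensor is simply added to each head's attention logits before the softmax; in the causal region the relative offsets are non-positive, so more distant keys receive a larger penalty.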

v0.2.0

release palm lite version, thanks to @conceptofmind

v0.2.0a

release palm lite version, thanks to @conceptofmind

0.1.0

fuse the attention and feedforward projections, thanks to @feifeibear for alerting me to this
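The fusion refers to the parallel PaLM-style block: attention and the feedforward branch both read the same pre-layernormed input, so their input projections can be combined into one matrix multiply. A rough sketch under assumed dimensions (all names and sizes here are illustrative, not the repository's exact layout):

```python
import torch
from torch import nn

class FusedParallelBlock(nn.Module):
    # Illustrative: queries, keys, values and the feedforward "up" projection
    # come out of a single fused linear layer instead of four separate ones.
    def __init__(self, dim, dim_head=64, heads=8, ff_mult=4):
        super().__init__()
        attn_inner = dim_head * heads
        ff_inner = dim * ff_mult
        self.norm = nn.LayerNorm(dim)
        self.fused_proj = nn.Linear(dim, attn_inner * 3 + ff_inner, bias=False)
        self.split_sizes = (attn_inner, attn_inner, attn_inner, ff_inner)

    def forward(self, x):
        x = self.norm(x)
        q, k, v, ff = self.fused_proj(x).split(self.split_sizes, dim=-1)
        # attention proceeds on q, k, v while the feedforward branch continues
        # from ff; both outputs are later summed back into the residual stream
        return q, k, v, ff
```

One large matmul tends to utilize the GPU better than several small ones, which is the point of the fusion.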

0.0.12

fix rotary embedding caching

0.0.11

start chipping away at Triton version of PaLM, use causal numerically stable softmax (no need for causal mask) + bias-less layernorm, modified from Phil Tillet's layernorm tutorial, cite Triton
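Two of the ideas named here can be sketched outside of Triton: a softmax that subtracts the row maximum before exponentiating for numerical stability (in a causal kernel the inner loop simply never visits future keys, which is why no explicit mask tensor is needed), and a layernorm with a learned scale but no bias term. A reference-semantics sketch, not the actual Triton kernels:

```python
import torch

def stable_softmax(logits: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Subtracting the row max leaves the result mathematically unchanged
    # but keeps exp() from overflowing in low precision.
    exp = (logits - logits.amax(dim=dim, keepdim=True)).exp()
    return exp / exp.sum(dim=dim, keepdim=True)

def biasless_layernorm(x: torch.Tensor, gamma: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Layernorm with a learned scale (gamma) and no learned shift (beta).
    mean = x.mean(dim=-1, keepdim=True)
    var = x.var(dim=-1, keepdim=True, unbiased=False)
    return (x - mean) / torch.sqrt(var + eps) * gamma
```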

0.0.10a

cache causal mask and rotary embeddings within attention module
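Caching here means the causal mask and rotary frequencies are built once for the longest sequence seen so far and merely sliced on later forward passes instead of being recomputed every step. A simplified sketch of such a cache (attribute and method names are illustrative assumptions):

```python
import torch
from torch import nn

class RotaryAndMaskCache(nn.Module):
    # Illustrative: rebuild the cached tensors only when a longer sequence arrives.
    def __init__(self, dim_head: int):
        super().__init__()
        inv_freq = 1.0 / (10000 ** (torch.arange(0, dim_head, 2).float() / dim_head))
        self.register_buffer("inv_freq", inv_freq, persistent=False)
        self.cached_freqs = None
        self.cached_mask = None

    def get(self, seq_len: int, device: torch.device):
        if self.cached_freqs is None or self.cached_freqs.shape[0] < seq_len:
            t = torch.arange(seq_len, device=device).float()
            freqs = torch.einsum("i,j->ij", t, self.inv_freq.to(device))
            self.cached_freqs = torch.cat((freqs, freqs), dim=-1)  # (seq, dim_head)
            self.cached_mask = torch.ones(seq_len, seq_len, device=device, dtype=torch.bool).triu(1)
        return self.cached_freqs[:seq_len], self.cached_mask[:seq_len, :seq_len]
```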

0.0.9

fix prelayernorm in attention
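Pre-layernorm means the normalization is applied to the block's input before attention, with the residual added around the whole branch, rather than after the residual sum. A minimal sketch (module wiring assumed for illustration):

```python
import torch
from torch import nn

class PreNormAttention(nn.Module):
    # Illustrative: layernorm runs on the residual-stream input *before*
    # attention; post-norm would instead be norm(x + attn(x)).
    def __init__(self, dim: int, attn: nn.Module):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = attn

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.attn(self.norm(x))
```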

0.0.8

add enwik8 training
