gaoqiong / flash-attention · Commits · a6ec1782dc69b1d1a9ed94e2323c3ed5ba56cc13 · usage.md
16 Dec, 2022 (2 commits)
[Docs] Mention Megatron-LM · b78f5a39 · Tri Dao authored Dec 15, 2022
[Docs] Mention PubMedGPT · ece8f05d · Tri Dao authored Dec 15, 2022
05 Dec, 2022 (1 commit)
[Docs] Mention FasterTransformer integration · a84d0728 · Tri Dao authored Dec 05, 2022
25 Nov, 2022 (1 commit)
[Docs] Clarify OpenFold speedup · b784ed73 · Tri Dao authored Nov 25, 2022
23 Nov, 2022 (1 commit)
[Docs] Mention OpenFold · d9021ae4 · Tri Dao authored Nov 23, 2022
14 Nov, 2022 (4 commits)
Mention DeepSpeed inference in usage.md · b0ed0a73 · Tri Dao authored Nov 14, 2022
Mention AITemplate Stable Diffusion in usage.md · 25387b24 · Tri Dao authored Nov 14, 2022
Link to Colossal-AI's stable diffusion in usage.md · b92f2c3b · Tri Dao authored Nov 13, 2022
Add a page on where FlashAttention is being used · 79160a69 · Tri Dao authored Nov 13, 2022