gaoqiong / flash-attention · Commits · 7ffba9a501ba6e377bd36ffa18875c9077e3c1b3
flash-attention / flash_attn / models / btlm.py
25 Dec, 2023 (1 commit)
Implement BTLM model · 7ffba9a5
Tri Dao authored Dec 24, 2023
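This commit introduces flash_attn/models/btlm.py, adding support for Cerebras's BTLM model. Below is a minimal sketch of how that support might be used, assuming the file follows the same pattern as the library's other model adapters: a `btlm_config_to_gpt2_config` helper that maps the Hugging Face config onto flash_attn's GPT-2-style config, and `GPTLMHeadModel.from_pretrained` accepting a checkpoint name plus that config. The helper name, the `from_pretrained` signature, and the `cerebras/btlm-3b-8k-base` checkpoint name are assumptions modeled on the library's other model files, not taken from this page; check btlm.py at this commit for the actual API.

```python
import torch
from transformers import AutoConfig

from flash_attn.models.btlm import btlm_config_to_gpt2_config  # assumed helper name
from flash_attn.models.gpt import GPTLMHeadModel

model_name = "cerebras/btlm-3b-8k-base"  # assumed Hugging Face checkpoint for BTLM

# Convert the Hugging Face BTLM config into the GPT-2-style config that
# flash_attn's GPTLMHeadModel consumes.
hf_config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
config = btlm_config_to_gpt2_config(hf_config)

# Load the pretrained weights (remapped to flash_attn's layout) in fp16 on GPU;
# the device/dtype keyword arguments are assumed from the library's other adapters.
model = GPTLMHeadModel.from_pretrained(
    model_name, config, device="cuda", dtype=torch.float16
)
model.eval()

# Forward pass on a dummy batch of token ids; the model returns logits.
input_ids = torch.randint(0, config.vocab_size, (1, 16), device="cuda")
with torch.no_grad():
    logits = model(input_ids).logits
print(logits.shape)  # expected: (1, 16, vocab_size)
```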