gaoqiong / flash-attention · Repository
"TensorFlow/NLP/transformer-xl-master/sota/download.sh" did not exist on "a22e7ca70541cb7c60343e38985d7e7a1604a4fa"
File: training/Dockerfile at commit 984d5204e2502607fa20cd6b296068989b3ad299 (branch: flash-attention)
Commit 984d5204: "Update training Dockerfile to use flash-attn==0.2.6", authored by Tri Dao on Dec 29, 2022
Dockerfile (4.03 KB)
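The Dockerfile contents themselves are not included in this view. As a hypothetical sketch only (not the repository's actual file), a training image that pins the flash-attn version named in the commit message might look like:

```dockerfile
# Hypothetical sketch; the actual training/Dockerfile is not shown here.
# Assumes an NVIDIA PyTorch base image so CUDA and a C++ compiler are
# available for building flash-attn's CUDA extensions.
FROM nvcr.io/nvidia/pytorch:22.12-py3

# Pin the flash-attn version referenced in the commit message.
RUN pip install flash-attn==0.2.6
```

The base image tag is an assumption chosen for illustration; the real file may use a different base and additional training dependencies.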