gaoqiong / flash-attention · repository at commit a01d1213d7dd2224d91b54affaa9cacbcdd7e721

flash-attention / csrc / ft_attention / cuda_bf16_wrapper.h
[Gen] Add kernel from FasterTransformer for benchmarking · a01d1213
Tri Dao authored Jan 03, 2023

cuda_bf16_wrapper.h · 867 Bytes
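
The file body itself is not reproduced in this page view. As a minimal sketch only (the actual 867-byte header is not shown here), a cuda_bf16_wrapper.h in FasterTransformer-derived code is conventionally a small guard header that pulls in CUDA's bfloat16 support only when the build enables it; the ENABLE_BF16 macro name below follows the FasterTransformer convention and is an assumption:

// cuda_bf16_wrapper.h -- hypothetical sketch of a bf16 guard header,
// assuming the FasterTransformer-style ENABLE_BF16 build flag.
// The real file contents are not visible in this view.
#pragma once

#ifdef ENABLE_BF16
#include <cuda_bf16.h>  // provides __nv_bfloat16 and related intrinsics
#endif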