gaoqiong / flash-attention · Repository @ 7b33743a728a69dbb92d7e52e2ecae3d399e0dc1
flash_attn/modules/mha.py (38.2 KB)
Latest commit: de2949f3 · "[Rotary] Pass max_seqlen from mha.py to rotary during inference" · Tri Dao · Sep 03, 2023