- 20 Jan, 2024 1 commit
  - Tri Dao authored
- 14 Jan, 2024 4 commits
- 22 Dec, 2023 1 commit
  - Tri Dao authored
- 26 Sep, 2023 1 commit
  - Tri Dao authored (Co-authored-by: Timothee Lacroix <t@mistral.ai>)
- 12 Sep, 2023 1 commit
  - Tri Dao authored
- 25 Aug, 2023 1 commit
  - Tri Dao authored
- 24 Aug, 2023 1 commit
  - BoxiangW authored: Support FlashAttention-2 with causal masking when the KV sequence length is longer than the Q sequence length (#436); see the sketch after this list.
- 01 Aug, 2023 1 commit
  - Tri Dao authored
- 17 Jul, 2023 1 commit
  - Tri Dao authored
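
The 24 Aug, 2023 commit (#436) covers causal attention where the key/value sequence is longer than the query sequence, the shape that arises when new queries attend to a longer cached prefix. Below is a minimal sketch of that call pattern using flash_attn_func; it is not taken from the commit itself, and the bottom-right mask alignment noted in the comments is an assumption based on the convention documented in later flash-attn releases.

```python
# Minimal sketch: causal FlashAttention-2 call with seqlen_k > seqlen_q.
# Assumes flash-attn is installed and a CUDA device with fp16 support is available.
import torch
from flash_attn import flash_attn_func

batch, nheads, headdim = 2, 8, 64
seqlen_q, seqlen_k = 16, 128  # KV sequence longer than Q sequence

q = torch.randn(batch, seqlen_q, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn(batch, seqlen_k, nheads, headdim, device="cuda", dtype=torch.float16)
v = torch.randn(batch, seqlen_k, nheads, headdim, device="cuda", dtype=torch.float16)

# With causal=True and seqlen_k > seqlen_q, query i is assumed here to attend to
# keys 0..(i + seqlen_k - seqlen_q), i.e. the causal mask aligned to the
# bottom-right corner of the attention matrix (convention may vary by version).
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # (batch, seqlen_q, nheads, headdim)
```

This is the typical incremental-decoding case: the 16 query positions each see the full cached prefix plus the causally allowed portion of the most recent tokens.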