gaoqiong / flash-attention · Commit 43ab0b52

Mention that some CUDA extensions have only been tested on A100s

Authored Nov 15, 2022 by Tri Dao · parent e4d3013e

Showing 3 changed files with 9 additions and 0 deletions:

- csrc/fused_dense_lib/README.md (+3, -0)
- csrc/layer_norm/README.md (+3, -0)
- csrc/xentropy/README.md (+3, -0)
csrc/fused_dense_lib/README.md

@@ -5,6 +5,9 @@ We make it work for bfloat16.

For best performance, you should use CUDA >= 11.8. cuBLAS versions before this don't have the best matmul + bias + gelu performance for bfloat16.
It has only been tested on A100s.

```sh
cd csrc/fused_dense_lib && pip install .
```
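For context, here is a minimal plain-PyTorch sketch of the operation this extension fuses. It is illustrative only and does not use the extension's own API; the tensor shapes are made up, and the exact GELU variant the fused kernel applies (erf vs. tanh approximation) is an assumption noted in the comments.

```python
import torch
import torch.nn.functional as F

x = torch.randn(8, 512, dtype=torch.bfloat16, device="cuda")
weight = torch.randn(1024, 512, dtype=torch.bfloat16, device="cuda")
bias = torch.randn(1024, dtype=torch.bfloat16, device="cuda")

# Unfused reference: three separate ops in eager PyTorch. The extension
# computes the same result in a single GEMM whose epilogue adds the bias
# and applies GELU; that fast epilogue path is why cuBLAS from
# CUDA >= 11.8 matters. The fused kernel may use the tanh approximation
# of GELU (assumption); pass approximate="tanh" to compare that variant.
out = F.gelu(F.linear(x, weight, bias))
```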
csrc/layer_norm/README.md

This CUDA extension implements fused dropout + residual + LayerNorm, based on Apex's [FastLayerNorm](https://github.com/NVIDIA/apex/tree/master/apex/contrib/layer_norm). We add dropout and residual, and make it work for both pre-norm and post-norm architectures.
It has only been tested on A100s.

```sh
cd csrc/layer_norm && pip install .
```
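As a semantic reference, the sketch below spells out the three steps the kernel fuses, in plain PyTorch. It is a hand-written illustration, not the extension's API; the hidden size, dropout probability, and epsilon are arbitrary assumptions.

```python
import torch
import torch.nn.functional as F

hidden = 1024
x0 = torch.randn(8, hidden, device="cuda")        # current sublayer output
residual = torch.randn(8, hidden, device="cuda")  # skip-connection input
weight = torch.ones(hidden, device="cuda")        # LayerNorm gamma
bias = torch.zeros(hidden, device="cuda")         # LayerNorm beta

# The extension fuses these three steps into a single kernel:
dropped = F.dropout(x0, p=0.1, training=True)
summed = dropped + residual
out = F.layer_norm(summed, (hidden,), weight, bias, eps=1e-5)

# Pre-norm models carry `summed` forward as the next residual stream and
# feed `out` into the block; post-norm models carry `out` forward directly.
```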
csrc/xentropy/README.md

This CUDA extension implements optimized cross-entropy loss, adapted from Apex's [Xentropy](https://github.com/NVIDIA/apex/tree/master/apex/contrib/xentropy). We make it work for bfloat16 and support in-place backward to save memory.
It has only been tested on A100s.

```sh
cd csrc/xentropy && pip install .
```
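To make the memory claim concrete, here is an unfused PyTorch reference for the loss the extension computes. It is illustrative only (made-up vocabulary size and batch shape), and the in-place behavior is described in comments rather than reproduced, since eager PyTorch allocates a separate gradient buffer.

```python
import torch
import torch.nn.functional as F

vocab = 32000
logits = torch.randn(8, vocab, dtype=torch.bfloat16, device="cuda",
                     requires_grad=True)
labels = torch.randint(0, vocab, (8,), device="cuda")

# Eager PyTorch computes the loss here and, on backward, allocates a fresh
# logits-sized tensor for the gradient. The extension instead overwrites
# the saved logits buffer with their gradient during backward ("in-place
# backward"), saving one [batch, vocab] allocation, which is significant
# when the vocabulary is large.
loss = F.cross_entropy(logits.float(), labels)
loss.backward()
```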