gaoqiong / flash-attention · Commits
Commit b92f2c3b, authored Nov 13, 2022 by Tri Dao
Link to Colossal-AI's stable diffusion in usage.md
parent 343492ec
Showing 1 changed file with 5 additions and 0 deletions

usage.md (+5, −0)
```diff
@@ -59,6 +59,11 @@ yields the fastest BERT training on cloud instances in MLPerf training 2.0 (June
 v0.7.0](https://github.com/huggingface/diffusers/releases/tag/v0.7.0). Up to 2x faster inference and lower memory usage.
+- Colossal-AI's [implementation](https://github.com/hpcaitech/ColossalAI/tree/main/examples/images/diffusion) of Stable Diffusion: with FlashAttention as one of its components, it speeds up pretraining by up to 6.5x, and reduces the hardware cost of fine-tuning by 7x.
+- Stable Diffusion inference from [Labml.ai](https://twitter.com/labmlai/status/1573634095732490240): 50% speedup.
```
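Not part of this commit, but as a hedged illustration of what the FlashAttention kernel computes: PyTorch 2.0's `torch.nn.functional.scaled_dot_product_attention` (which can dispatch to a FlashAttention backend on supported GPUs) is numerically equivalent to plain softmax attention. The tensor shapes below are illustrative assumptions, not values from the repository.

```python
# Sketch: fused scaled-dot-product attention vs. an explicit softmax reference.
# Assumes PyTorch >= 2.0; shapes (batch, heads, seq_len, head_dim) are arbitrary.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
q = torch.randn(2, 8, 128, 64)
k = torch.randn(2, 8, 128, 64)
v = torch.randn(2, 8, 128, 64)

# Fused attention (FlashAttention-style kernel on supported hardware).
out = F.scaled_dot_product_attention(q, k, v)

# Explicit reference: softmax(QK^T / sqrt(d)) V.
scale = q.shape[-1] ** -0.5
ref = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1) @ v

print(out.shape)  # torch.Size([2, 8, 128, 64])
```

The fused path avoids materializing the full `seq_len × seq_len` attention matrix, which is where the memory savings and speedups cited above come from.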