fengzch-das / nunchaku · Commit 16f4161e (unverified)

docs: fix flash-attention2 typos (#576)

Authored Aug 01, 2025 by senlyu163; committed by GitHub Jul 31, 2025
Parent: 9acc9991
Showing 2 changed files with 2 additions and 2 deletions:

docs/source/usage/attention.rst (+1, -1)
nunchaku/models/transformers/transformer_flux.py (+1, -1)
docs/source/usage/attention.rst
@@ -12,6 +12,6 @@ and 50-series GPUs compared to FlashAttention-2, without precision loss.
 The key change from `Basic Usage <./basic_usage>`_ is to use ``transformer.set_attention_impl("nunchaku-fp16")`` to enable FP16 attention.
 While FlashAttention-2 is the default, FP16 attention offers better performance on modern NVIDIA GPUs.
-Switch back with ``transformer.set_attention_impl("flash-attention2")``.
+Switch back with ``transformer.set_attention_impl("flashattn2")``.
 For more details, see :meth:`~nunchaku.models.transformers.transformer_flux.NunchakuFluxTransformer2dModel.set_attention_impl`.
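
In practice, switching between the two implementations looks like the sketch below. Only the ``set_attention_impl`` strings come from this diff; the checkpoint path and pipeline wiring are illustrative assumptions, not part of the commit:

    import torch
    from diffusers import FluxPipeline
    from nunchaku.models.transformers.transformer_flux import NunchakuFluxTransformer2dModel

    # Hypothetical placeholder path; substitute a real Nunchaku FLUX checkpoint.
    transformer = NunchakuFluxTransformer2dModel.from_pretrained("path/to/nunchaku-flux-checkpoint")
    transformer.set_attention_impl("nunchaku-fp16")  # FP16 attention for 30/40/50-series GPUs

    pipe = FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-dev",
        transformer=transformer,
        torch_dtype=torch.bfloat16,
    ).to("cuda")

    transformer.set_attention_impl("flashattn2")  # revert to the default implementation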
nunchaku/models/transformers/transformer_flux.py
@@ -654,7 +654,7 @@ class NunchakuFluxTransformer2dModel(FluxTransformer2DModel, NunchakuModelLoader
         impl : str
             Attention implementation to use. Supported values:

-            - ``"flash-attention2"`` (default): Standard FlashAttention-2.
+            - ``"flashattn2"`` (default): Standard FlashAttention-2.
             - ``"nunchaku-fp16"``: Uses FP16 attention accumulation, up to 1.2× faster than FlashAttention-2 on NVIDIA 30-, 40-, and 50-series GPUs.
         """
         block = self.transformer_blocks[0]
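
The first body line shown in the hunk, ``block = self.transformer_blocks[0]``, suggests the setter operates on the model's transformer blocks. A hedged sketch of that shape, purely for orientation (``attn_impl`` is an assumed attribute name, not confirmed by the diff):

    def set_attention_impl(self, impl: str) -> None:
        # Sketch only; the real method lives in transformer_flux.py and may differ.
        supported = {"flashattn2", "nunchaku-fp16"}
        if impl not in supported:
            raise ValueError(f"Unsupported attention implementation: {impl!r}")
        for block in self.transformer_blocks:
            block.attn_impl = impl  # 'attn_impl' is a hypothetical attribute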