"examples/vscode:/vscode.git/clone" did not exist on "c7250f2b8ae8e4b084dce363da5f9babc733454e"
Unverified Commit 8178c840 authored by Pedro Cuenca, committed by GitHub

Mention training problems with xFormers 0.0.16 (#2254)

parent 3a0d3da6
@@ -27,3 +27,9 @@ The xFormers PIP package requires the latest version of PyTorch (1.13.1 as of xFormers 0.0.16).
</Tip>
After xFormers is installed, you can use `enable_xformers_memory_efficient_attention()` for faster inference and reduced memory consumption, as discussed [here](fp16#memory-efficient-attention).
<Tip warning={true}>
According to [this issue](https://github.com/huggingface/diffusers/issues/2234#issuecomment-1416931212), xFormers `v0.0.16` cannot be used for training (fine-tuning or DreamBooth) on some GPUs. If you observe this problem, please install a development version as indicated in that comment.
</Tip>
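In practice this is a one-line call on a loaded pipeline. Below is a minimal sketch, assuming a CUDA GPU with xFormers installed; the `runwayml/stable-diffusion-v1-5` checkpoint and the prompt are illustrative, and any other pipeline works the same way:

```py
import torch
from diffusers import StableDiffusionPipeline

# Load a pipeline in half precision (checkpoint name is illustrative).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Swap the default attention for xFormers' memory-efficient implementation.
pipe.enable_xformers_memory_efficient_attention()

image = pipe("a photo of an astronaut riding a horse").images[0]
```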
@@ -235,5 +235,12 @@ python train_text_to_image_flax.py \
--output_dir="sd-pokemon-model"
```
### Training with xformers:
You can enable memory efficient attention by [installing xFormers](https://github.com/facebookresearch/xformers#installing-xformers) and padding the `--enable_xformers_memory_efficient_attention` argument to the script. This is not available with the Flax/JAX implementation.
### Training with xFormers:
You can enable memory efficient attention by [installing xFormers](https://huggingface.co/docs/diffusers/main/en/optimization/xformers) and passing the `--enable_xformers_memory_efficient_attention` argument to the script.
xFormers training is not available for Flax/JAX.
**Note**:
According to [this issue](https://github.com/huggingface/diffusers/issues/2234#issuecomment-1416931212), xFormers `v0.0.16` cannot be used for training on some GPUs. If you observe this problem, please install a development version as indicated in that comment.
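For example, the PyTorch counterpart of the command above with the flag appended might look as follows. This is a sketch rather than a canonical invocation: the checkpoint, dataset, and hyperparameters are illustrative, and only the final flag is the relevant part:

```bash
# Fine-tune with xFormers memory-efficient attention (PyTorch script only).
accelerate launch train_text_to_image.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --dataset_name="lambdalabs/pokemon-blip-captions" \
  --resolution=512 \
  --train_batch_size=1 \
  --output_dir="sd-pokemon-model" \
  --enable_xformers_memory_efficient_attention
```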