"router/vscode:/vscode.git/clone" did not exist on "55106ec4766c787823361db80ea461715aa57a7a"
Commit 216d1901 (unverified), authored by Sayak Paul, committed by GitHub

Update README.md to include our blog post (#1998)
Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
parent 9b37ed33
@@ -321,3 +321,6 @@ python train_dreambooth_flax.py \
You can enable memory efficient attention by [installing xFormers](https://github.com/facebookresearch/xformers#installing-xformers) and passing the `--enable_xformers_memory_efficient_attention` argument to the script. This is not available with the Flax/JAX implementation.
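As a sketch, the flag is simply appended to the PyTorch training command. The model path, data directory, and output directory below are illustrative placeholders; only the `--enable_xformers_memory_efficient_attention` flag is the addition being described:

```shell
# Hypothetical DreamBooth invocation with xFormers memory efficient attention.
# Paths and model name are placeholders; the flag only works with the
# PyTorch script, not the Flax/JAX one.
accelerate launch train_dreambooth.py \
  --pretrained_model_name_or_path="CompVis/stable-diffusion-v1-4" \
  --instance_data_dir="./instance_images" \
  --output_dir="./dreambooth-output" \
  --enable_xformers_memory_efficient_attention
```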
You can also use Dreambooth to train the specialized in-painting model. See [the script in the research folder for details](https://github.com/huggingface/diffusers/tree/main/examples/research_projects/dreambooth_inpaint).
### Experimental results
You can refer to [this blog post](https://huggingface.co/blog/dreambooth), which discusses some of our DreamBooth experiments in detail. Specifically, it recommends a set of DreamBooth-specific tips and tricks that we have found to work well for a variety of subjects.