Unverified Commit 468ae09e authored by Tolga Cangöz, committed by GitHub

Errata - Trim trailing white space in the whole repo (#8575)



* Trim all the trailing white space in the whole repo

* Remove unnecessary empty places

* make style && make quality

* Trim trailing white space

* trim

---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
parent 3fca5202
@@ -114,7 +114,7 @@ Now we'll simply specify the name of the dataset and caption column (in this cas
```
You can also load a dataset straight from the Hub by specifying its name in `dataset_name`.
Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
- **pivotal tuning**
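For context, the options this hunk touches (`dataset_name`, the prodigy optimizer, and pivotal tuning) are all passed as flags when launching the advanced training script. Below is a minimal launch sketch, assuming the flag names described in the linked blog post (`--dataset_name`, `--caption_column`, `--optimizer`, `--train_text_encoder_ti`) and a placeholder dataset; check the script's `--help` for the exact interface.

```bash
# Illustrative sketch only — flag names assumed from the linked blog post, not verified against this revision.
accelerate launch train_dreambooth_lora_sdxl_advanced.py \
  --pretrained_model_name_or_path="stabilityai/stable-diffusion-xl-base-1.0" \
  --dataset_name="<hf-username>/<your-dataset>" \
  --caption_column="prompt" \
  --instance_prompt="a photo of TOK" \
  --optimizer="prodigy" \
  --learning_rate=1.0 \
  --train_text_encoder_ti \
  --output_dir="my-lora-output"
```

In the blog post's setup, prodigy adapts its own step size, which is why the learning rate is set to 1.0, and `--train_text_encoder_ti` enables the textual-inversion half of pivotal tuning.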
@@ -393,7 +393,7 @@ The advanced script now supports custom choice of U-net blocks to train during D
> In light of this, we're introducing a new feature to the advanced script to allow for configurable U-net learned blocks.
**Usage**
Configure the LoRA-learned U-net blocks by adding a `lora_unet_blocks` flag with a comma-separated string specifying the targeted blocks.
e.g.:
```bash
--lora_unet_blocks="unet.up_blocks.0.attentions.0,unet.up_blocks.0.attentions.1"
```
...