Unverified Commit cc59505e authored by Linoy Tsaban's avatar Linoy Tsaban Committed by GitHub

[training docs] smol update to README files (#11616)

add comment to install prodigy
parent 5f5d02fb
...@@ -128,6 +128,7 @@ You can also load a dataset straight from by specifying it's name in `dataset_na
Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
- To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
- **pivotal tuning**
- **min SNR gamma**
...
...@@ -143,7 +143,8 @@ Now we'll simply specify the name of the dataset and caption column (in this cas
You can also load a dataset straight from the Hub by specifying its name in `dataset_name`.
Look [here](https://huggingface.co/blog/sdxl_lora_advanced_script#custom-captioning) for more info on creating/loading your own caption dataset.
- **optimizer**: for this example, we'll use [prodigy](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers) - an adaptive optimizer
- To use Prodigy, please make sure to install the prodigyopt library: `pip install prodigyopt`
- **pivotal tuning**
### Example #1: Pivotal tuning
...
...@@ -134,7 +134,7 @@ Note also that we use PEFT library as backend for LoRA training, make sure to ha
Prodigy is an adaptive optimizer that dynamically adjusts the learning rate of learned parameters based on past gradients, allowing for more efficient convergence.
By using Prodigy we can "eliminate" the need for manual learning rate tuning. Read more [here](https://huggingface.co/blog/sdxl_lora_advanced_script#adaptive-optimizers).
To use Prodigy, first make sure to install the prodigyopt library: `pip install prodigyopt`, and then specify:
```bash
--optimizer="prodigy"
```
...
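For context, the flag above slots into a full training launch command. A hedged sketch follows — the script name, model, dataset placeholder, and output path are illustrative assumptions based on the diffusers advanced training examples, not prescribed values; Prodigy is designed to adapt the learning rate itself, which is why a value like 1.0 is commonly used as the starting point:

```shell
# Prodigy ships in a separate package
pip install prodigyopt

# Illustrative launch command; script name and most flags are assumptions,
# only --optimizer="prodigy" is taken from the diff above.
accelerate launch train_dreambooth_lora_sdxl_advanced.py \
  --pretrained_model_name_or_path="stabilityai/stable-diffusion-xl-base-1.0" \
  --dataset_name="<your-dataset>" \
  --optimizer="prodigy" \
  --learning_rate=1.0 \
  --output_dir="lora-trained-xl"
```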