"tests/unit_tests/dist_checkpointing/test_mapping.py" did not exist on "3aca141586a4b8cdc983c3ecf5f7baf60506c7f8"
Unverified commit 468ae09e authored by Tolga Cangöz, committed by GitHub

Errata - Trim trailing white space in the whole repo (#8575)



* Trim all the trailing white space in the whole repo

* Remove unnecessary empty places

* make style && make quality

* Trim trailing white space

* trim

---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
parent 3fca5202
@@ -54,6 +54,8 @@ write_basic_config()
```
When running `accelerate config`, setting the torch compile mode to True can yield dramatic speedups.
Note also that we use the PEFT library as the backend for LoRA training, so make sure to have `peft>=0.6.0` installed in your environment.
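The torch compile choice made during the `accelerate config` questionnaire is persisted in the saved config file (typically `~/.cache/huggingface/accelerate/default_config.yaml`). A sketch of what the relevant fragment can look like — exact key names and layout vary between `accelerate` versions, so generate the real file with `accelerate config` rather than writing it by hand:

```yaml
# Illustrative fragment of an accelerate config file (key names may differ
# across accelerate versions; `accelerate config` generates the real file)
compute_environment: LOCAL_MACHINE
distributed_type: 'NO'
mixed_precision: fp16
dynamo_config:
  dynamo_backend: INDUCTOR
```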
### Dog toy example
@@ -72,6 +74,8 @@ snapshot_download(
)
```
This will also allow us to push the trained LoRA parameters to the Hugging Face Hub platform.
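Pushing to the Hub requires being authenticated first. A minimal sketch, assuming the `huggingface_hub` package is installed and you have a write-scoped access token from your Hugging Face account settings:

```python
# One-time authentication so the training script can upload the trained
# LoRA weights to the Hub (sketch; assumes `huggingface_hub` is installed).
from huggingface_hub import login

# Either run `huggingface-cli login` in a terminal, or uncomment the call
# below and paste a write-scoped token when prompted:
# login()
```

Once a token is stored, subsequent scripts in the same environment can push to the Hub without logging in again.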
Now, we can launch training using:
```bash
......
......@@ -42,7 +42,6 @@ input_image_path = "/path/to/input_image"
input_image = Image.open(input_image_path)
edited_images = pipe_lora(num_images_per_prompt=1, prompt=args.edit_prompt, image=input_image, num_inference_steps=1000).images
edited_images[0].show()
```
## Results
......
@@ -29,7 +29,6 @@ export MALLOC_CONF="oversize_threshold:1,background_thread:true,metadata_thp:aut
numactl --membind <node N> -C <cpu list> python inference_bf16.py
# Launch with DPMSolverMultistepScheduler
numactl --membind <node N> -C <cpu list> python inference_bf16.py --dpm
```
## Accelerating the inference for Stable Diffusion using INT8
......
@@ -239,7 +239,6 @@ accelerate launch --config_file $ACCELERATE_CONFIG_FILE train_text_to_image_lor
--seed=1234 \
--output_dir="sd-naruto-model-lora-sdxl" \
--validation_prompt="cute dragon creature"
```
......