1. 07 Jan, 2025 1 commit
• Refactor instructpix2pix lora to support peft (#10205) · f1e0c7ce
  Rahul Raman authored
      
      
* make base code changes adapted from the train_instruct_pix2pix script in examples
      
      * change code to use PEFT as discussed in issue 10062
      
      * update README training command
      
      * update README training command
      
      * refactor variable name and freezing unet
      
      * Update examples/research_projects/instructpix2pix_lora/train_instruct_pix2pix_lora.py
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      
      * update README installation instructions.
      
      * cleanup code using make style and quality
      
      ---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
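The refactor above swaps hand-rolled adapter code for PEFT's `LoraConfig`/`add_adapter` flow. The mechanism underneath — a frozen base weight plus a trainable low-rank update that is initially a no-op — can be sketched in plain NumPy (function and variable names here are illustrative, not the PEFT API):

```python
import numpy as np

def lora_forward(x, W, A, B, alpha, r):
    # Base projection plus a scaled low-rank correction:
    #   x @ W.T + (alpha / r) * (x @ A) @ B
    # W stays frozen during training; only A (d_in x r) and B (r x d_out)
    # receive gradients, so the trainable parameter count is tiny.
    return x @ W.T + (alpha / r) * (x @ A) @ B

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 4, 2
W = rng.normal(size=(d_out, d_in))        # frozen base weight
A = rng.normal(size=(d_in, r)) * 0.01     # small random init
B = np.zeros((r, d_out))                  # zero init: update starts as a no-op
x = rng.normal(size=(3, d_in))

y = lora_forward(x, W, A, B, alpha=4, r=r)
```

With `B` initialized to zero, the adapted layer reproduces the base layer exactly at step 0, which is why LoRA fine-tuning starts from the pretrained model's behavior.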
  2. 24 Jun, 2024 2 commits
  3. 02 Apr, 2024 1 commit
• 7529 do not disable autocast for cuda devices (#7530) · 8e963d1c
  Bagheera authored
      
      
      * 7529 do not disable autocast for cuda devices
      
      * Remove typecasting error check for non-mps platforms, as a correct autocast implementation makes it a non-issue
      
      * add autocast fix to other training examples
      
      * disable native_amp for dreambooth (sdxl)
      
      * disable native_amp for pix2pix (sdxl)
      
      * remove tests from remaining files
      
      * disable native_amp on huggingface accelerator for every training example that uses it
      
      * convert more usages of autocast to nullcontext, make style fixes
      
      * make style fixes
      
      * style.
      
      * Empty-Commit
      
      ---------
Co-authored-by: bghira <bghira@users.github.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
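The "convert autocast to nullcontext" commits above follow a common pattern: select `torch.autocast` when mixed precision applies on the device and `contextlib.nullcontext` otherwise, so the surrounding `with` block runs unchanged either way. A minimal, runnable sketch of that selection logic (the `torch.autocast` call is stood in by a labelled `nullcontext` so this runs without torch; this is illustrative, not the diffusers API):

```python
from contextlib import nullcontext

def inference_ctx(device_type: str, mixed_precision: bool):
    """Return an autocast-style context on supported devices, else a no-op.

    In the real training scripts this branch would return
    torch.autocast(device_type) for "cuda"; "mps" falls back to a no-op
    because a correct autocast path makes the old typecasting
    workaround unnecessary there.
    """
    if mixed_precision and device_type == "cuda":
        return nullcontext("autocast")   # stand-in for torch.autocast("cuda")
    return nullcontext("no-op")

# The caller's with-block is identical regardless of which context is chosen:
with inference_ctx("mps", True) as mode:
    pass
```

The payoff is that every training example shares one code path, instead of branching on device type at each inference site.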
  4. 13 Mar, 2024 1 commit
  5. 09 Feb, 2024 1 commit
  6. 08 Feb, 2024 1 commit
  7. 10 Jan, 2024 1 commit
• example: Train Instruct pix2pix with LoRA implementation (#6469) · 2d1f2182
  Rahul Raman authored
      
      
      * base template file - train_instruct_pix2pix.py
      
* additional import and parser argument required for lora
      
      * finetune only instructpix2pix model -- no need to include these layers
      
      * inject lora layers
      
      * freeze unet model -- only lora layers are trained
      
      * training modifications to train only lora parameters
      
      * store only lora parameters
      
      * move train script to research project
      
      * run quality and style code checks
      
      * move train script to a new folder
      
      * add README
      
      * update README
      
      * update references in README
      
      ---------
Co-authored-by: Rahul Raman <rahulraman@gmail.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
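The steps "freeze unet model -- only lora layers are trained" and "store only lora parameters" above boil down to filtering parameters by name: freeze everything outside the injected LoRA layers, and save only the LoRA subset. A toy sketch of that filtering with a plain dict standing in for a torch parameter list (parameter names are hypothetical, not the actual UNet module names):

```python
# Each entry mimics a named parameter; "lora" marks injected adapter weights.
params = {
    "unet.down.weight": {"lora": False},
    "unet.up.weight":   {"lora": False},
    "lora.A":           {"lora": True},
    "lora.B":           {"lora": True},
}

def set_requires_grad(params):
    # Freeze the base model: only LoRA parameters keep gradients enabled
    # (in torch this would set p.requires_grad on each nn.Parameter).
    for name, p in params.items():
        p["requires_grad"] = p["lora"]

def lora_state_dict(params):
    # Checkpoint only the LoRA weights, keeping saved files small.
    return {name: p for name, p in params.items() if p["lora"]}

set_requires_grad(params)
trainable = lora_state_dict(params)
```

Saving only the LoRA subset is what makes LoRA checkpoints a few megabytes instead of the full multi-gigabyte UNet.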