  1. 17 Oct, 2022 1 commit
  2. 14 Oct, 2022 2 commits
  3. 13 Oct, 2022 1 commit
  4. 12 Oct, 2022 2 commits
  5. 11 Oct, 2022 2 commits
      Eventually preserve this typo? :) (#804) · e8959528
      spezialspezial authored
      stable diffusion fine-tuning (#356) · 66a5279a
      Suraj Patil authored
      
      
      * begin text2image script
      
      * loading the datasets, preprocessing & transforms
      
      * handle input features correctly
      
      * add gradient checkpointing support
      
      * fix output names
      
      * run the unet in train mode, not the text encoder
      
      * use no_grad instead of freezing params
      
      * default max steps None
      
      * pad to longest
      
      * don't pad when tokenizing
      
      * fix encode on multi gpu
      
      * fix stupid bug
      
      * add random flip
      
      * add ema
      
      * fix ema
      
      * put ema on cpu
      
      * improve EMA model
      
      * contiguous_format
      
      * don't wrap vae and text encoder in accelerate
      
      * remove no_grad
      
      * use randn_like
      
      * fix resize
      
      * improve a few things
      
      * log epoch loss
      
      * set log level
      
      * don't log each step
      
      * remove max_length from collate
      
      * style
      
      * add report_to option
      
      * make scale_lr false by default
      
      * add grad clipping
      
      * add an option to use 8bit adam
      
      * fix logging in multi-gpu, log every step
      
      * more comments
      
      * remove eval for now
      
      * address review comments
      
      * add requirements file
      
      * begin readme
      
      * begin readme
      
      * fix typo
      
      * fix push to hub
      
      * populate readme
      
      * update readme
      
      * remove use_auth_token from the script
      
      * address some review comments
      
      * better mixed precision support
      
      * remove redundant to
      
      * create ema model early
      
      * Apply suggestions from code review
      Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
      
      * better description for train_data_dir
      
      * add diffusers in requirements
      
      * update dataset_name_mapping
      
      * update readme
      
      * add inference example
      Co-authored-by: anton-l <anton@huggingface.co>
      Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
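The EMA-related steps above ("add ema", "put ema on cpu", "improve EMA model") amount to maintaining a decayed shadow copy of the weights that is updated after each optimizer step. A minimal sketch of the idea, with plain Python floats standing in for the script's torch tensors (the class name and signature here are illustrative, not the script's actual API):

```python
class EMAModel:
    """Sketch of an exponential moving average over model parameters.

    Hypothetical stand-in for the EMA helper the commits describe; the
    real training script tracks torch tensors, this uses plain floats.
    """

    def __init__(self, params, decay=0.9999):
        self.decay = decay
        self.shadow = list(params)  # independent copy of the parameters

    def step(self, params):
        # shadow <- decay * shadow + (1 - decay) * params
        self.shadow = [
            self.decay * s + (1.0 - self.decay) * p
            for s, p in zip(self.shadow, params)
        ]
```

With a high decay (e.g. 0.9999) the shadow weights change slowly, which is why the EMA copy, not the raw weights, is typically used for sampling.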
  6. 10 Oct, 2022 1 commit
  7. 07 Oct, 2022 1 commit
  8. 06 Oct, 2022 4 commits
  9. 05 Oct, 2022 4 commits
  10. 04 Oct, 2022 2 commits
  11. 03 Oct, 2022 2 commits
  12. 29 Sep, 2022 1 commit
  13. 28 Sep, 2022 2 commits
  14. 27 Sep, 2022 6 commits
    • [CLIPGuidedStableDiffusion] remove set_format from pipeline (#653) · c0c98df9
      Suraj Patil authored
      remove set_format from pipeline
    • [dreambooth] update install section (#650) · e5eed523
      Suraj Patil authored
      update install section
    • [examples/dreambooth] don't pass tensor_format to scheduler. (#649) · ac665b64
      Suraj Patil authored
      don't pass tensor_format
      [Pytorch] Pytorch only schedulers (#534) · bd8df2da
      Kashif Rasul authored
      
      
      * pytorch only schedulers
      
      * fix style
      
      * remove match_shape
      
      * pytorch only ddpm
      
      * remove SchedulerMixin
      
      * remove numpy from karras_ve
      
      * fix types
      
      * remove numpy from lms_discrete
      
      * remove numpy from pndm
      
      * fix typo
      
      * remove mixin and numpy from sde_vp and ve
      
      * remove remaining tensor_format
      
      * fix style
      
      * sigmas has to be torch tensor
      
      * removed set_format in readme
      
      * remove set format from docs
      
      * remove set_format from pipelines
      
      * update tests
      
      * fix typo
      
      * continue to use mixin
      
      * fix imports
      
      * removed unused imports
      
      * match shape instead of assuming image shapes
      
      * remove import typo
      
      * update call to add_noise
      
      * use math instead of numpy
      
      * fix t_index
      
      * removed commented out numpy tests
      
      * timesteps need to be discrete
      
      * cast timesteps to int in flax scheduler too
      
      * fix device mismatch issue
      
      * small fix
      
      * Update src/diffusers/schedulers/scheduling_pndm.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
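The core of this numpy-to-torch migration is that the scheduler arithmetic itself is framework-agnostic; scalar coefficients can use `math` while tensors stay in torch, as the "use math instead of numpy" step says. A sketch of the forward-noising formula that `add_noise` implements, with plain Python lists standing in for tensors (the function name mirrors, but is not, the library's API):

```python
import math

def add_noise(sample, noise, alpha_bar_t):
    # Forward diffusion step:
    #   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps
    # Scalar coefficients come from math.sqrt rather than numpy.
    a = math.sqrt(alpha_bar_t)
    b = math.sqrt(1.0 - alpha_bar_t)
    return [a * x + b * e for x, e in zip(sample, noise)]
```

At `alpha_bar_t = 1` the sample passes through unchanged; as it approaches 0 the output becomes pure noise, which is the behavior the DDPM/PNDM schedulers rely on.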
      Add training example for DreamBooth. (#554) · 3b747de8
      Zhenhuan Liu authored
      
      
      * Add training example for DreamBooth.
      
      * Fix bugs.
      
      * Update readme and default hyperparameters.
      
      * Reformatting code with black.
      
      * Update for multi-gpu training.
      
      * Apply suggestions from code review
      
      * improve sampling
      
      * fix autocast
      
      * improve sampling more
      
      * fix saving
      
      * actually fix saving
      
      * fix saving
      
      * improve dataset
      
      * fix collate fn
      
      * fix collate_fn
      
      * fix collate fn
      
      * fix key name
      
      * fix dataset
      
      * fix collate fn
      
      * concat batch in collate fn
      
      * add grad ckpt
      
      * add option for 8bit adam
      
      * do two forward passes for prior preservation
      
      * Revert "do two forward passes for prior preservation"
      
      This reverts commit 661ca4677e6dccc4ad596c2ee6ca4baad4159e95.
      
      * add option for prior_loss_weight
      
      * add option for clip grad norm
      
      * add more comments
      
      * update readme
      
      * update readme
      
      * Apply suggestions from code review
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * add docstr for dataset
      
      * update the saving logic
      
      * Update examples/dreambooth/README.md
      
      * remove unused imports
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
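The "concat batch in collate fn" and `prior_loss_weight` steps above combine instance and class-prior examples into a single forward pass, then split the predictions to weight the two losses separately. A rough sketch of that arithmetic, with plain lists in place of tensors (all names are illustrative, not the training script's):

```python
def collate_fn(instance_examples, class_examples):
    # Concatenate so instance and class-prior examples share one
    # forward pass instead of two.
    return instance_examples + class_examples

def loss_with_prior_preservation(pred, target, n_instance, prior_loss_weight=1.0):
    # Split predictions back into the instance half and the class half,
    # then combine the two mean-squared errors.
    def mse(p, t):
        return sum((a - b) ** 2 for a, b in zip(p, t)) / len(p)

    instance_loss = mse(pred[:n_instance], target[:n_instance])
    prior_loss = mse(pred[n_instance:], target[n_instance:])
    return instance_loss + prior_loss_weight * prior_loss
```

The prior term penalizes drift on generic class images, which is what keeps a DreamBooth-style fine-tune from collapsing the whole class onto the new subject.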
    • Fix docs link to train_unconditional.py (#642) · bb0c5d15
      Abdullah Alfaraj authored
      The link points to an old location of the train_unconditional.py file.
  15. 21 Sep, 2022 1 commit
  16. 19 Sep, 2022 2 commits
  17. 16 Sep, 2022 1 commit
  18. 15 Sep, 2022 1 commit
      Karras VE, DDIM and DDPM flax schedulers (#508) · b34be039
      Kashif Rasul authored
      * beta never changes removed from state
      
      * fix typos in docs
      
      * removed unused var
      
      * initial ddim flax scheduler
      
      * import
      
      * added dummy objects
      
      * fix style
      
      * fix typo
      
      * docs
      
      * fix typo in comment
      
      * set return type
      
      * added flax ddpm
      
      * fix style
      
      * remake
      
      * pass PRNG key as argument and split before use
      
      * fix doc string
      
      * use config
      
      * added flax Karras VE scheduler
      
      * make style
      
      * fix dummy
      
      * fix ndarray type annotation
      
      * replace returns a new state
      
      * added lms_discrete scheduler
      
      * use self.config
      
      * add_noise needs state
      
      * use config
      
      * use config
      
      * docstring
      
      * added flax score sde ve
      
      * fix imports
      
      * fix typos
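The "pass PRNG key as argument and split before use" step above reflects JAX's functional RNG discipline: a key is never reused; it is split into fresh subkeys, one consumed and one returned. A stdlib-only sketch of the pattern (this `split` merely imitates `jax.random.split`; it is not the JAX function, and `scheduler_step` is a hypothetical example, not a diffusers API):

```python
import random

def split(key):
    # Imitation of jax.random.split: deterministically derive two fresh
    # subkeys from a parent key, which is then never reused.
    rng = random.Random(key)
    return rng.getrandbits(32), rng.getrandbits(32)

def scheduler_step(sample, key):
    # Split first, consume only the subkey for noise, and hand the
    # other key back to the caller so the RNG state stays functional.
    key, noise_key = split(key)
    noise = random.Random(noise_key).gauss(0.0, 1.0)
    return sample + noise, key
```

Because everything is derived deterministically from the key, the same key reproduces the same trajectory — the property the flax schedulers need to stay jittable and pure.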
  19. 08 Sep, 2022 1 commit
  20. 07 Sep, 2022 2 commits
  21. 06 Sep, 2022 1 commit