1. 16 Feb, 2023 7 commits
    • [Pipelines] Adds pix2pix zero (#2334) · fd3d5502
      Sayak Paul authored
      * add: support for BLIP generation.
      
      * add: support for editing synthetic images.
      
      * remove unnecessary comments.
      
      * add inits and run make fix-copies.
      
      * version change of diffusers.
      
      * fix: condition for loading the captioner.
      
      * default conditions_input_image to False.
      
      * guidance_amount -> cross_attention_guidance_amount
      
      * fix inputs to check_inputs()
      
      * fix: attribute.
      
      * fix: prepare_attention_mask() call.
      
      * debugging.
      
      * better placement of references.
      
      * remove torch.no_grad() decorations.
      
      * put torch.no_grad() context before the first denoising loop.
      
      * detach() latents before decoding them.
      
      * put decoding in a torch.no_grad() context.
      
      * add reconstructed image for debugging.
      
      * no_grad()
      
      * apply formatting.
      
      * address one-off suggestions from the draft PR.
      
      * back to torch.no_grad() and add more elaborate comments.
      
      * refactor prepare_unet() per Patrick's suggestions.
      
      * more elaborate description for .
      
      * formatting.
      
      * add docstrings to the methods specific to pix2pix zero.
      
      * suspecting a redundant noise prediction.
      
      * needed for gradient computation chain.
      
      * less hacks.
      
      * fix: attention mask handling within the processor.
      
      * remove attention reference map computation.
      
      * fix: cross attn args.
      
      * fix: processor.
      
      * store attention maps.
      
      * fix: attention processor.
      
      * update docs and better treatment to xa args.
      
      * update the final noise computation call.
      
      * change xa args call.
      
      * remove xa args option from the pipeline.
      
      * add: docs.
      
      * first test.
      
      * fix: url call.
      
      * fix: argument call.
      
      * remove image conditioning for now.
      
      * 🚨 add: fast tests.
      
      * explicit placement of the xa attn weights.
      
      * add: slow tests 🐢
      
      * fix: tests.
      
      * edited direction embedding should be on the same device as prompt_embeds.
      
      * debugging message.
      
      * debugging.
      
      * add pix2pix zero pipeline for a non-deterministic test.
      
      * debugging.
      
      * remove debugging message.
      
      * make caption generation _
      
      * address comments (part I).
      
      * address PR comments (part II)
      
      * fix: DDPM test assertion.
      
      * refactor doc.
      
      * address PR comments (part III).
      
      * fix: type annotation for the scheduler.
      
      * apply styling.
      
      * skip_mps and add note on embeddings in the docs.
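      For reference, a minimal usage sketch of the pipeline this PR adds. The class name and the source_embeds/target_embeds/cross_attention_guidance_amount arguments follow the PR; the cat.pt and dog.pt files are hypothetical placeholders for precomputed source/target text embeddings.

          import torch
          from diffusers import DDIMScheduler, StableDiffusionPix2PixZeroPipeline

          pipeline = StableDiffusionPix2PixZeroPipeline.from_pretrained(
              "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
          ).to("cuda")
          # pix2pix zero relies on DDIM-style deterministic sampling
          pipeline.scheduler = DDIMScheduler.from_config(pipeline.scheduler.config)

          # text embeddings describing the edit direction, e.g. "cat" -> "dog"
          source_embeds = torch.load("cat.pt")  # hypothetical precomputed file
          target_embeds = torch.load("dog.pt")  # hypothetical precomputed file

          image = pipeline(
              "a high resolution painting of a cat in the style of van gogh",
              source_embeds=source_embeds,
              target_embeds=target_embeds,
              num_inference_steps=50,
              cross_attention_guidance_amount=0.15,  # renamed from guidance_amount above
          ).images[0]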
    • [Variant] Add "variant" as input kwarg so as to have better UX when downloading no_ema or fp16 weights (#2305) · e5810e68
      Patrick von Platen authored
      
      * [Variant] Add variant loading mechanism
      
      * clean
      
      * improve further
      
      * up
      
      * add tests
      
      * add some first tests
      
      * up
      
      * up
      
      * use path splitext
      
      * add deprecate
      
      * deprecation warnings
      
      * improve docs
      
      * up
      
      * up
      
      * up
      
      * fix tests
      
      * Apply suggestions from code review
      Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
      
      * Apply suggestions from code review
      Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
      
      * correct code format
      
      * fix warning
      
      * finish
      
      * Apply suggestions from code review
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Apply suggestions from code review
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Update docs/source/en/using-diffusers/loading.mdx
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * Apply suggestions from code review
      Co-authored-by: Will Berman <wlbberman@gmail.com>
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      
      * correct loading docs
      
      * finish
      
      ---------
      Co-authored-by: Pedro Cuenca <pedro@huggingface.co>
      Co-authored-by: Suraj Patil <surajp815@gmail.com>
      Co-authored-by: Will Berman <wlbberman@gmail.com>
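      The effect of the new kwarg, as a minimal sketch (the checkpoint name is just an example of a repository that publishes an fp16 variant):

          import torch
          from diffusers import DiffusionPipeline

          # download only the fp16 weight files instead of the full-precision ones
          pipe = DiffusionPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5", variant="fp16", torch_dtype=torch.float16
          )

          # saving with a variant writes variant-suffixed weight files,
          # e.g. diffusion_pytorch_model.fp16.bin
          pipe.save_pretrained("./sd15-fp16", variant="fp16")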
    • Fix 3-way merging with the checkpoint_merger community pipeline (#2355) · e3ddbe25
      Damian Stewart authored
      correctly locate 3rd file; also correct misleading docs
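      A hedged sketch of the 3-way merge path this commit fixes, assuming the merge() API described in the community pipeline's README (the three model IDs are arbitrary examples; "add_diff" is the interpolation mode that requires a third checkpoint):

          from diffusers import DiffusionPipeline

          pipe = DiffusionPipeline.from_pretrained(
              "CompVis/stable-diffusion-v1-4", custom_pipeline="checkpoint_merger"
          )

          # the add-difference method merges three checkpoints; correctly
          # locating the third one is what this fix addresses
          merged = pipe.merge(
              [
                  "CompVis/stable-diffusion-v1-4",
                  "runwayml/stable-diffusion-v1-5",
                  "prompthero/openjourney",
              ],
              interp="add_diff",
              alpha=0.5,
          )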
    • add total number of checkpoints to training scripts (#2367) · 296b01e1
      Will Berman authored
      
      
      * add total number of checkpoints to training scripts
      
      * Update examples/dreambooth/train_dreambooth.py
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
      
      ---------
      Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
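      A sketch of the resulting training invocation, assuming the limit is exposed as --checkpoints_total_limit alongside the existing --checkpointing_steps flag (paths and prompts are placeholders):

          accelerate launch examples/dreambooth/train_dreambooth.py \
            --pretrained_model_name_or_path="CompVis/stable-diffusion-v1-4" \
            --instance_data_dir="./dog-images" \
            --instance_prompt="a photo of sks dog" \
            --output_dir="./dreambooth-out" \
            --checkpointing_steps=500 \
            --checkpoints_total_limit=3  # keep at most 3 intermediate checkpoints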
    • Will Berman · a3ae4661
    • Funky spacing issue (#2368) · c613288c
      meg authored
      There isn't a space between the "Scope" paragraph and "Ethical Guidelines" here: https://huggingface.co/docs/diffusers/main/en/conceptual/ethical_guidelines, yet I can't reproduce it in the preview. In this PR, I'm simply adding some spaces in the hope that it resolves the issue.
  2. 15 Feb, 2023 4 commits
  3. 14 Feb, 2023 4 commits
  4. 13 Feb, 2023 12 commits
  5. 10 Feb, 2023 5 commits
  6. 09 Feb, 2023 3 commits
  7. 08 Feb, 2023 5 commits