  1. 17 Apr, 2023 (4 commits)
  2. 14 Apr, 2023 (2 commits)
  3. 13 Apr, 2023 (1 commit)
  4. 12 Apr, 2023 (10 commits)
  5. 11 Apr, 2023 (7 commits)
    • Attn added kv processor torch 2.0 block (#3023) · ea39cd7e
      Will Berman authored
      add AttnAddedKVProcessor2_0 block
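      The "2_0" suffix refers to attention processors built on PyTorch 2.0's fused attention kernel. Below is a minimal, illustrative sketch of that primitive only (not the diffusers AttnAddedKVProcessor2_0 itself); the tensor shapes are assumptions.

      ```python
      import torch
      import torch.nn.functional as F

      # Toy tensors: (batch, heads, tokens, head_dim)
      q = torch.randn(2, 8, 16, 64)
      k = torch.randn(2, 8, 16, 64)
      v = torch.randn(2, 8, 16, 64)

      # Fused attention kernel available since PyTorch 2.0; "2_0" attention
      # processors dispatch to this instead of materializing attention maps.
      out = F.scaled_dot_product_attention(q, k, v)
      print(out.shape)  # torch.Size([2, 8, 16, 64])
      ```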
    • Fix typo and format BasicTransformerBlock attributes (#2953) · 52c4d32d
      Chanchana Sornsoontorn authored
      * chore(train_controlnet) fix typo in logger message
      
      * chore(models) refactor module order to match the calling order
      
      When printing the BasicTransformerBlock to stdout, the attributes should appear in their calling order. Previously, the "3. Feed Forward" comment also made no sense: it should sit next to self.ff, but was instead placed next to self.norm3.
      
      * correct many tests
      
      * remove bogus file
      
      * make style
      
      * correct more tests
      
      * finish tests
      
      * fix one more
      
      * make style
      
      * make unclip deterministic
      
      * chore(models/attention) reorganize comments in BasicTransformerBlock class
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
    • add only cross attention to simple attention blocks (#3011) · c6180a31
      Will Berman authored
      * add only cross attention to simple attention blocks
      
      * add test for only_cross_attention re: @patrickvonplaten
      
      * mid_block_only_cross_attention better default
      
      allow mid_block_only_cross_attention to default to
      `only_cross_attention` when `only_cross_attention` is given
      as a single boolean (see the sketch after this entry)
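      A minimal sketch of the defaulting behavior described in the last bullet above; the function and argument handling are simplified stand-ins, not the actual diffusers UNet code.

      ```python
      from typing import Optional, Tuple, Union

      def resolve_cross_attention_flags(
          only_cross_attention: Union[bool, Tuple[bool, ...]],
          mid_block_only_cross_attention: Optional[bool] = None,
          num_down_blocks: int = 4,
      ):
          # When a single boolean is given, the mid block inherits it by default.
          if mid_block_only_cross_attention is None and isinstance(only_cross_attention, bool):
              mid_block_only_cross_attention = only_cross_attention

          # Expand a single boolean to one flag per down block.
          if isinstance(only_cross_attention, bool):
              only_cross_attention = (only_cross_attention,) * num_down_blocks

          # Anything still unset falls back to False.
          if mid_block_only_cross_attention is None:
              mid_block_only_cross_attention = False

          return only_cross_attention, mid_block_only_cross_attention

      print(resolve_cross_attention_flags(True))
      # ((True, True, True, True), True)
      ```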
    • Fix invocation of some slow Flax tests (#3058) · e3095c5f
      Pedro Cuenca authored
      * Fix invocation of some slow tests.
      
      We use __call__ rather than pmapping the generation function ourselves
      because the number of static arguments is different now (see the sketch after this entry).
      
      * style
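      A minimal sketch of why the tests now call the pipeline's __call__ instead of pmapping the generation function themselves; the function and its arguments are illustrative assumptions, not the actual Flax pipeline.

      ```python
      import jax

      def generate(params, prompt_ids, num_inference_steps, guidance_scale):
          # Placeholder for a generation step; the real pipeline does much more.
          return prompt_ids * guidance_scale + num_inference_steps

      # A hand-rolled wrapper must hard-code which argument positions are static.
      # If the pipeline changes how many arguments are static (as happened here),
      # these indices go stale and the test breaks, whereas pipeline.__call__
      # handles its own pmapping internally.
      p_generate = jax.pmap(generate, static_broadcasted_argnums=(2, 3))
      ```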
    • config fixes (#3060) · 80bc0c0c
      Will Berman authored
    • Fix config prints and save, load of pipelines (#2849) · 8b451eb6
      Patrick von Platen authored
      * [Config] Fix config prints and save, load
      
      * Only use potential nn.Modules for dtype and device
      
      * Correct vae image processor
      
      * make sure in_channels is not accessed directly
      
      * make sure in_channels is only accessed via config (see the sketch after this entry)
      
      * Make sure schedulers only access config attributes
      
      * Make sure to access config in SAG
      
      * Fix vae processor and make style
      
      * add tests
      
      * uP
      
      * make style
      
      * Fix more naming issues
      
      * Final fix with vae config
      
      * change more
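      A minimal sketch of the access pattern these changes enforce, using simplified stand-ins rather than the actual diffusers classes: hyperparameters are read through a frozen config object (model.config.in_channels) instead of as bare attributes on the model.

      ```python
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class FrozenConfig:
          in_channels: int = 4
          sample_size: int = 64

      class ToyUNet:
          def __init__(self, config: FrozenConfig):
              self.config = config  # all hyperparameters are reached through here

      def prepare_latent_shape(model: ToyUNet, batch_size: int):
          # Read values via the config, never via a bare attribute on the model.
          cfg = model.config
          return (batch_size, cfg.in_channels, cfg.sample_size, cfg.sample_size)

      model = ToyUNet(FrozenConfig())
      print(prepare_latent_shape(model, 2))  # (2, 4, 64, 64)
      ```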
    • mps: skip unstable test (#3037) · fbc9a736
      Pedro Cuenca authored
  6. 10 Apr, 2023 (4 commits)
  7. 06 Apr, 2023 (1 commit)
  8. 05 Apr, 2023 (1 commit)
  9. 04 Apr, 2023 (1 commit)
  10. 31 Mar, 2023 (3 commits)
  11. 30 Mar, 2023 (1 commit)
  12. 28 Mar, 2023 (4 commits)
  13. 27 Mar, 2023 (1 commit)