1. 11 Apr, 2023 8 commits
    • unet time embedding activation function (#3048) · 2d52e81c
      Will Berman authored
      * unet time embedding activation function
      
      * typo act_fn -> time_embedding_act_fn
      
      * flatten conditional
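
      A minimal sketch of how the new option might be used; the argument name is taken
      from the commit bullets above, the tiny block sizes are test-style placeholders,
      and the accepted activation strings (e.g. "silu") are assumed:

          from diffusers import UNet2DConditionModel

          # Small, test-sized UNet; only `time_embedding_act_fn` is the new part.
          unet = UNet2DConditionModel(
              block_out_channels=(32, 64),
              down_block_types=("CrossAttnDownBlock2D", "DownBlock2D"),
              up_block_types=("UpBlock2D", "CrossAttnUpBlock2D"),
              cross_attention_dim=32,
              time_embedding_act_fn="silu",  # assumed value; applies an activation to the time embedding
          )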
    • Fix typo and format BasicTransformerBlock attributes (#2953) · 52c4d32d
      Chanchana Sornsoontorn authored
      * chore(train_controlnet) fix typo in logger message
      
      * chore(models) refactor module order; make it match the calling order
      
      When printing the BasicTransformerBlock to stdout, the attributes should appear in the same order they are called in. Previously, the "3. Feed Forward" comment also made no sense: it should be close to self.ff, but it was next to self.norm3 instead.
      
      * correct many tests
      
      * remove bogus file
      
      * make style
      
      * correct more tests
      
      * finish tests
      
      * fix one more
      
      * make style
      
      * make unclip deterministic
      
      * chore(models/attention) reorganize comments in BasicTransformerBlock class
      
      ---------
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
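
      A simplified sketch of the ordering described above (illustrative stand-in, not the
      diffusers class; nn.MultiheadAttention stands in for the real attention modules):

          import torch.nn as nn

          class BlockSketch(nn.Module):
              def __init__(self, dim: int, num_heads: int = 8):
                  super().__init__()
                  # 1. Self-Attention
                  self.norm1 = nn.LayerNorm(dim)
                  self.attn1 = nn.MultiheadAttention(dim, num_heads, batch_first=True)
                  # 2. Cross-Attention
                  self.norm2 = nn.LayerNorm(dim)
                  self.attn2 = nn.MultiheadAttention(dim, num_heads, batch_first=True)
                  # 3. Feed Forward
                  self.norm3 = nn.LayerNorm(dim)
                  self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

          print(BlockSketch(64))  # the printed repr now mirrors the calling order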
    • add only cross attention to simple attention blocks (#3011) · c6180a31
      Will Berman authored
      * add only cross attention to simple attention blocks
      
      * add test for only_cross_attention re: @patrickvonplaten
      
      * mid_block_only_cross_attention better default
      
      allow mid_block_only_cross_attention to default to
      `only_cross_attention` when `only_cross_attention` is given
      as a single boolean (see the sketch below)
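
      A minimal sketch of the defaulting behavior described in the last bullet
      (hypothetical helper, not the literal diffusers source):

          def resolve_mid_block_flag(only_cross_attention, mid_block_only_cross_attention=None):
              # A single bool for `only_cross_attention` is inherited by the mid block
              # when `mid_block_only_cross_attention` is not set explicitly.
              if mid_block_only_cross_attention is None and isinstance(only_cross_attention, bool):
                  mid_block_only_cross_attention = only_cross_attention
              return bool(mid_block_only_cross_attention)

          print(resolve_mid_block_flag(True))            # True: inherited from the single bool
          print(resolve_mid_block_flag((True, False)))   # False: per-block tuple, nothing inherited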
    • Fix scheduler type mismatch (#3041) · 526827c3
      Pedro Cuenca authored
      The mismatch occurred when doing generation manually and using
      guidance_scale as a static argument.
    • Update documentation (#2996) · cb63febf
      George Ogden authored
      * Update documentation
      
      Because the samples halve in size at each downsampling step, the width and height must be powers of 2 (a small divisibility check is sketched below)
      
      * make style
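
      A small illustration of the constraint (hypothetical helper, not part of diffusers):
      with n downsampling stages that each halve the spatial size, width and height must
      be divisible by 2**n, which is what the power-of-2 guidance guarantees:

          def check_resolution(width: int, height: int, num_downsamples: int = 3) -> None:
              # Each downsampling stage halves the spatial size, so both dimensions
              # must be divisible by 2 ** num_downsamples.
              factor = 2 ** num_downsamples
              for name, value in (("width", width), ("height", height)):
                  if value % factor:
                      raise ValueError(f"{name}={value} is not divisible by {factor}")

          check_resolution(512, 512)     # fine: 512 -> 256 -> 128 -> 64
          # check_resolution(512, 500)   # would raise: 500 cannot be halved three times cleanly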
    • `AttentionProcessor.group_norm` num_channels should be `query_dim` (#3046) · 8c6b47cf
      Will Berman authored
      * `AttentionProcessor.group_norm` num_channels should be `query_dim`
      
      The group_norm on the attention processor should normalize the number
      of channels in the query, _not_ the inner dim. This wasn't caught before
      because the group_norm is only used by the added-KV attention processors,
      and those processors are only used by the Karlo models, which are
      configured such that the inner dim equals the query dim (a dimension
      sketch follows below).
      
      * add_{k,v}_proj should be projecting to inner_dim
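
      A dimension sketch of the fix (illustrative shapes only; hidden states are assumed
      to be laid out as (batch, channels, sequence) here):

          import torch
          import torch.nn as nn

          query_dim, heads, dim_head = 320, 8, 64
          inner_dim = heads * dim_head                   # 512, need not equal query_dim

          hidden_states = torch.randn(2, query_dim, 77)  # channels == query_dim
          group_norm = nn.GroupNorm(num_groups=32, num_channels=query_dim)
          print(group_norm(hidden_states).shape)         # works: norm runs over query_dim channels

          # nn.GroupNorm(num_groups=32, num_channels=inner_dim)(hidden_states)  # would fail: channel mismatch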
    • config fixes (#3060) · 80bc0c0c
      Will Berman authored
    • Fix config prints and save, load of pipelines (#2849) · 8b451eb6
      Patrick von Platen authored
      * [Config] Fix config prints and save, load
      
      * Only use potential nn.Modules for dtype and device
      
      * Correct vae image processor
      
      * make sure in_channels is not accessed directly
      
      * make sure in_channels is only accessed via config
      
      * Make sure schedulers only access config attributes
      
      * Make sure to access config in SAG
      
      * Fix vae processor and make style
      
      * add tests
      
      * up
      
      * make style
      
      * Fix more naming issues
      
      * Final fix with vae config
      
      * change more
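
      A hedged example of the access pattern this change enforces (the model id is only
      an example): init arguments live on `.config`, so read them there instead of
      relying on mirrored attributes:

          from diffusers import UNet2DConditionModel

          unet = UNet2DConditionModel.from_pretrained(
              "CompVis/stable-diffusion-v1-4", subfolder="unet"
          )
          print(unet.config.in_channels)   # preferred: read init arguments via the config
          # print(unet.in_channels)        # discouraged: direct attribute access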
  2. 10 Apr, 2023 13 commits
  3. 06 Apr, 2023 3 commits
  4. 05 Apr, 2023 1 commit
  5. 04 Apr, 2023 1 commit
  6. 31 Mar, 2023 10 commits
  7. 30 Mar, 2023 1 commit
  8. 28 Mar, 2023 3 commits