Unverified commit 080081bd authored by Chi, committed by GitHub

Remove the redundant line from the adapter.py file. (#5618)



* I added a new docstring to the class, making it easier for other developers to understand what it does and where it is used.

* Update src/diffusers/models/unet_2d_blocks.py

This change was suggested by the maintainer.
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>

* Update src/diffusers/models/unet_2d_blocks.py

Added the suggested text.
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>

* Update unet_2d_blocks.py

I changed the `Parameter` heading to `Args`.

* Update unet_2d_blocks.py

Set proper indentation in this file.

* Update unet_2d_blocks.py

Made a small change to the `act_fun` argument line.

* I ran the `black` command to reformat the code style.

* Update unet_2d_blocks.py

Added a docstring similar to the one in the original diffusion repository.

* I removed the dummy variable defined in both the encoder and decoder.

* I ran `black` to reformat my file.

* Remove the redundant line from the adapter.py file.

* Used `black` to reformat my file.

---------
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Co-authored-by: Dhruv Nair <dhruv.nair@gmail.com>
parent dd9a5caf
@@ -456,9 +456,8 @@ class AdapterResnetBlock(nn.Module):
         This method takes input tensor x and applies a convolutional layer, ReLU activation, and another convolutional
         layer on the input tensor. It returns addition with the input tensor.
         """
-        h = x
-        h = self.block1(h)
-        h = self.act(h)
+        h = self.act(self.block1(x))
         h = self.block2(h)
         return h + x
@@ -578,9 +577,8 @@ class LightAdapterResnetBlock(nn.Module):
         This function takes input tensor x and processes it through one convolutional layer, ReLU activation, and
         another convolutional layer and adds it to input tensor.
         """
-        h = x
-        h = self.block1(h)
-        h = self.act(h)
+        h = self.act(self.block1(x))
         h = self.block2(h)
         return h + x