Commit db91e710 authored by Patrick von Platen

make style

parent 2a62aadc
@@ -260,9 +260,9 @@ class T2IAdapter(ModelMixin, ConfigMixin):
     def forward(self, x: torch.Tensor) -> List[torch.Tensor]:
         r"""
         This function processes the input tensor `x` through the adapter model and returns a list of feature tensors,
-        each representing information extracted at a different scale from the input.
-        The length of the list is determined by the number of downsample blocks in the Adapter, as specified
-        by the `channels` and `num_res_blocks` parameters during initialization.
+        each representing information extracted at a different scale from the input. The length of the list is
+        determined by the number of downsample blocks in the Adapter, as specified by the `channels` and
+        `num_res_blocks` parameters during initialization.
         """
         return self.adapter(x)
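
For orientation (not part of this commit): a minimal usage sketch of this forward pass. It assumes the public `diffusers` export of `T2IAdapter` and leaves constructor arguments other than `channels` and `num_res_blocks` at their defaults; shapes are illustrative.

import torch
from diffusers import T2IAdapter

# channels / num_res_blocks are the parameters the docstring refers to.
adapter = T2IAdapter(channels=[320, 640, 1280, 1280], num_res_blocks=2)

cond = torch.randn(1, 3, 512, 512)   # conditioning image, e.g. a depth map or sketch
features = adapter(cond)             # List[torch.Tensor], one entry per downsample block
print(len(features))                 # 4, one per entry in `channels`
print([tuple(f.shape) for f in features])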
@@ -304,9 +304,9 @@ class FullAdapter(nn.Module):
     def forward(self, x: torch.Tensor) -> List[torch.Tensor]:
         r"""
         This method processes the input tensor `x` through the FullAdapter model and performs operations including
-        pixel unshuffling, convolution, and a stack of AdapterBlocks. It returns a list of feature tensors, each capturing information
-        at a different stage of processing within the FullAdapter model. The number of feature tensors in the list is determined
-        by the number of downsample blocks specified during initialization.
+        pixel unshuffling, convolution, and a stack of AdapterBlocks. It returns a list of feature tensors, each
+        capturing information at a different stage of processing within the FullAdapter model. The number of feature
+        tensors in the list is determined by the number of downsample blocks specified during initialization.
         """
         x = self.unshuffle(x)
         x = self.conv_in(x)
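
A standalone sketch (not part of this commit) of the unshuffle-then-convolve entry point the docstring describes, using a hypothetical downscale factor of 8 and an assumed 3x3 input convolution:

import torch
import torch.nn as nn

downscale_factor = 8
unshuffle = nn.PixelUnshuffle(downscale_factor)
conv_in = nn.Conv2d(3 * downscale_factor**2, 320, kernel_size=3, padding=1)

x = torch.randn(1, 3, 512, 512)
x = unshuffle(x)   # (1, 192, 64, 64): spatial size / 8, channels * 8**2
x = conv_in(x)     # (1, 320, 64, 64): projected to the first block width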
@@ -385,8 +385,8 @@ class AdapterBlock(nn.Module):
     def forward(self, x):
         r"""
         This method takes tensor x as input and performs operations downsampling and convolutional layers if the
-        self.downsample and self.in_conv properties of AdapterBlock model are specified. Then it applies a series
-        of residual blocks to the input tensor.
+        self.downsample and self.in_conv properties of AdapterBlock model are specified. Then it applies a series of
+        residual blocks to the input tensor.
         """
         if self.downsample is not None:
             x = self.downsample(x)
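
A hedged sketch (not part of this commit) of the conditional structure this docstring describes; the average-pool downsample, the 1x1 channel-matching convolution, and the `nn.Identity` standing in for the residual stack are assumptions for illustration:

import torch
import torch.nn as nn

class AdapterBlockSketch(nn.Module):
    def __init__(self, in_channels, out_channels, down=False):
        super().__init__()
        # Either sub-module may be absent, exactly as the forward pass above checks.
        self.downsample = nn.AvgPool2d(kernel_size=2, ceil_mode=True) if down else None
        self.in_conv = nn.Conv2d(in_channels, out_channels, kernel_size=1) if in_channels != out_channels else None
        self.resnets = nn.Identity()  # placeholder for the residual blocks

    def forward(self, x):
        if self.downsample is not None:
            x = self.downsample(x)
        if self.in_conv is not None:
            x = self.in_conv(x)
        return self.resnets(x)

block = AdapterBlockSketch(320, 640, down=True)
print(block(torch.randn(1, 320, 64, 64)).shape)  # torch.Size([1, 640, 32, 32])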
@@ -408,8 +408,8 @@ class AdapterResnetBlock(nn.Module):
     def forward(self, x):
         r"""
-        This method takes input tensor x and applies a convolutional layer, ReLU activation,
-        and another convolutional layer on the input tensor. It returns addition with the input tensor.
+        This method takes input tensor x and applies a convolutional layer, ReLU activation, and another convolutional
+        layer on the input tensor. It returns addition with the input tensor.
         """
         h = x
         h = self.block1(h)
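
The residual recipe in this docstring, written out as a standalone trace (not part of this commit; the kernel sizes and the width of 320 are assumptions):

import torch
import torch.nn as nn

channels = 320
block1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
act = nn.ReLU()
block2 = nn.Conv2d(channels, channels, kernel_size=1)

x = torch.randn(1, channels, 64, 64)
h = block2(act(block1(x)))
out = h + x   # skip connection: output keeps the input's shape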
@@ -451,8 +451,8 @@ class LightAdapter(nn.Module):
     def forward(self, x):
         r"""
-        This method takes the input tensor x and performs downscaling and appends it in list of feature tensors.
-        Each feature tensor corresponds to a different level of processing within the LightAdapter.
+        This method takes the input tensor x and performs downscaling and appends it in list of feature tensors. Each
+        feature tensor corresponds to a different level of processing within the LightAdapter.
         """
         x = self.unshuffle(x)
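
For comparison with the full adapter above, a hedged sketch (not part of this commit) of driving the light variant through the same `T2IAdapter` entry point; the `adapter_type` value and the channel/block settings are assumptions, not taken from this diff:

import torch
from diffusers import T2IAdapter

light = T2IAdapter(channels=[320, 640, 1280], num_res_blocks=4, adapter_type="light_adapter")
features = light(torch.randn(1, 3, 512, 512))
print(len(features), [f.shape[1] for f in features])  # feature count and widths of the light variant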
@@ -480,8 +480,8 @@ class LightAdapterBlock(nn.Module):
     def forward(self, x):
         r"""
-        This method takes tensor x as input and performs downsampling if required.
-        Then it applies in convolution layer, a sequence of residual blocks, and out convolutional layer.
+        This method takes tensor x as input and performs downsampling if required. Then it applies in convolution
+        layer, a sequence of residual blocks, and out convolutional layer.
         """
         if self.downsample is not None:
             x = self.downsample(x)
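
A hedged shape trace (not part of this commit) of the downsample, in-convolution, residual stack, and out-convolution pipeline described here, with assumed widths (640 in, 160 internal, 640 out) and an assumed average-pool downsample:

import torch
import torch.nn as nn

downsample = nn.AvgPool2d(kernel_size=2)
in_conv = nn.Conv2d(640, 160, kernel_size=1)
resnets = nn.Identity()   # stands in for the sequence of residual blocks
out_conv = nn.Conv2d(160, 640, kernel_size=1)

x = torch.randn(1, 640, 32, 32)
x = downsample(x)   # (1, 640, 16, 16): only applied when the block downsamples
x = in_conv(x)      # (1, 160, 16, 16): squeeze to a narrower working width
x = resnets(x)
x = out_conv(x)     # (1, 640, 16, 16): expand back for the returned feature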
@@ -502,8 +502,8 @@ class LightAdapterResnetBlock(nn.Module):
     def forward(self, x):
         r"""
-        This function takes input tensor x and processes it through one convolutional layer, ReLU activation,
-        and another convolutional layer and adds it to input tensor.
+        This function takes input tensor x and processes it through one convolutional layer, ReLU activation, and
+        another convolutional layer and adds it to input tensor.
         """
         h = x
         h = self.block1(h)
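
The same residual pattern as above, packaged here (not part of this commit) as a module and stacked the way both adapters consume their residual blocks; the 3x3 kernels and the width of 160 are assumptions:

import torch
import torch.nn as nn

class ResidualSketch(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.block1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.ReLU()
        self.block2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        return x + self.block2(self.act(self.block1(x)))

stack = nn.Sequential(*[ResidualSketch(160) for _ in range(4)])
print(stack(torch.randn(1, 160, 16, 16)).shape)  # torch.Size([1, 160, 16, 16])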