renzhc / diffusers_dcu · Commits

Commit db91e710 authored Oct 02, 2023 by Patrick von Platen

make style

parent 2a62aadc
Showing 1 changed file with 20 additions and 20 deletions

src/diffusers/models/adapter.py  (+20 / -20)

The commit re-wraps the docstrings of several `forward` methods in this file.
@@ -259,10 +259,10 @@ class T2IAdapter(ModelMixin, ConfigMixin):

    def forward(self, x: torch.Tensor) -> List[torch.Tensor]:
        r"""
        This function processes the input tensor `x` through the adapter model and returns a list of feature
        tensors, each representing information extracted at a different scale from the input. The length of the
        list is determined by the number of downsample blocks in the Adapter, as specified by the `channels` and
        `num_res_blocks` parameters during initialization.
        """
        return self.adapter(x)
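For orientation, here is a minimal usage sketch of the `T2IAdapter.forward` call documented above. The constructor arguments are assumed default-style values (channel widths, two residual blocks per level, downscale factor 8) and are not part of this commit; treat the exact values and output shapes as illustrative only.

import torch
from diffusers.models.adapter import T2IAdapter

# Assumed configuration; the values below are illustrative, not taken from this diff.
adapter = T2IAdapter(
    in_channels=3,
    channels=[320, 640, 1280, 1280],
    num_res_blocks=2,
    downscale_factor=8,
)

image = torch.randn(1, 3, 512, 512)  # conditioning image as a tensor
features = adapter(image)            # invokes forward(x)

# One feature tensor per entry in `channels`, at progressively smaller
# spatial resolutions.
print(len(features))
for f in features:
    print(f.shape)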
@@ -303,10 +303,10 @@ class FullAdapter(nn.Module):

    def forward(self, x: torch.Tensor) -> List[torch.Tensor]:
        r"""
        This method processes the input tensor `x` through the FullAdapter model and performs operations
        including pixel unshuffling, convolution, and a stack of AdapterBlocks. It returns a list of feature
        tensors, each capturing information at a different stage of processing within the FullAdapter model. The
        number of feature tensors in the list is determined by the number of downsample blocks specified during
        initialization.
        """
        x = self.unshuffle(x)
        x = self.conv_in(x)
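The "pixel unshuffling" mentioned in this docstring is a space-to-depth rearrangement. The shape arithmetic can be seen with PyTorch's built-in `nn.PixelUnshuffle`; the factor of 8 below mirrors the adapter's usual downscale factor and is an assumption here, not something stated in the diff.

import torch
import torch.nn as nn

unshuffle = nn.PixelUnshuffle(downscale_factor=8)

x = torch.randn(1, 3, 512, 512)
y = unshuffle(x)

# Each spatial side shrinks by 8x while the channel count grows by 8 * 8.
print(y.shape)  # torch.Size([1, 192, 64, 64])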
@@ -351,7 +351,7 @@ class FullAdapterXL(nn.Module):

    def forward(self, x: torch.Tensor) -> List[torch.Tensor]:
        r"""
        This method takes the tensor `x` as input and processes it through the FullAdapterXL model. It consists
        of operations including unshuffling pixels, applying a convolution layer, and appending the output of
        each block to a list of feature tensors.
        """
        x = self.unshuffle(x)
@@ -384,9 +384,9 @@ class AdapterBlock(nn.Module):

    def forward(self, x):
        r"""
        This method takes tensor `x` as input and applies downsampling and an input convolution if the
        `self.downsample` and `self.in_conv` properties of the AdapterBlock are set. It then applies a series of
        residual blocks to the input tensor.
        """
        if self.downsample is not None:
            x = self.downsample(x)
@@ -408,8 +408,8 @@ class AdapterResnetBlock(nn.Module):

    def forward(self, x):
        r"""
        This method takes the input tensor `x` and applies a convolutional layer, a ReLU activation, and another
        convolutional layer to it. It returns the result added to the input tensor.
        """
        h = x
        h = self.block1(h)
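The residual pattern this docstring describes (conv, ReLU, conv, then add the input back) is sketched below. This is a simplified stand-in, not the actual `AdapterResnetBlock` code; the kernel sizes and channel count are assumptions made for illustration.

import torch
import torch.nn as nn

class TinyResnetBlock(nn.Module):
    # Simplified conv -> ReLU -> conv -> skip-add pattern.
    def __init__(self, channels: int):
        super().__init__()
        self.block1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.ReLU()
        self.block2 = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        h = self.block1(x)
        h = self.act(h)
        h = self.block2(h)
        return h + x  # residual connection back to the input

x = torch.randn(1, 320, 64, 64)
print(TinyResnetBlock(320)(x).shape)  # torch.Size([1, 320, 64, 64])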
@@ -451,8 +451,8 @@ class LightAdapter(nn.Module):

    def forward(self, x):
        r"""
        This method takes the input tensor `x`, performs downscaling, and appends the result to a list of
        feature tensors. Each feature tensor corresponds to a different level of processing within the
        LightAdapter.
        """
        x = self.unshuffle(x)
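The "appends the result to a list of feature tensors" phrasing corresponds to a simple accumulation loop over the adapter's blocks, roughly of this shape (a sketch of the pattern, not the exact LightAdapter code):

import torch
import torch.nn as nn
from typing import List

def run_body(body: nn.ModuleList, x: torch.Tensor) -> List[torch.Tensor]:
    # Run each block in sequence and keep every intermediate output
    # as a separate multi-scale feature map.
    features = []
    for block in body:
        x = block(x)
        features.append(x)
    return features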
@@ -480,8 +480,8 @@ class LightAdapterBlock(nn.Module):

    def forward(self, x):
        r"""
        This method takes tensor `x` as input and performs downsampling if required. It then applies the input
        convolution layer, a sequence of residual blocks, and the output convolution layer.
        """
        if self.downsample is not None:
            x = self.downsample(x)
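Putting the steps of this docstring together, the overall control flow of such a block is roughly as follows. This is a hedged sketch with made-up layer choices (average pooling for downsampling, plain convolutions standing in for the residual stack), not the library's `LightAdapterBlock` implementation.

import torch
import torch.nn as nn

class TinyLightBlock(nn.Module):
    # Sketch of: optional downsample, input conv, residual stack, output conv.
    def __init__(self, in_channels: int, out_channels: int, num_res_blocks: int, down: bool = False):
        super().__init__()
        self.downsample = nn.AvgPool2d(kernel_size=2, ceil_mode=True) if down else None
        self.in_conv = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Stand-in for the residual blocks (see the resnet sketch above).
        self.resnets = nn.Sequential(
            *[nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1) for _ in range(num_res_blocks)]
        )
        self.out_conv = nn.Conv2d(out_channels, out_channels, kernel_size=1)

    def forward(self, x):
        if self.downsample is not None:
            x = self.downsample(x)
        x = self.in_conv(x)
        x = self.resnets(x)
        return self.out_conv(x)

x = torch.randn(1, 192, 64, 64)
print(TinyLightBlock(192, 320, num_res_blocks=4, down=True)(x).shape)  # torch.Size([1, 320, 32, 32])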
@@ -502,8 +502,8 @@ class LightAdapterResnetBlock(nn.Module):

    def forward(self, x):
        r"""
        This function takes the input tensor `x`, processes it through one convolutional layer, a ReLU
        activation, and another convolutional layer, and adds the result to the input tensor.
        """
        h = x
        h = self.block1(h)