chenpangpang / transformers

Commit 183ce067 (unverified)
Authored Jan 21, 2022 by novice, committed by GitHub on Jan 21, 2022
Parent: b4ce313e

Fix (#15276)

* Fix
* make style
* Remove trailing commas
* make style
Showing 3 changed files with 4 additions and 14 deletions:

* src/transformers/models/swin/__init__.py (+1, -1)
* src/transformers/models/swin/configuration_swin.py (+1, -1)
* src/transformers/models/swin/modeling_swin.py (+2, -12)
src/transformers/models/swin/__init__.py

@@ -18,7 +18,7 @@
 from typing import TYPE_CHECKING
 
 # rely on isort to merge the imports
-from ...file_utils import _LazyModule, is_flax_available, is_tf_available, is_torch_available, is_vision_available
+from ...file_utils import _LazyModule, is_torch_available
 
 
 _import_structure = {
src/transformers/models/swin/configuration_swin.py

@@ -53,7 +53,7 @@ class SwinConfig(PretrainedConfig):
         window_size (`int`, *optional*, defaults to 7):
             Size of windows.
         mlp_ratio (`float`, *optional*, defaults to 4.0):
-            Ratio of MLP hidden dimesionality to embedding dimensionality.
+            Ratio of MLP hidden dimensionality to embedding dimensionality.
         qkv_bias (`bool`, *optional*, defaults to True):
             Whether or not a learnable bias should be added to the queries, keys and values.
         hidden_dropout_prob (`float`, *optional*, defaults to 0.0):
src/transformers/models/swin/modeling_swin.py

@@ -583,20 +583,10 @@ class SwinEncoder(nn.Module):
             all_hidden_states = all_hidden_states + (hidden_states,)
 
         if not return_dict:
-            return tuple(
-                v
-                for v in [
-                    hidden_states,
-                    all_hidden_states,
-                    all_self_attentions,
-                ]
-                if v is not None
-            )
+            return tuple(v for v in [hidden_states, all_hidden_states, all_self_attentions] if v is not None)
 
         return BaseModelOutput(
-            last_hidden_state=hidden_states,
-            hidden_states=all_hidden_states,
-            attentions=all_self_attentions,
+            last_hidden_state=hidden_states, hidden_states=all_hidden_states, attentions=all_self_attentions
         )
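A note on the modeling_swin.py change: `make style` formats the code with black, and black treats a trailing comma inside brackets as a request to keep the enclosed elements on separate lines. Removing the trailing commas is what lets black collapse both return statements, which is why this file loses a net 10 lines. Below is a minimal sketch of that behavior using placeholder values, not code from this commit.

# Minimal sketch, assuming black's "magic trailing comma" behavior;
# the values below are placeholders, not code from the commit.

# With a trailing comma after the last list element, black keeps this exploded:
kept_exploded = tuple(
    v
    for v in [
        1,
        None,
        3,
    ]
    if v is not None
)

# Without the trailing comma (and within the line-length limit), black collapses
# the same expression onto a single line:
collapsed = tuple(v for v in [1, None, 3] if v is not None)

# Both formattings are the same expression and evaluate identically.
assert kept_exploded == collapsed == (1, 3)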