chenpangpang / transformers · Commits

Commit d9daad98
Re-ordering of group_idx/layer_idx + Python 2 tests

Authored by Lysandre on Nov 07, 2019; committed by Lysandre Debut on Nov 26, 2019
Parent: 9d5c4954
Changes: 1 changed file with 9 additions and 3 deletions

transformers/modeling_albert.py (+9, -3)
@@ -281,11 +281,17 @@ class AlbertTransformer(nn.Module):
         if self.output_hidden_states:
             all_hidden_states = (hidden_states,)

-        for layer_idx in range(self.config.num_hidden_layers):
-            group_idx = int(layer_idx / self.config.num_hidden_layers * self.config.num_hidden_groups)
+        for i in range(self.config.num_hidden_layers):
+            # Number of layers in a hidden group
+            layers_per_group = int(self.config.num_hidden_layers / self.config.num_hidden_groups)
+
+            # Index of the hidden group
+            group_idx = int(i / (self.config.num_hidden_layers / self.config.num_hidden_groups))

-            layer_group_output = self.albert_layer_groups[group_idx](hidden_states, attention_mask, head_mask[group_idx * layers_per_group:(group_idx + 1) * layers_per_group])
+            # Index of the layer inside the group
+            layer_idx = int(i - group_idx * layers_per_group)
+            layer_group_output = self.albert_layer_groups[group_idx](hidden_states, attention_mask, head_mask[group_idx * layers_per_group:(group_idx + 1) * layers_per_group])
             hidden_states = layer_group_output[0]

             if self.output_attentions:
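For context, here is a minimal standalone sketch (not part of the commit) of the group/layer index math added above. The config values 12 and 3 are assumed purely for illustration; standard ALBERT configs keep num_hidden_layers a multiple of num_hidden_groups.

    # Hypothetical config values, chosen only to illustrate the mapping.
    num_hidden_layers = 12
    num_hidden_groups = 3

    # Number of consecutive layers that share one AlbertLayerGroup's weights.
    layers_per_group = int(num_hidden_layers / num_hidden_groups)  # 4

    for i in range(num_hidden_layers):
        # Index of the hidden group (formulation introduced by this commit)
        group_idx = int(i / (num_hidden_layers / num_hidden_groups))
        # Index of the layer inside the group
        layer_idx = int(i - group_idx * layers_per_group)
        print(i, group_idx, layer_idx)
    # Layers 0-3 map to group 0, 4-7 to group 1, 8-11 to group 2,
    # with layer_idx cycling 0-3 inside each group.

On the Python 2 angle: under Python 2's integer division, the removed expression int(layer_idx / self.config.num_hidden_layers * self.config.num_hidden_groups) floors layer_idx / num_hidden_layers to 0 for every layer, so every layer would be routed to group 0. The new expression divides by the integer layers-per-group first and keeps working under both Python 2 and 3, which is presumably why the re-ordering ships alongside the Python 2 tests mentioned in the commit message.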