chenpangpang / transformers · Commit 1ae132a0 (unverified)
Authored Jun 23, 2020 by Patrick von Platen; committed via GitHub on Jun 23, 2020
[Reformer] Axial Pos Emb Improve mem usage reformer (#5209)

* improve mem handling
* improve mem for pos ax encodings
Parent: 51441040
Changes: 1 changed file with 8 additions and 3 deletions
src/transformers/modeling_reformer.py (+8 −3)
@@ -154,9 +154,14 @@ class AxialPositionEmbeddings(nn.Module):
                 self.axial_pos_shape,
                 sequence_length,
                 self.least_common_mult_chunk_length,
             )
-            # reshape axial encodings and use only until sequence_length
-            position_encodings = torch.cat(broadcasted_weights, dim=-1)
-            position_encodings = position_encodings.view(batch_size, -1, position_encodings.shape[-1])[
+            # compute how many columns are needed
+            required_pos_encodings_columns = -(-sequence_length // self.axial_pos_shape[1])
+
+            # cut to columns that are needed
+            position_encodings = torch.cat(
+                [weight[:, :required_pos_encodings_columns] for weight in broadcasted_weights], dim=-1
+            )
+            position_encodings = torch.reshape(position_encodings, (batch_size, -1, position_encodings.shape[-1]))[
                 :, :sequence_length
             ]
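For readers skimming the change: the old path concatenated position encodings for every position in the full axial grid and only then sliced down to sequence_length; the new path uses ceiling division (-(-a // b) equals ceil(a / b) for positive integers) to cut each axial weight factor down to the columns that are actually needed before concatenating, so the large intermediate tensor is never materialized. Below is a minimal standalone sketch of that effect. The shapes, the weights list, and the batch size are illustrative assumptions, not values from the repository; only the slicing order and the ceiling-division trick come from the diff above.

import torch

# Minimal sketch of the memory optimization in this commit, with
# illustrative shapes (not taken from any real Reformer config):
# axial_pos_shape = (32, 64) covers up to 32 * 64 = 2048 positions,
# factored into two weights with hidden sizes 24 and 40.
batch_size = 2
axial_pos_shape = (32, 64)
weights = [torch.randn(axial_pos_shape + (24,)), torch.randn(axial_pos_shape + (40,))]
broadcasted_weights = [
    weight.expand((batch_size,) + axial_pos_shape + weight.shape[-1:]) for weight in weights
]
sequence_length = 200  # at inference time, usually far less than 2048

# Old path: concatenate encodings for all 2048 positions, then slice.
full = torch.cat(broadcasted_weights, dim=-1)
old = full.view(batch_size, -1, full.shape[-1])[:, :sequence_length]

# New path: ceiling-divide to find how many rows of the first axial
# dimension are needed, slice each factor first, then concatenate.
# Here -(-200 // 64) == 4, so the intermediate tensor covers only
# 4 * 64 = 256 positions instead of 2048.
required_pos_encodings_columns = -(-sequence_length // axial_pos_shape[1])
new = torch.cat(
    [weight[:, :required_pos_encodings_columns] for weight in broadcasted_weights], dim=-1
)
new = torch.reshape(new, (batch_size, -1, new.shape[-1]))[:, :sequence_length]

assert torch.equal(old, new)  # identical encodings, ~8x smaller intermediate

With these toy shapes the intermediate tensor shrinks from 2048 to 256 positions; with realistic axial shapes (e.g. a grid covering 2^16 positions) the saving is correspondingly larger.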