chenpangpang / transformers · Commits

Commit ecabbf6d (unverified)
Authored Dec 10, 2019 by Thomas Wolf; committed via GitHub on Dec 10, 2019

Merge pull request #2107 from huggingface/encoder-mask-shape

create encoder attention mask from shape of hidden states

Parents: 1d189304, 3520be78
Showing 1 changed file with 6 additions and 4 deletions.

transformers/modeling_bert.py (+6, -4)
@@ -692,16 +692,18 @@ class BertModel(BertPreTrainedModel):
         # If a 2D or 3D attention mask is provided for the cross-attention
         # we need to make it broadcastable to [batch_size, num_heads, seq_length, seq_length]
-        if self.config.is_decoder:
+        if self.config.is_decoder and encoder_hidden_states is not None:
+            encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states.size()
+            encoder_hidden_shape = (encoder_batch_size, encoder_sequence_length)
             if encoder_attention_mask is None:
-                encoder_attention_mask = torch.ones(input_shape, device=device)
+                encoder_attention_mask = torch.ones(encoder_hidden_shape, device=device)

             if encoder_attention_mask.dim() == 3:
                 encoder_extended_attention_mask = encoder_attention_mask[:, None, :, :]
             elif encoder_attention_mask.dim() == 2:
                 encoder_extended_attention_mask = encoder_attention_mask[:, None, None, :]
             else:
-                raise ValueError("Wrong shape for input_ids (shape {}) or encoder_attention_mask (shape {})".format(input_shape,
+                raise ValueError("Wrong shape for encoder_hidden_shape (shape {}) or encoder_attention_mask (shape {})".format(encoder_hidden_shape,
                                                                                                                     encoder_attention_mask.shape))
             encoder_extended_attention_mask = encoder_extended_attention_mask.to(dtype=next(self.parameters()).dtype)  # fp16 compatibility
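Why the change matters: before this commit, when no encoder_attention_mask was passed, the default all-ones mask was built from input_shape, i.e. the decoder's own input size. When the encoder and decoder sequence lengths differ, that mask cannot broadcast against the cross-attention scores, which have the encoder's length in their last dimension. A minimal standalone sketch of the shape mismatch (plain PyTorch, not the transformers API; all sizes and tensor names here are illustrative):

import torch

# Illustrative shapes: the decoder attends over the encoder's outputs,
# whose sequence length generally differs from the decoder's.
batch_size, num_heads = 2, 12
decoder_len, encoder_len, hidden_size = 5, 9, 768

input_shape = (batch_size, decoder_len)  # shape of the decoder's input_ids
encoder_hidden_states = torch.randn(batch_size, encoder_len, hidden_size)

# Cross-attention scores: [batch_size, num_heads, decoder_len, encoder_len]
scores = torch.randn(batch_size, num_heads, decoder_len, encoder_len)

# Old default (pre-commit): mask taken from the decoder's input shape.
bad_mask = torch.ones(input_shape)  # (2, 5)

# New default (this commit): mask taken from the encoder hidden states' shape.
encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states.size()
good_mask = torch.ones((encoder_batch_size, encoder_sequence_length))  # (2, 9)

# Extend the 2D mask to [batch, 1, 1, seq] and apply it additively,
# in the style BERT uses ((1 - mask) * -10000 on the scores).
extended = good_mask[:, None, None, :]         # (2, 1, 1, 9)
masked = scores + (1.0 - extended) * -10000.0  # broadcasts: 9 matches encoder_len

# The old mask's last dimension (5) does not match encoder_len (9), so the
# same broadcast would raise a RuntimeError:
# scores + (1.0 - bad_mask[:, None, None, :]) * -10000.0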