chenpangpang / transformers · Commits · e535c389
".github/vscode:/vscode.git/clone" did not exist on "50415b84d62cb2bcd41f6a20eed5321f3270f270"
Unverified commit e535c389, authored Mar 02, 2022 by Ross Johnstone, committed by GitHub on Mar 02, 2022

Fix tiny typo (#15884)
parent 2eb7bb15
Showing 1 changed file with 1 addition and 1 deletion:

examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py (+1, -1)
examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py
@@ -56,7 +56,7 @@ class BertModelWithPabee(BertModel):
     the self-attention layers, following the architecture described in `Attention is all you need`_ by Ashish Vaswani,
     Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin.
-    To behave as an decoder the model needs to be initialized with the
+    To behave as a decoder the model needs to be initialized with the
     :obj:`is_decoder` argument of the configuration set to :obj:`True`; an
     :obj:`encoder_hidden_states` is expected as an input to the forward pass.
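The docstring touched by this commit describes how the model is made to act as a decoder: the configuration's :obj:`is_decoder` argument must be set to :obj:`True`, and :obj:`encoder_hidden_states` is passed to the forward call. A minimal sketch of that usage, written against the stock transformers BertConfig/BertModel rather than the research project's BertModelWithPabee, and adding add_cross_attention=True (an assumption; recent transformers releases require it for cross-attention, though the docstring does not mention it), might look like this:

import torch
from transformers import BertConfig, BertModel

# Sketch (assumed usage): configure BERT to behave as a decoder, as the docstring describes.
config = BertConfig(is_decoder=True, add_cross_attention=True)
decoder = BertModel(config)  # randomly initialized, just for illustration

# Dummy encoder output that the decoder cross-attends to.
batch_size, src_len, tgt_len = 2, 7, 5
encoder_hidden_states = torch.randn(batch_size, src_len, config.hidden_size)
input_ids = torch.randint(0, config.vocab_size, (batch_size, tgt_len))

outputs = decoder(input_ids=input_ids, encoder_hidden_states=encoder_hidden_states)
print(outputs.last_hidden_state.shape)  # torch.Size([2, 5, 768])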