OpenDAS / Fairseq · Commit 7358296b

fixed output_proj's input_dim in attention (#226)

Authored Jul 25, 2018 by higgsfield; committed by Myle Ott on Jul 25, 2018.
Parent: 28adb200
Showing 1 changed file with 1 addition and 1 deletion.

fairseq/models/lstm.py  +1 -1
--- a/fairseq/models/lstm.py
+++ b/fairseq/models/lstm.py
@@ -217,7 +217,7 @@ class AttentionLayer(nn.Module):
         super().__init__()
 
         self.input_proj = Linear(input_embed_dim, output_embed_dim, bias=False)
-        self.output_proj = Linear(2*output_embed_dim, output_embed_dim, bias=False)
+        self.output_proj = Linear(input_embed_dim + output_embed_dim, output_embed_dim, bias=False)
 
     def forward(self, input, source_hids, encoder_padding_mask):
         # input: bsz x input_embed_dim
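The fix matters whenever input_embed_dim != output_embed_dim: forward() concatenates the attention-weighted context (size output_embed_dim) with the raw decoder input (size input_embed_dim) before applying output_proj, so the projection's fan-in is input_embed_dim + output_embed_dim. The old 2*output_embed_dim only happened to work when the two dimensions were equal. Below is a minimal, self-contained sketch of the layer around this hunk; the attention-score and masking details are assumptions standing in for the parts of forward() that the diff elides, and plain nn.Linear stands in for fairseq's bias-free Linear helper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionLayer(nn.Module):
    def __init__(self, input_embed_dim, output_embed_dim):
        super().__init__()
        # project the decoder state into the encoder's embedding space
        self.input_proj = nn.Linear(input_embed_dim, output_embed_dim, bias=False)
        # forward() concatenates (attended context, raw input), so the fan-in
        # is input_embed_dim + output_embed_dim -- not 2*output_embed_dim
        self.output_proj = nn.Linear(input_embed_dim + output_embed_dim,
                                     output_embed_dim, bias=False)

    def forward(self, input, source_hids, encoder_padding_mask=None):
        # input: bsz x input_embed_dim
        # source_hids: srclen x bsz x output_embed_dim
        x = self.input_proj(input)                                # bsz x output_embed_dim
        # dot-product scores against each source state (assumed detail)
        attn_scores = (source_hids * x.unsqueeze(0)).sum(dim=2)   # srclen x bsz
        if encoder_padding_mask is not None:
            attn_scores = attn_scores.masked_fill(encoder_padding_mask, float('-inf'))
        attn_scores = F.softmax(attn_scores, dim=0)
        # attention-weighted context: bsz x output_embed_dim
        x = (attn_scores.unsqueeze(2) * source_hids).sum(dim=0)
        # concatenate context with the original input, then project back down
        x = torch.tanh(self.output_proj(torch.cat((x, input), dim=1)))
        return x, attn_scores

# quick shape check with unequal dims, the case the old code broke
layer = AttentionLayer(input_embed_dim=512, output_embed_dim=256)
out, scores = layer(torch.randn(4, 512), torch.randn(10, 4, 256))
print(out.shape)  # torch.Size([4, 256])

With the pre-fix 2*output_embed_dim, this shape check would raise a size-mismatch error: output_proj would expect 512 input features but receive the 768-dimensional concatenation.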