OpenDAS / FastMoE
Commit fb3e3c29, authored Feb 07, 2021 by Rick Ho
update the mem-transformer example
parent b3380ec2
Showing 1 changed file with 5 additions and 2 deletions:

examples/transformer-xl/mem_transformer.py (+5 −2)
...
@@ -825,10 +825,13 @@ class CustomizedMoEPositionwiseFF(FMoETransformerMLP):
         super().__init__(num_expert=8, d_model=d_model, d_hidden=d_inner,
                 pre_lnorm=pre_lnorm, activation=activation)
         self.dropout = nn.Dropout(dropout)
+        self.bias = nn.Parameter(torch.zeros(
+            d_model, dtype=torch.float32))

     def forward(self, x):
-        x, bias = super().forward(x)
-        return x + bias
+        x = super().forward(x)
+        return x + self.bias


 class DecoderLayer(nn.Module):
...
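The change above moves the output bias out of the parent's return value and into a parameter owned by the subclass: before, `FMoETransformerMLP.forward` returned an `(output, bias)` pair; after, it returns only the mixed expert output, and `CustomizedMoEPositionwiseFF` keeps its own `nn.Parameter` bias and adds it itself. Below is a minimal dependency-free sketch of that pattern. `MoEFFBase` is a hypothetical stand-in for `FMoETransformerMLP` (its forward is an identity instead of real expert routing), and NumPy arrays stand in for torch tensors; only the bias-ownership structure mirrors the diff.

```python
import numpy as np

class MoEFFBase:
    """Stand-in for FMoETransformerMLP: returns the mixed expert output only."""
    def __init__(self, d_model):
        self.d_model = d_model

    def forward(self, x):
        # Real FastMoE would route tokens to experts here; identity for the sketch.
        return x

class CustomizedMoEPositionwiseFF(MoEFFBase):
    """After the commit: the output bias is a parameter owned by this module."""
    def __init__(self, d_model):
        super().__init__(d_model)
        # One bias shared across all experts, zero-initialized
        # (torch.zeros(d_model, dtype=torch.float32) in the diff).
        self.bias = np.zeros(d_model, dtype=np.float32)

    def forward(self, x):
        # Old code unpacked (x, bias) from the parent; now the parent returns
        # only the expert output and the bias is applied here.
        x = super().forward(x)
        return x + self.bias
```

Owning the bias in the subclass keeps the parent's interface simple (one tensor in, one tensor out) and makes the bias an ordinary module parameter that optimizers and state dicts pick up automatically.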