Unverified commit 9f7b0255, authored by Kirthi Shankar Sivamani, committed by GitHub

Fixes #26 (#29)



Addresses the LayerNormMLP bias issue (#26) by passing the module's `use_bias` setting through instead of a hardcoded `False`.
Signed-off-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
parent acf98b5c
@@ -2372,7 +2372,7 @@ class LayerNormMLP(TransformerEngineBaseModule):
             self.weight2_fp8 if self.fp8 else None,
             self.weight2_t_fp8 if self.fp8 else None,
             self.fc2_bias,
-            False, # use_bias set to False for RPL
+            self.use_bias,
             self.eps,
             is_first_microbatch,
             self.fp8,
......
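To illustrate the class of bug this one-line change fixes: when a layer hardcodes `use_bias=False` at its second GEMM call, the stored `fc2_bias` is silently never applied, even if the user constructed the module with biases enabled. The toy sketch below (hypothetical names, not TransformerEngine code) contrasts the buggy hardcoded call with the fixed pass-through of the module's own flag.

```python
def gemm(x, weight, bias, use_bias):
    # Minimal stand-in for a fused GEMM: y = x @ W, plus bias if enabled.
    # `weight` is stored as a list of output columns.
    y = [sum(xi * wi for xi, wi in zip(x, col)) for col in weight]
    if use_bias:
        y = [yi + bi for yi, bi in zip(y, bias)]
    return y

class ToyLayerNormMLP:
    # Hypothetical illustration of the pattern fixed in this commit.
    def __init__(self, weight, fc2_bias, use_bias=True):
        self.weight = weight
        self.fc2_bias = fc2_bias
        self.use_bias = use_bias

    def forward_buggy(self, x):
        # Before the fix: flag hardcoded to False, fc2_bias never applied.
        return gemm(x, self.weight, self.fc2_bias, False)

    def forward_fixed(self, x):
        # After the fix: honor the module's own use_bias setting.
        return gemm(x, self.weight, self.fc2_bias, self.use_bias)
```

With an identity weight and `use_bias=True`, `forward_buggy` returns the input unchanged while `forward_fixed` correctly adds `fc2_bias`, which is why the hardcoded `False` was replaced with `self.use_bias`.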