[PyTorch] fix fuse_wgrad_accumulation in LayerNormMLP backward (#1618)
* [PyTorch] fix general_gemm argument out_dtype in LayerNormMLP backward

Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
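The commit title points at fused wgrad accumulation: when weight gradients are accumulated directly into a pre-allocated master-gradient buffer, the wgrad GEMM's `out_dtype` must match that buffer's dtype (typically float32), not the low-precision activation dtype. A minimal NumPy sketch of the idea, assuming a hypothetical `wgrad_gemm` helper with illustrative shapes (not the actual Transformer Engine API):

```python
import numpy as np

def wgrad_gemm(dy, x, main_grad=None, accumulate=False):
    """Weight-gradient GEMM: dW = dy^T @ x.

    With fused wgrad accumulation, the result is added directly into
    the pre-allocated `main_grad` buffer, so the GEMM output dtype
    must match main_grad.dtype (e.g. float32), not the activation
    dtype (e.g. float16).
    """
    # Compute in low precision, as the fused GEMM kernel would.
    wgrad = dy.T.astype(np.float16) @ x.astype(np.float16)
    if accumulate:
        # Cast to the accumulator's dtype; this mirrors passing the
        # correct out_dtype to the GEMM instead of the input dtype.
        main_grad += wgrad.astype(main_grad.dtype)
        return main_grad
    return wgrad

x = np.ones((4, 3))                              # activations
dy = np.full((4, 2), 0.5)                        # output gradients
main_grad = np.zeros((2, 3), dtype=np.float32)   # fp32 master gradient
wgrad_gemm(dy, x, main_grad, accumulate=True)    # first micro-batch
wgrad_gemm(dy, x, main_grad, accumulate=True)    # second micro-batch
print(main_grad[0, 0])  # -> 4.0 after two accumulations of 2.0
```

Accumulating in float32 avoids the precision loss (and dtype mismatch) of writing the GEMM result out in float16 across many micro-batches.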