[PyTorch] TransformerLayer: add support for Falcon architecture (#513)
* [PyTorch] TransformerLayer: add parallel_attention_mlp to support Falcon models
  Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>
* [PyTorch] add test for parallel_attention_mlp to test_numerics
  Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>
* [PyTorch] TorchGPT: fix dropout for parallel_attention_mlp. Now uses nn.functional.dropout because, depending on the path, there are one or two dropouts.
  Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>
* Apply suggestions from code review
  Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
* [PyTorch] test_gpt_accuracy: fix spelling in construction of TorchGPT
  Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>

---------

Signed-off-by: Markus Schnoes <markus.schnoes@gmx.de>
Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
Co-authored-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
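For context, the Falcon-style parallel layout feeds the attention and MLP branches from the same normalized input and sums their outputs, so dropout is applied once to the combined branch output rather than once per branch as in the sequential layout. A minimal sketch of that idea in plain PyTorch (class and parameter names here are illustrative, not Transformer Engine's actual API):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ParallelAttentionMLPBlock(nn.Module):
    """Falcon-style block: attention and MLP run in parallel on the same
    LayerNorm output; their sum gets a single dropout before the residual add.
    (Hypothetical sketch, not the Transformer Engine implementation.)"""

    def __init__(self, hidden_size, num_heads, ffn_size, dropout_p=0.1):
        super().__init__()
        self.ln = nn.LayerNorm(hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, ffn_size),
            nn.GELU(),
            nn.Linear(ffn_size, hidden_size),
        )
        self.dropout_p = dropout_p

    def forward(self, x):
        h = self.ln(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        mlp_out = self.mlp(h)
        # One functional dropout on the summed branches: the parallel path
        # needs a single dropout, whereas the sequential path applies one
        # after attention and another after the MLP.
        out = F.dropout(attn_out + mlp_out, p=self.dropout_p, training=self.training)
        return x + out
```

Using nn.functional.dropout here (instead of an nn.Dropout module per branch) keeps a single dropout site whose placement can differ between the parallel and sequential code paths.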