"docs/getting_started/getting_started_utils_jax.py" did not exist on "c90a9214091badd1234b2d9ca851bd97f8edb0f6"
feat(pytorch): Allow TransformerLayer and MultiheadAttention to accept...
feat(pytorch): Allow TransformerLayer and MultiheadAttention to accept sequence length parameters (#1066)

* Added ability to pass sequence lengths to the transformer and MHA layers
* Documentation for new parameters
* Add tests for THD layout, assert for THD layout with KV-Cache
* Fixed tests
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Move THD logic into shape calculation, add missing Optional in params
* Skip the THD test on GPUs older than Ampere

Signed-off-by: Lukasz Pierscieniewski <lukaszp@nvidia.com>
Signed-off-by: Przemek Tredak <ptredak@nvidia.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Kirthi Shankar Sivamani <ksivamani@nvidia.com>
Co-authored-by: Przemek Tredak <ptredak@nvidia.com>
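The sequence-length parameters exist to support the THD (packed variable-length) layout, in which sequences of different lengths are concatenated back to back and attention kernels locate each sequence via cumulative sequence lengths. A minimal sketch of that bookkeeping in plain Python follows; the helper name is hypothetical and the exact parameter names accepted by TransformerLayer and MultiheadAttention may differ from this illustration.

```python
def cu_seqlens_from_lengths(seq_lengths):
    """Return cumulative sequence lengths [0, l0, l0+l1, ...] for a packed
    (THD-style) batch. Offset i marks where sequence i starts in the packed
    token dimension; the final entry is the total token count."""
    cu = [0]
    for length in seq_lengths:
        cu.append(cu[-1] + length)
    return cu

# Three packed sequences of lengths 5, 3, and 8:
print(cu_seqlens_from_lengths([5, 3, 8]))  # → [0, 5, 8, 16]
```

In practice such offsets would be built as an int32 tensor on the GPU and handed to the layer's forward call alongside the packed input, rather than computed in Python per batch.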