Kim, Jin (Jay@SKT) authored · commit 33ca6150

* Add sigmoid GLU
  Signed-off-by: Kim, Jin <jinn.kim@sk.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks
  for more information, see https://pre-commit.ci
  Signed-off-by: Kim, Jin <jinn.kim@sk.com>
* Add test for GLU op
  Signed-off-by: Tim Moon <tmoon@nvidia.com>
* [pre-commit.ci] auto fixes from pre-commit.com hooks
  for more information, see https://pre-commit.ci
* Fix incorrect reshape
  Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
* Apply suggestion from @timmoon10
  Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
* Add omitted tests for GLU op
  Signed-off-by: Kim, Jin <jinn.kim@sk.com>
* Add GLU activation type support in JAX extension
  Signed-off-by: Kim, Jin <jinn.kim@sk.com>
* [PyTorch] Add Sigmoid activation for GLU support in numerics test (#2656)
  Signed-off-by: Kim, Jin <jinn.kim@sk.com>

---------

Signed-off-by: Kim, Jin <jinn.kim@sk.com>
Signed-off-by: Tim Moon <tmoon@nvidia.com>
Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tim Moon <tmoon@nvidia.com>
Co-authored-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
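For reference, the gating that a sigmoid GLU applies can be sketched in plain Python. This is a minimal sketch of the standard formulation (GLU(a, b) = a * sigmoid(b), as in PyTorch's `torch.nn.functional.glu`), not the fused kernels added by this commit; the function name `sigmoid_glu` is illustrative only.

```python
import math

def sigmoid(v: float) -> float:
    """Logistic sigmoid, 1 / (1 + e^-v)."""
    return 1.0 / (1.0 + math.exp(-v))

def sigmoid_glu(x: list) -> list:
    """Gated linear unit: split the feature vector in half and
    gate the first half with the sigmoid of the second half
    (GLU(a, b) = a * sigmoid(b))."""
    assert len(x) % 2 == 0, "feature dimension must be even"
    half = len(x) // 2
    a, b = x[:half], x[half:]
    return [ai * sigmoid(bi) for ai, bi in zip(a, b)]

# sigmoid(0) = 0.5, so each output element is half the gated input
out = sigmoid_glu([1.0, 2.0, 0.0, 0.0])  # → [0.5, 1.0]
```

The output has half the feature dimension of the input, which is why the incorrect reshape fixed above matters: the split and the reshape must agree on which axis is halved.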