- 08 Nov, 2023 1 commit
Alp Dener authored
* Fixed minor bug with DistributedConfigsHelper prematurely crashing the test for insufficient GPUs before the @pytest.skip condition.
  Signed-off-by: Alp Dener <adener@nvidia.com>
* Update tests/jax/distributed_configs_helper.py
  Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
* Debug PyTest errors when running on single-GPU system
  Signed-off-by: Tim Moon <tmoon@nvidia.com>
---------
Signed-off-by: Alp Dener <adener@nvidia.com>
Signed-off-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
Signed-off-by: Tim Moon <tmoon@nvidia.com>
Co-authored-by: Tim Moon <4406448+timmoon10@users.noreply.github.com>
Co-authored-by: Tim Moon <tmoon@nvidia.com>
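The fix above is about evaluating the GPU-count skip condition before the distributed config helper does any work that assumes multiple devices. A minimal sketch of that ordering, with hypothetical names (the actual helper lives in tests/jax/distributed_configs_helper.py and is not reproduced here):

```python
# Hedged sketch, not the repo's actual code: evaluate the skip decision
# up front, so a config helper that assumes multiple GPUs is never
# constructed on a single-GPU system.
def should_skip(num_visible_gpus: int, num_required_gpus: int) -> tuple[bool, str]:
    """Return (skip, reason) for a distributed test's GPU requirement.

    Calling this before building any distributed configs avoids the crash
    described in the commit, where the helper ran before pytest's skip
    condition could fire.
    """
    if num_visible_gpus < num_required_gpus:
        return True, f"needs {num_required_gpus} GPUs, found {num_visible_gpus}"
    return False, ""
```

In a pytest suite, the boolean from such a predicate would typically be attached to the test via `@pytest.mark.skipif(condition, reason=...)`, keeping device enumeration out of module import time.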
- 03 Nov, 2023 1 commit
Alp Dener authored
[JAX] Regression tests for custom ops sharding with both xmap and custom_partitioning.
Coverage:
- layernorm: fwd/grad, zero_centered_gamma, DP, TP_COL, DP_TP_COL
- rmsnorm: fwd/grad, DP, TP_COL, DP_TP_COL
- softmax: fwd/grad, SCALED, SCALED_MASKED, SCALED_UPPER_TRIANG_MASKED, DP, TP_COL, TP_ROW, DP_TP_COL, DP_TP_ROW
- self_fused_attn: fwd/grad, NO_BIAS, PRE_SCALE_BIAS, POST_SCALE_BIAS, NO_MASK, CAUSAL_MASK, PADDING_MASK, DP, TP_COL, DP_TP_COL
- cross_fused_attn: fwd/grad, NO_BIAS, NO_MASK, PADDING_MASK, DP, TP_COL, DP_TP_COL
Signed-off-by: Alp Dener <adener@nvidia.com>