============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-8.3.3, pluggy-1.5.0
rootdir: /workspace
configfile: pytest.ini
plugins: mock-3.14.0
collected 562 items / 3 errors

==================================== ERRORS ====================================
_______ ERROR collecting tests/unit_tests/data/test_preprocess_mmdata.py _______
tests/unit_tests/data/test_preprocess_mmdata.py:14: in <module>
    from tools.preprocess_mmdata import Encoder
tools/preprocess_mmdata.py:12: in <module>
    from torchvision.transforms import ToTensor
/usr/local/lib/python3.10/site-packages/torchvision/__init__.py:10: in <module>
    from torchvision import _meta_registrations, datasets, io, models, ops, transforms, utils  # usort:skip
/usr/local/lib/python3.10/site-packages/torchvision/_meta_registrations.py:164: in <module>
    def meta_nms(dets, scores, iou_threshold):
/usr/local/lib/python3.10/site-packages/torch/library.py:654: in register
    use_lib._register_fake(op_name, func, _stacklevel=stacklevel + 1)
/usr/local/lib/python3.10/site-packages/torch/library.py:154: in _register_fake
    handle = entry.abstract_impl.register(func_to_register, source)
/usr/local/lib/python3.10/site-packages/torch/_library/abstract_impl.py:31: in register
    if torch._C._dispatch_has_kernel_for_dispatch_key(self.qualname, "Meta"):
E   RuntimeError: operator torchvision::nms does not exist
__ ERROR collecting tests/unit_tests/dist_checkpointing/models/test_mamba.py ___
ImportError while importing test module '/workspace/tests/unit_tests/dist_checkpointing/models/test_mamba.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
megatron/core/ssm/mamba_mixer.py:41: in <module>
    from mamba_ssm.ops.triton.layernorm_gated import RMSNorm as RMSNormGated
E   ModuleNotFoundError: No module named 'mamba_ssm'

During handling of the above exception, another exception occurred:

/usr/local/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/unit_tests/dist_checkpointing/models/test_mamba.py:17: in <module>
    from megatron.core.ssm.mamba_mixer import MambaMixer, MambaMixerSubmodules
megatron/core/ssm/mamba_mixer.py:47: in <module>
    raise ImportError("mamba-ssm is required by the Mamba model but cannot be imported")
E   ImportError: mamba-ssm is required by the Mamba model but cannot be imported
_________ ERROR collecting tests/unit_tests/models/test_mamba_model.py _________
ImportError while importing test module '/workspace/tests/unit_tests/models/test_mamba_model.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
megatron/core/ssm/mamba_mixer.py:41: in <module>
    from mamba_ssm.ops.triton.layernorm_gated import RMSNorm as RMSNormGated
E   ModuleNotFoundError: No module named 'mamba_ssm'

During handling of the above exception, another exception occurred:

/usr/local/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/unit_tests/models/test_mamba_model.py:7: in <module>
    from megatron.core.models.mamba.mamba_layer_specs import mamba_stack_spec
megatron/core/models/mamba/mamba_layer_specs.py:11: in <module>
    from megatron.core.ssm.mamba_mixer import MambaMixer, MambaMixerSubmodules
megatron/core/ssm/mamba_mixer.py:47: in <module>
    raise ImportError("mamba-ssm is required by the Mamba model but cannot be imported")
E   ImportError: mamba-ssm is required by the Mamba model but cannot be imported
=============================== warnings summary ===============================
megatron/core/tensor_parallel/layers.py:280
  /workspace/megatron/core/tensor_parallel/layers.py:280: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
    def forward(ctx, input, weight, bias, allreduce_dgrad):

megatron/core/tensor_parallel/layers.py:290
  /workspace/megatron/core/tensor_parallel/layers.py:290: FutureWarning: `torch.cuda.amp.custom_bwd(args...)` is deprecated. Please use `torch.amp.custom_bwd(args..., device_type='cuda')` instead.
    def backward(ctx, grad_output):

megatron/core/tensor_parallel/layers.py:381
  /workspace/megatron/core/tensor_parallel/layers.py:381: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
    def forward(

megatron/core/tensor_parallel/layers.py:420
  /workspace/megatron/core/tensor_parallel/layers.py:420: FutureWarning: `torch.cuda.amp.custom_bwd(args...)` is deprecated. Please use `torch.amp.custom_bwd(args..., device_type='cuda')` instead.
    def backward(ctx, grad_output):

megatron/core/transformer/attention.py:29
  /workspace/megatron/core/transformer/attention.py:29: DeprecationWarning: The 'megatron.core.transformer.custom_layers.transformer_engine' module is deprecated and will be removed in 0.10.0. Please use 'megatron.core.extensions.transformer_engine' instead.
    from megatron.core.transformer.custom_layers.transformer_engine import SplitAlongDim

megatron/core/dist_checkpointing/strategies/torch.py:17
  /workspace/megatron/core/dist_checkpointing/strategies/torch.py:17: DeprecationWarning: `torch.distributed._sharded_tensor` will be deprecated, use `torch.distributed._shard.sharded_tensor` instead
    from torch.distributed._sharded_tensor import ShardedTensor as TorchShardedTensor

tests/unit_tests/dist_checkpointing/test_async_save.py:74
  /workspace/tests/unit_tests/dist_checkpointing/test_async_save.py:74: PytestUnknownMarkWarning: Unknown pytest.mark.flaky_in_dev - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.flaky_in_dev

tests/unit_tests/dist_checkpointing/test_fp8.py:55
  /workspace/tests/unit_tests/dist_checkpointing/test_fp8.py:55: PytestUnknownMarkWarning: Unknown pytest.mark.flaky_in_dev - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.flaky_in_dev

tests/unit_tests/test_utilities.py:11
  /workspace/tests/unit_tests/test_utilities.py:11: PytestCollectionWarning: cannot collect test class 'TestModel' because it has a __init__ constructor (from: tests/unit_tests/distributed/test_param_and_grad_buffer.py)
    class TestModel(torch.nn.Module):

tests/unit_tests/test_utilities.py:11
  /workspace/tests/unit_tests/test_utilities.py:11: PytestCollectionWarning: cannot collect test class 'TestModel' because it has a __init__ constructor (from: tests/unit_tests/test_utilities.py)
    class TestModel(torch.nn.Module):

tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:20
  /workspace/tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:20: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.timeout(120)

tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:36
  /workspace/tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:36: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.timeout(120)

tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:52
  /workspace/tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:52: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.timeout(120)

tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:71
  /workspace/tests/unit_tests/transformer/moe/test_a2a_token_dispatcher.py:71: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.timeout(120)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
ERROR tests/unit_tests/data/test_preprocess_mmdata.py - RuntimeError: operato...
ERROR tests/unit_tests/dist_checkpointing/models/test_mamba.py
ERROR tests/unit_tests/models/test_mamba_model.py
!!!!!!!!!!!!!!!!!!! Interrupted: 3 errors during collection !!!!!!!!!!!!!!!!!!!!
======================== 14 warnings, 3 errors in 2.70s ========================
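The `PytestUnknownMarkWarning` entries for `flaky_in_dev` and `timeout` can be silenced by registering those marks in the `pytest.ini` the session already uses (the session header shows `configfile: pytest.ini`). A sketch of the relevant section; the mark descriptions here are assumptions, not taken from the repository:

```ini
[pytest]
markers =
    flaky_in_dev: known-flaky test, tracked for stabilization
    timeout(seconds): per-test time limit
```

Alternatively, the `timeout` warnings disappear on their own if the pytest-timeout plugin is installed, since plugins register the marks they provide.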
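The first collection error (`RuntimeError: operator torchvision::nms does not exist`) usually means the installed `torch` and `torchvision` wheels come from mismatched releases, so torchvision's C++ operators were never registered. A minimal stdlib-only sketch for inspecting the installed pair; the version pairing named in the comment is an assumption based on the usual joint release cadence, not something stated in the log:

```python
from importlib import metadata

# torch and torchvision must come from matching releases (for example,
# torch 2.4.x is built against torchvision 0.19.x); mismatched wheels
# surface as "operator torchvision::nms does not exist" at import time.
for pkg in ("torch", "torchvision"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed")
```

Reinstalling both packages in one command (e.g. `pip install --force-reinstall torch torchvision`) lets the resolver pick a matching pair.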
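The two Mamba collection errors come from an unconditional `from mamba_ssm...` import at module scope; when the optional dependency is absent, the whole session is interrupted. Guarding the import with `pytest.importorskip` turns the failure into a skip. A hedged sketch (`test_requires_mamba` is a hypothetical test name, not one from the repository):

```python
import pytest

def test_requires_mamba():
    # pytest.importorskip raises a Skipped outcome when the module is
    # absent, so this test is reported as skipped instead of the whole
    # module failing collection with ImportError.
    mamba_ssm = pytest.importorskip("mamba_ssm")
    assert mamba_ssm is not None
```

Calling `pytest.importorskip("mamba_ssm")` at the top of the test module (module level) skips every test in the file in one place.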