Unverified commit b57d87c2 authored by fzyzcjy, committed by GitHub

Fix shared experts fusion + weight requant (#7177)

parent 98538822
@@ -1960,7 +1960,8 @@ class DeepseekV2ForCausalLM(nn.Module):
             )
             if layer_id in moe_layers:
-                shared_experts = layer.mlp.shared_experts
-                for module in [
-                    shared_experts.gate_up_proj,
-                    shared_experts.down_proj,
+                shared_experts = getattr(layer.mlp, "shared_experts", None)
+                if shared_experts is not None:
+                    for module in [
+                        shared_experts.gate_up_proj,
+                        shared_experts.down_proj,
...
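For context, a minimal sketch of what the guarded lookup buys (this is not the sglang implementation; the class and function names below are hypothetical): when shared-experts fusion folds the shared experts into the routed experts, `layer.mlp` no longer exposes a `shared_experts` submodule, so the old direct attribute access would raise AttributeError during weight requantization, while `getattr(..., None)` plus the `None` check skips that step cleanly.

from torch import nn


class SharedExperts(nn.Module):
    # Stand-in for the shared-experts block with the two projections the diff touches.
    def __init__(self, hidden: int = 8):
        super().__init__()
        self.gate_up_proj = nn.Linear(hidden, 2 * hidden)
        self.down_proj = nn.Linear(hidden, hidden)


class UnfusedMoEMLP(nn.Module):
    # MoE MLP without fusion: the shared experts remain a separate submodule.
    def __init__(self, hidden: int = 8):
        super().__init__()
        self.shared_experts = SharedExperts(hidden)


class FusedMoEMLP(nn.Module):
    # Hypothetical MoE MLP with shared-experts fusion enabled: no
    # `shared_experts` attribute exists, which is the case the fix handles.
    def __init__(self, hidden: int = 8):
        super().__init__()
        self.experts = nn.Linear(hidden, hidden)


def requant_shared_experts(mlp: nn.Module) -> None:
    # Old behavior: mlp.shared_experts raised AttributeError when fusion is on.
    # New behavior: look the attribute up defensively and skip if it is absent.
    shared_experts = getattr(mlp, "shared_experts", None)
    if shared_experts is not None:
        for module in [shared_experts.gate_up_proj, shared_experts.down_proj]:
            # Placeholder for the real weight requantization step.
            module.weight.data = module.weight.data.clone()


requant_shared_experts(UnfusedMoEMLP())  # requantizes both projections
requant_shared_experts(FusedMoEMLP())    # no-op instead of crashing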