sglang · Commit 4455b26e (Unverified)
Authored Mar 10, 2025 by DavidChan; committed by GitHub, Mar 10, 2025

[Bug fixed] fixed the crash when enable the dp-attention on the single card (#3958)
parent c553e160

Showing 1 changed file with 3 additions and 3 deletions

python/sglang/srt/models/deepseek_v2.py (+3, -3)
python/sglang/srt/models/deepseek_v2.py @ 4455b26e

@@ -848,12 +848,12 @@ class DeepseekV2AttentionMLA(nn.Module):
 def all_gather(
     input_tensor: torch.Tensor, forward_batch: ForwardBatch, rank, world_size, group
 ):
-    if world_size == 1:
-        return input_tensor
-
     all_lens = forward_batch.global_num_tokens_cpu
     max_len = max(forward_batch.global_num_tokens_cpu)

+    if world_size == 1:
+        return input_tensor, 0, all_lens[0]
+
     padded_tensor = torch.nn.functional.pad(
         input_tensor, (0, 0, 0, max_len - input_tensor.shape[0])
     )
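The shape of the new early return explains the crash: with dp-attention enabled on a single card, the old code returned a bare tensor, while the patched single-card branch returns the same three values as the rest of the function presumably does (gathered tensor, start offset, local length), so callers that unpack three values no longer fail. Below is a minimal, self-contained sketch of the patched logic, not sglang's actual implementation: the ForwardBatch stub, the default for group, and the elided multi-card collective are assumptions for illustration only.

# Sketch of the patched single-card path (assumed simplification of the diff above).
from dataclasses import dataclass
from typing import List

import torch


@dataclass
class ForwardBatch:
    # Stub: only the field read by the helper in this diff.
    global_num_tokens_cpu: List[int]


def all_gather(
    input_tensor: torch.Tensor,
    forward_batch: ForwardBatch,
    rank,
    world_size,
    group=None,
):
    all_lens = forward_batch.global_num_tokens_cpu
    max_len = max(forward_batch.global_num_tokens_cpu)

    # Patched behavior: compute all_lens first, then return a
    # (tensor, start_offset, local_len) tuple on a single card so the
    # caller's three-way unpacking works in both branches.
    if world_size == 1:
        return input_tensor, 0, all_lens[0]

    # Multi-card path, unchanged context from the diff: pad each rank's
    # tensor to max_len before gathering. The collective itself is elided
    # in this sketch.
    padded_tensor = torch.nn.functional.pad(
        input_tensor, (0, 0, 0, max_len - input_tensor.shape[0])
    )
    raise NotImplementedError("collective gather elided in this sketch")


# Single-card usage (dp-attention with world_size == 1): the caller can
# unpack three values without crashing.
hidden = torch.randn(5, 16)
batch = ForwardBatch(global_num_tokens_cpu=[5])
gathered, start, length = all_gather(hidden, batch, rank=0, world_size=1)
assert gathered.shape == (5, 16) and start == 0 and length == 5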