zhaoyu6 / sglang

Commit cfe2edac (Unverified)
Authored Jun 28, 2025 by Sheng Qi; committed by GitHub Jun 27, 2025
[BUG] fix local_rank in initialize_dp_attention (#7584)
Parent: 2373faa3

Showing 1 changed file with 1 addition and 3 deletions:

python/sglang/srt/layers/dp_attention.py (+1 / -3)
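Summary of the change, as visible in the diff below: initialize_dp_attention previously derived a local_rank by hand, using tp_rank % (tp_size // dp_size) when DP attention is enabled and tp_rank otherwise, and passed that value into the group-coordinator call shown in the second hunk. The commit deletes both assignments and passes tp_group.local_rank instead.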
@@ -79,14 +79,12 @@ def initialize_dp_attention(
     )
 
     if enable_dp_attention:
-        local_rank = tp_rank % (tp_size // dp_size)
         _ATTN_DP_SIZE = dp_size
         if moe_dense_tp_size is None:
             _LOCAL_ATTN_DP_SIZE = _ATTN_DP_SIZE
         else:
             _LOCAL_ATTN_DP_SIZE = max(1, dp_size // (tp_size // moe_dense_tp_size))
     else:
-        local_rank = tp_rank
         _ATTN_DP_SIZE = 1
         _LOCAL_ATTN_DP_SIZE = 1
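Before the second hunk, a quick sketch of what the removed lines computed. The sizes below (tp_size=8, dp_size=4, gpus_per_node=4) are hypothetical, chosen only to show that the removed expression gives a rank inside each attention-TP subgroup, which is not in general the same as the GPU index on a node:

# Minimal standalone illustration (hypothetical values, not from the commit).
tp_size, dp_size = 8, 4            # hypothetical parallel layout
gpus_per_node = 4                  # hypothetical hardware layout (2 nodes)

attn_tp_size = tp_size // dp_size  # size of each attention-TP subgroup (= 2)
for tp_rank in range(tp_size):
    group_rank = tp_rank % attn_tp_size   # what the removed code computed
    device_idx = tp_rank % gpus_per_node  # typical node-local GPU index
    print(f"tp_rank={tp_rank}  group_rank={group_rank}  device_idx={device_idx}")
# e.g. tp_rank=2 gives group_rank=0 but device_idx=2, so the two values
# are not interchangeable in general.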
@@ -96,7 +94,7 @@ def initialize_dp_attention(
             list(range(head, head + _ATTN_TP_SIZE))
             for head in range(0, pp_size * tp_size, _ATTN_TP_SIZE)
         ],
-        local_rank,
+        tp_group.local_rank,
         torch.distributed.get_backend(tp_group.device_group),
         use_pynccl=SYNC_TOKEN_IDS_ACROSS_TP,
         use_pymscclpp=False,
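Why this likely matters (an inference from the surrounding code, not stated in the commit itself): in the vLLM-style GroupCoordinator design that sglang's distributed layer follows, the local_rank argument is used for device selection, so it should be the process's GPU index on its node rather than its rank inside the newly built subgroup. The removed expression tp_rank % (tp_size // dp_size) coincides with that GPU index only for particular single-node layouts, whereas tp_group.local_rank is the device index the existing tensor-parallel group already carries, which keeps the attention-TP coordinator bound to the correct device.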