change / sglang · Commits · 01e59e82
"git@developer.sourcefind.cn:change/sglang.git" did not exist on "c3eac1b010b3da3086457e40af555690da0787a6"
Unverified commit 01e59e82, authored Oct 12, 2025 by Liangsheng Yin, committed via GitHub on Oct 12, 2025

Fix CI break by express-laned PRs. (#11499)
Parent: 99a0704a
Showing 2 changed files with 6 additions and 2 deletions:
python/sglang/srt/layers/attention/flashattention_backend.py (+2, -1)
python/sglang/srt/layers/attention/flashinfer_backend.py (+4, -1)
python/sglang/srt/layers/attention/flashattention_backend.py (view file @ 01e59e82)

 from __future__ import annotations

 from dataclasses import dataclass
-from typing import TYPE_CHECKING, Optional, Union
+from typing import TYPE_CHECKING, Optional

 import numpy as np
 import torch
...
@@ -10,6 +10,7 @@ import triton.language as tl
 from sglang.srt.configs.model_config import AttentionArch
 from sglang.srt.layers.attention.base_attn_backend import AttentionBackend
+from sglang.srt.layers.radix_attention import AttentionType
 from sglang.srt.managers.schedule_batch import global_server_args_dict
 from sglang.srt.model_executor.forward_batch_info import ForwardBatch, ForwardMode
 from sglang.srt.speculative.spec_info import SpecInput
...
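These two hunks are import hygiene: the second supplies the AttentionType import that code merged into this module via the express lane presumably already references, and the first drops a Union import presumably left unused by the same merges. A minimal sketch, not sglang code, of the failure mode a missing import causes when two PRs land out of order; the usage site is illustrative only:

# Referencing a name whose import only exists on the other PR's branch
# fails when the merged module is imported, which is how an express-laned
# PR can break CI for everyone.
try:
    _ = AttentionType.ENCODER_ONLY  # used without being imported
except NameError as exc:
    # CI surfaces this as a NameError at import time (or F821 under lint).
    print(f"broken merge: {exc}")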
python/sglang/srt/layers/attention/flashinfer_backend.py (view file @ 01e59e82)

...
@@ -728,7 +728,10 @@ class FlashInferAttnBackend(AttentionBackend):
             )
         else:
             causal = True
-            if layer.is_cross_attention or layer.attn_type == AttentionType.ENCODER_ONLY:
+            if (
+                layer.is_cross_attention
+                or layer.attn_type == AttentionType.ENCODER_ONLY
+            ):
                 causal = False

         if save_kv_cache and layer.attn_type == AttentionType.ENCODER_ONLY:
             save_kv_cache = False
...
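The flashinfer hunk is behavior-preserving: it only wraps an over-long condition to a formatter-friendly width. The logic it preserves is sketched below as a standalone helper; attention_flags is a hypothetical name, the AttentionType enum is a stand-in for sglang.srt.layers.radix_attention.AttentionType, and the real backend mutates locals inline rather than calling a helper:

from enum import Enum, auto


class AttentionType(Enum):
    # Stand-in; only the member used by this diff is mirrored here.
    DECODER = auto()
    ENCODER_ONLY = auto()


def attention_flags(
    attn_type: AttentionType, is_cross_attention: bool, save_kv_cache: bool
) -> tuple[bool, bool]:
    # Cross-attention and encoder-only layers attend bidirectionally,
    # so the causal mask is disabled for them.
    causal = not (is_cross_attention or attn_type == AttentionType.ENCODER_ONLY)
    # Encoder-only layers are never revisited during autoregressive
    # decoding, so writing their KV pairs to the cache is skipped.
    if attn_type == AttentionType.ENCODER_ONLY:
        save_kv_cache = False
    return causal, save_kv_cache


# e.g. an encoder-only layer: non-causal, and its KV-cache write is skipped.
assert attention_flags(AttentionType.ENCODER_ONLY, False, True) == (False, False)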