OpenDAS / text-generation-inference · Commits

Commit d461d955, authored May 24, 2024 by huangwb
parent 52647592

fix baichuan config init bug
Showing 1 changed file with 3 additions and 3 deletions.

server/text_generation_server/models/custom_modeling/flash_llama_modeling.py (+3, -3)
@@ -39,7 +39,7 @@ from text_generation_server.utils.layers import (
 def load_attention(config, prefix, weights):
-    if config.num_attention_heads != config.num_key_value_heads:
+    if hasattr(config, 'num_key_value_heads') and config.num_attention_heads != config.num_key_value_heads:
         return _load_gqa(config, prefix, weights)
     else:
         if config.model_type == "baichuan":
@@ -107,7 +107,7 @@ class FlashLlamaAttention(torch.nn.Module):
         self.rotary_emb = PositionRotaryEmbedding.static(
             config=config,
             dim=self.head_size,
-            base=config.rope_theta,
+            base=config.rope_theta if hasattr(config, 'rope_theta') else 10000,
             device=weights.device,
         )
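The same pattern applies to the rotary-embedding base: configs without a `rope_theta` field fall back to 10000, the conventional LLaMA default. A minimal sketch, where `rope_base` is a hypothetical helper name:

```python
from types import SimpleNamespace


def rope_base(config):
    # Fall back to 10000 when the config omits rope_theta,
    # mirroring the conditional expression in the commit.
    return config.rope_theta if hasattr(config, 'rope_theta') else 10000


print(rope_base(SimpleNamespace(rope_theta=1000000.0)))  # -> 1000000.0
print(rope_base(SimpleNamespace()))                      # -> 10000
```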
@@ -121,7 +121,7 @@ class FlashLlamaAttention(torch.nn.Module):
         self.num_heads = self.num_heads // weights.process_group.size()
         self.num_key_value_heads = (
             config.num_key_value_heads // weights.process_group.size()
-        )
+        ) if hasattr(config, 'num_key_value_heads') else (config.num_attention_heads // weights.process_group.size())
         self.query_key_value = load_attention(config, prefix, weights)
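The last hunk shards KV heads across the tensor-parallel process group, defaulting to `num_attention_heads` when `num_key_value_heads` is absent (i.e. treating the model as plain MHA). A minimal sketch, with `kv_heads_per_rank` and `world_size` as illustrative stand-ins for the `process_group.size()` call in the commit:

```python
from types import SimpleNamespace


def kv_heads_per_rank(config, world_size):
    # Use num_key_value_heads when present; otherwise the model is plain
    # MHA and every attention head has its own KV head.
    kv_heads = (config.num_key_value_heads
                if hasattr(config, 'num_key_value_heads')
                else config.num_attention_heads)
    # Each tensor-parallel rank owns an equal slice of the KV heads.
    return kv_heads // world_size


gqa_cfg = SimpleNamespace(num_attention_heads=32, num_key_value_heads=8)
mha_cfg = SimpleNamespace(num_attention_heads=32)

print(kv_heads_per_rank(gqa_cfg, 4))  # -> 2: 8 KV heads over 4 ranks
print(kv_heads_per_rank(mha_cfg, 4))  # -> 8: falls back to 32 attention heads
```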