ModelZoo / Qwen_lmdeploy · Commits

Commit 70a5c63a (unverified), authored Oct 19, 2023 by AllentDan; committed by GitHub on Oct 19, 2023
add solar chat template (#576)
parent eb3b4dc9

Changes: 3 changed files with 69 additions and 0 deletions
- README.md: +1, -0
- README_zh-CN.md: +1, -0
- lmdeploy/model.py: +67, -0
README.md

@@ -63,6 +63,7 @@ LMDeploy is a toolkit for compressing, deploying, and serving LLM, developed by
 | :----------: | :-------------: | :--: | :-----: | :---: | :--: |
 | Llama        | Yes             | Yes  | Yes     | Yes   | No   |
 | Llama2       | Yes             | Yes  | Yes     | Yes   | No   |
+| SOLAR        | Yes             | Yes  | Yes     | Yes   | No   |
 | InternLM-7B  | Yes             | Yes  | Yes     | Yes   | No   |
 | InternLM-20B | Yes             | Yes  | Yes     | Yes   | No   |
 | QWen-7B      | Yes             | Yes  | Yes     | No    | No   |
README_zh-CN.md

@@ -64,6 +64,7 @@ LMDeploy 由 [MMDeploy](https://github.com/open-mmlab/mmdeploy) 和 [MMRazor](ht
 | :----------: | :------: | :--: | :-----: | :---: | :--: |
 | Llama        | Yes      | Yes  | Yes     | Yes   | No   |
 | Llama2       | Yes      | Yes  | Yes     | Yes   | No   |
+| SOLAR        | Yes      | Yes  | Yes     | Yes   | No   |
 | InternLM-7B  | Yes      | Yes  | Yes     | Yes   | No   |
 | InternLM-20B | Yes      | Yes  | Yes     | Yes   | No   |
 | QWen-7B      | Yes      | Yes  | Yes     | No    | No   |
lmdeploy/model.py

@@ -575,6 +575,73 @@ class CodeLlama(Llama2):
         return super().messages2prompt(messages, sequence_start)


+@MODELS.register_module(name='solar')
+class SOLAR(BaseModel):
+    """Chat template of SOLAR model.
+
+    `https://huggingface.co/upstage/SOLAR-0-70b-16bit`
+    """
+
+    def __init__(self,
+                 b_sys='### System:\n',
+                 e_sys='\n\n',
+                 boh='### User:\n',
+                 eoh='\n\n',
+                 boa='### Assistant:\n',
+                 eoa='\n\n',
+                 system='',
+                 session_len=2048,
+                 **kwargs):
+        super().__init__(**kwargs)
+        self.b_sys = b_sys
+        self.e_sys = e_sys
+        self.boh = boh
+        self.eoh = eoh
+        self.boa = boa
+        self.eoa = eoa
+        self.system = system
+        self.session_len = session_len
+
+    def decorate_prompt(self, prompt, sequence_start=True):
+        """Return the prompt that is concatenated with other elements in the
+        chat template.
+
+        Args:
+            prompt (str): user's input prompt
+            sequence_start (bool): indicator for the first round chat of a
+                session sequence
+        Returns:
+            str: the concatenated prompt
+        """
+        assert self.capability == 'chat', \
+            f'{type(self).__name__} has no capability of {self.capability}'
+        if sequence_start:
+            return f'{self.b_sys}{self.system}{self.e_sys}' \
+                   f'{self.boh}{prompt}{self.eoh}{self.boa}'
+        return f'{self.boh}{prompt}{self.eoh}{self.boa}'
+
+    def messages2prompt(self, messages, sequence_start=True):
+        """Return the prompt that is concatenated with other elements in the
+        chat template.
+
+        Args:
+            messages (str | List): user's input prompt
+        Returns:
+            str: the concatenated prompt
+        """
+        if isinstance(messages, str):
+            return self.get_prompt(messages, sequence_start)
+        system, users, assistants = self._translate_messages(messages)
+        system = self.system if not system else system
+        ret = f'{self.b_sys}{system}{self.e_sys}'
+        for i, (user, assistant) in enumerate(zip(users, assistants)):
+            ret += f'{self.boh}{user}{self.eoh}{self.boa}'
+            if assistant:
+                ret += f'{assistant}{self.eoa}'
+        return ret
+
+
 def main(model_name: str = 'test'):
     assert model_name in MODELS.module_dict.keys(), \
         f"'{model_name}' is not supported. " \
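The template above boils down to a fixed concatenation of marker strings (`b_sys`/`e_sys` around the system text, `boh`/`eoh` around each user turn, `boa`/`eoa` around each assistant turn). As a minimal standalone sketch of the resulting prompt layout — independent of lmdeploy's `BaseModel`, with `build_solar_prompt` a hypothetical helper introduced here purely for illustration — the format can be reproduced like this:

```python
# Minimal sketch of the SOLAR prompt layout added in this commit.
# Marker constants mirror the defaults in SOLAR.__init__; the helper
# below is hypothetical, not part of lmdeploy's API.
B_SYS, E_SYS = '### System:\n', '\n\n'
BOH, EOH = '### User:\n', '\n\n'
BOA, EOA = '### Assistant:\n', '\n\n'


def build_solar_prompt(messages, system=''):
    """Concatenate (role, text) turns into a SOLAR-style prompt."""
    users = [text for role, text in messages if role == 'user']
    assistants = [text for role, text in messages if role == 'assistant']
    # Pad so the final user turn ends with an open assistant slot,
    # mirroring how messages2prompt leaves the prompt for generation.
    assistants += [None] * (len(users) - len(assistants))
    ret = f'{B_SYS}{system}{E_SYS}'
    for user, assistant in zip(users, assistants):
        ret += f'{BOH}{user}{EOH}{BOA}'
        if assistant:
            ret += f'{assistant}{EOA}'
    return ret


print(build_solar_prompt([('user', 'Hi!')], system='Be concise.'))
```

A single-turn call produces `### System:\nBe concise.\n\n### User:\nHi!\n\n### Assistant:\n`, i.e. the prompt always ends with the open `### Assistant:` header so the model's completion becomes the assistant turn.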