Commit 64ca424e (Unverified)
Authored Sep 14, 2023 by Woosuk Kwon; committed by GitHub on Sep 14, 2023
Parent: b5f93d06

Fix warning message on LLaMA FastTokenizer (#1037)
Showing 1 changed file with 5 additions and 5 deletions.

vllm/transformers_utils/tokenizer.py (+5, -5)
@@ -28,8 +28,8 @@ def get_tokenizer(
     if ("llama" in tokenizer_name.lower() and kwargs.get("use_fast", True)
             and tokenizer_name != _FAST_LLAMA_TOKENIZER):
         logger.info(
-            "For some LLaMA-based models, initializing the fast tokenizer may "
-            "take a long time. To eliminate the initialization time, consider "
+            "For some LLaMA V1 models, initializing the fast tokenizer may "
+            "take a long time. To reduce the initialization time, consider "
             f"using '{_FAST_LLAMA_TOKENIZER}' instead of the original "
             "tokenizer.")
     try:
@@ -41,9 +41,9 @@ def get_tokenizer(
     except TypeError as e:
         # The LLaMA tokenizer causes a protobuf error in some environments.
         err_msg = (
-            "Failed to load the tokenizer. If you are using a LLaMA-based "
-            f"model, use '{_FAST_LLAMA_TOKENIZER}' instead of the original "
-            "tokenizer.")
+            "Failed to load the tokenizer. If you are using a LLaMA V1 model "
+            f"consider using '{_FAST_LLAMA_TOKENIZER}' instead of the "
+            "original tokenizer.")
         raise RuntimeError(err_msg) from e
     except ValueError as e:
         # If the error pertains to the tokenizer class not existing or not
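For context, the warning this commit rewords fires only when a LLaMA tokenizer other than the known-fast one is requested with use_fast left enabled. Below is a minimal, self-contained sketch of that check, not vLLM's actual module; the value assigned to _FAST_LLAMA_TOKENIZER here is an assumption about what the constant pointed to at the time, and maybe_warn_slow_llama_tokenizer is a hypothetical helper name.

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("tokenizer-sketch")

# Assumption: a LLaMA tokenizer repo on the Hugging Face Hub that is known
# to load quickly as a fast tokenizer.
_FAST_LLAMA_TOKENIZER = "hf-internal-testing/llama-tokenizer"


def maybe_warn_slow_llama_tokenizer(tokenizer_name: str, **kwargs) -> None:
    """Sketch of the warning path touched by this diff."""
    if ("llama" in tokenizer_name.lower() and kwargs.get("use_fast", True)
            and tokenizer_name != _FAST_LLAMA_TOKENIZER):
        logger.info(
            "For some LLaMA V1 models, initializing the fast tokenizer may "
            "take a long time. To reduce the initialization time, consider "
            f"using '{_FAST_LLAMA_TOKENIZER}' instead of the original "
            "tokenizer.")


# Triggers the hint: a LLaMA tokenizer name with use_fast at its default.
maybe_warn_slow_llama_tokenizer("huggyllama/llama-7b")
# Stays silent: the known-fast tokenizer is requested explicitly.
maybe_warn_slow_llama_tokenizer(_FAST_LLAMA_TOKENIZER)

Passing use_fast=False would also skip the hint, since the message is only about the cost of building the fast tokenizer.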