Unverified Commit b6bb8ce2 authored by del-zhenwu, committed by GitHub

[doc] use internlm-chat-7b (#124)

* Update README.md: use internlm-chat-7b

* Update README_zh-CN.md: use internlm-chat-7b
parent db282626
````diff
README.md
@@ -60,14 +60,14 @@ pip install -e .
 # Make sure you have git-lfs installed (https://git-lfs.com)
 git lfs install
-git clone https://huggingface.co/internlm/internlm-7b /path/to/internlm-7b
+git clone https://huggingface.co/internlm/internlm-chat-7b /path/to/internlm-chat-7b
 # if you want to clone without large files – just their pointers
 # prepend your git clone with the following env var:
 GIT_LFS_SKIP_SMUDGE=1
 # 2. Convert InternLM model to turbomind's format, which will be in "./workspace" by default
-python3 -m lmdeploy.serve.turbomind.deploy internlm-7b /path/to/internlm-7b hf
+python3 -m lmdeploy.serve.turbomind.deploy internlm-7b /path/to/internlm-chat-7b hf
 ```
````
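As an aside (not part of this commit), the internlm-chat-7b weights can also be fetched without git-lfs by using the huggingface_hub Python client. The sketch below is a hedged illustration only; it assumes `huggingface_hub` is installed, and `/path/to/internlm-chat-7b` is just the placeholder path used in the README.

```python
# Hedged sketch: download internlm/internlm-chat-7b with huggingface_hub
# instead of `git clone` + git-lfs. Assumes `pip install huggingface_hub`.
from huggingface_hub import snapshot_download

# local_dir mirrors the placeholder /path/to/internlm-chat-7b from the README.
model_path = snapshot_download(
    repo_id="internlm/internlm-chat-7b",
    local_dir="/path/to/internlm-chat-7b",
)
print(f"Model downloaded to: {model_path}")
```

The converter step would then point at the same directory, e.g. `python3 -m lmdeploy.serve.turbomind.deploy internlm-7b /path/to/internlm-chat-7b hf` as shown in the diff above.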
````diff
README_zh-CN.md
@@ -59,14 +59,14 @@ pip install -e .
 # Make sure you have git-lfs installed (https://git-lfs.com)
 git lfs install
-git clone https://huggingface.co/internlm/internlm-7b /path/to/internlm-7b
+git clone https://huggingface.co/internlm/internlm-chat-7b /path/to/internlm-chat-7b
 # if you want to clone without large files – just their pointers
 # prepend your git clone with the following env var:
 GIT_LFS_SKIP_SMUDGE=1
 # 2. Convert the model to turbomind's required format; the default output path is ./workspace
-python3 -m lmdeploy.serve.turbomind.deploy internlm-7b /path/to/internlm-7b hf
+python3 -m lmdeploy.serve.turbomind.deploy internlm-7b /path/to/internlm-chat-7b hf
 ```
````