Unverified commit 7e0b75bb, authored by lvhan028, committed by GitHub

bump version to v0.0.2 (#177)

* bump version to v0.0.2

* fix command

* update installation and inference section
parent 859658eb
@@ -55,9 +55,7 @@ Below are quick steps for installation:
 ```shell
 conda create -n lmdeploy python=3.10 -y
 conda activate lmdeploy
-git clone https://github.com/InternLM/lmdeploy.git
-cd lmdeploy
-pip install -e .
+pip install lmdeploy
 ```
 ### Deploy InternLM
@@ -83,8 +81,7 @@ python3 -m lmdeploy.serve.turbomind.deploy internlm-chat-7b /path/to/internlm-ch
 #### Inference by TurboMind
 ```shell
-docker run --gpus all --rm -v $(pwd)/workspace:/workspace -it openmmlab/lmdeploy:latest \
-    python3 -m lmdeploy.turbomind.chat /workspace
+python -m lmdeploy.turbomind.chat ./workspace
 ```
 ```{note}
@@ -109,7 +106,7 @@ python3 -m lmdeploy.serve.client {server_ip_address}:33337
 or webui,
 ```shell
-python3 -m lmdeploy.app {server_ip_address}:33337 internlm
+python3 -m lmdeploy.app {server_ip_address}:33337
 ```
 ![](https://github.com/InternLM/lmdeploy/assets/67539920/08d1e6f2-3767-44d5-8654-c85767cec2ab)
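As a quick sanity check after the `pip install lmdeploy` step above, the package should be importable without loading a model. A minimal sketch (the helper name `lmdeploy_installed` is illustrative, not part of the package):

```python
import importlib.util


def lmdeploy_installed() -> bool:
    """Return True when the lmdeploy package can be found on the Python path."""
    # find_spec only locates the package; it does not import or initialize it.
    return importlib.util.find_spec("lmdeploy") is not None


print(lmdeploy_installed())
```

If this prints `False`, the install step did not land in the active conda environment.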
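The chat command above can also be launched programmatically, e.g. from a wrapper script. A minimal sketch, assuming only the `lmdeploy.turbomind.chat` module path shown in the diff; the `turbomind_chat_cmd` helper is hypothetical:

```python
import sys


def turbomind_chat_cmd(workspace: str) -> list:
    """Build the argv for `python -m lmdeploy.turbomind.chat <workspace>`."""
    # sys.executable keeps the command pinned to the current interpreter.
    return [sys.executable, "-m", "lmdeploy.turbomind.chat", workspace]


print(turbomind_chat_cmd("./workspace"))
```

The resulting list can be passed to `subprocess.run` to start the interactive chat.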
@@ -54,9 +54,7 @@ TurboMind's output token throughput exceeds 2000 token/s, overall, compared with DeepSpeed
 ```shell
 conda create -n lmdeploy python=3.10 -y
 conda activate lmdeploy
-git clone https://github.com/InternLM/lmdeploy.git
-cd lmdeploy
-pip install -e .
+pip install lmdeploy
 ```
 ### Deploy InternLM
@@ -82,8 +80,7 @@ python3 -m lmdeploy.serve.turbomind.deploy internlm-chat-7b /path/to/internlm-ch
 #### Inference with turbomind
 ```shell
-docker run --gpus all --rm -v $(pwd)/workspace:/workspace -it openmmlab/lmdeploy:latest \
-    python3 -m lmdeploy.turbomind.chat /workspace
+python3 -m lmdeploy.turbomind.chat ./workspace
 ```
 ```{note}
 # Copyright (c) OpenMMLab. All rights reserved.
 from typing import Tuple
-__version__ = '0.0.1'
+__version__ = '0.0.2'
 short_version = __version__
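The version bump above changes a plain string; the `Tuple` import in the same file suggests it is also parsed into a comparable tuple. A minimal sketch of such a parser (the `parse_version_info` name is assumed, and real pre-release suffixes like `rc` are not handled here):

```python
from typing import Tuple


def parse_version_info(version_str: str) -> Tuple[int, ...]:
    """Split a version string such as '0.0.2' into an integer tuple."""
    return tuple(int(part) for part in version_str.split("."))


# Tuples compare element-wise, so version ordering works naturally.
version_info = parse_version_info("0.0.2")
```

With this, `parse_version_info('0.0.2') > parse_version_info('0.0.1')` holds, which is why tuple form is handier than the raw string for version checks.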