```shell
# 2. Convert the InternLM model to TurboMind's format and save it in the home folder of OpenCompass
lmdeploy convert internlm /path/to/internlm-20b \
    --dst-path {/home/folder/of/opencompass}/turbomind
```
**Note**:
If evaluating the InternLM Chat model, make sure to pass `internlm-chat` as the model name instead of `internlm` when converting the model format.
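A sketch of that conversion, assuming the chat-model weights live at `/path/to/internlm-20b` as above (the path and the `{...}` placeholder are illustrative):

```shell
# Note the model name is `internlm-chat` rather than `internlm`
lmdeploy convert internlm-chat /path/to/internlm-20b \
    --dst-path {/home/folder/of/opencompass}/turbomind
```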
### Evaluation with Turbomind Python API (recommended)
...

You are expected to get the evaluation results after the inference and evaluation.
### Evaluation with Turbomind gRPC API (optional)
Convert the model to TurboMind format using lmdeploy:
```shell
lmdeploy convert internlm /path/to/internlm-20b \
    --dst-path {/home/folder/of/opencompass}/turbomind
```
**Note**:
If evaluating the InternLM Chat model, make sure to pass `internlm-chat` as the model name instead of `internlm` when converting the model format.