[Feature] Add a torch client (#19)
* draft torch client
* deal with space of tokenizer
* support tensor parallel
* fix
* fix
* move folder
* move instruction to readme
* move to torch/
* rename client to chat
* very bad response
* stash
* rename streamer
* support internlm
* change default args
* remove test
* improve instructions
* remove module docstring
* decrease header level of torch model
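The commits above add a PyTorch-backed chat client that loads a Hugging Face-style causal LM and streams its reply back to the terminal. As an illustration only (this is not the contents of lmdeploy/torch/chat.py; the model id, prompt handling, sampling defaults, and the use of transformers' TextIteratorStreamer are assumptions), a minimal streaming chat loop of this kind could look like:

```python
# Illustrative sketch only -- not the actual lmdeploy/torch/chat.py.
# Model id, prompt format, and sampling defaults are assumptions.
from threading import Thread

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

model_path = "internlm/internlm-chat-7b"  # assumed; use the model you actually serve
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map="auto", trust_remote_code=True)

while True:
    prompt = input("\nuser> ")
    if prompt.strip() in ("exit", "quit"):
        break
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # TextIteratorStreamer yields decoded text chunks while generate() runs in a thread.
    streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
    kwargs = dict(**inputs, streamer=streamer, max_new_tokens=512,
                  do_sample=True, temperature=0.8, top_p=0.95)
    Thread(target=model.generate, kwargs=kwargs).start()
    for chunk in streamer:
        print(chunk, end="", flush=True)
    print()
```

Tensor parallelism, InternLM-specific prompt templates, and the default arguments mentioned in the commits are omitted here; see the repository README for the actual instructions.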
Showing 3 changed files:
lmdeploy/torch/__init__.py    0 → 100644
lmdeploy/torch/chat.py    0 → 100644
lmdeploy/torch/utils.py    0 → 100644
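One commit calls out dealing with the tokenizer's spaces: when detokenizing incrementally, decoding tokens one at a time can drop the leading space that SentencePiece tokenizers attach to word-initial pieces. A common workaround, shown below as an assumed sketch rather than what the new lmdeploy/torch/utils.py actually does, is to decode the whole generated prefix and emit only the not-yet-printed suffix:

```python
# Assumed sketch of space-preserving incremental detokenization;
# not lmdeploy's actual utility code.
from transformers import AutoTokenizer

def decode_new_text(tokenizer, token_ids, printed_len):
    """Decode the whole generated prefix and return only the unseen suffix.

    Decoding one token at a time loses the leading space on word-initial
    pieces; decoding the full prefix and slicing off what was already
    printed keeps the spacing intact.
    """
    text = tokenizer.decode(token_ids, skip_special_tokens=True)
    return text[printed_len:], len(text)

if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("internlm/internlm-chat-7b",  # assumed model id
                                        trust_remote_code=True)
    ids = tok("streaming keeps its spaces").input_ids
    printed = 0
    for i in range(1, len(ids) + 1):  # simulate tokens arriving one by one
        piece, printed = decode_new_text(tok, ids[:i], printed)
        print(piece, end="", flush=True)
    print()
```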