Commit 4f87756c authored by TangJingqi

fix broken link

parent 77a34c28
@@ -165,7 +165,7 @@ Through these two rules, we place all previously unmatched layers (and their sub
## Multi-GPU
If you have multiple GPUs, you can set the device for each module to a different GPU.
-DeepseekV2-Chat has 60 layers; if we have 2 GPUs, we can allocate 30 layers to each GPU. Complete multi-GPU rule examples are [here](ktransformers/optimize/optimize_rules).
+DeepseekV2-Chat has 60 layers; if we have 2 GPUs, we can allocate 30 layers to each GPU. Complete multi-GPU rule examples are [here](https://github.com/kvcache-ai/ktransformers/blob/main/ktransformers/optimize/optimize_rules/DeepSeek-V2-Chat-multi-gpu.yaml).
<p align="center">
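For context, the linked rule file splits the 60 decoder layers across the two GPUs by matching layer names with regular expressions. Below is a minimal sketch of that idea, not the shipped file: the regexes, the `class: "default"` placeholder, and the `generate_device`/`prefill_device` kwargs follow the injection-rule pattern used in this tutorial but are assumptions here; consult `DeepSeek-V2-Chat-multi-gpu.yaml` for the authoritative rules.

```yaml
# Sketch: route layers 0-29 to cuda:0 and layers 30-59 to cuda:1.
- match:
    name: "^model\\.layers\\.([0-9]|[12][0-9])\\."   # layers 0-29
  replace:
    class: "default"            # keep the original module class (assumption)
    kwargs:
      generate_device: "cuda:0"
      prefill_device: "cuda:0"
- match:
    name: "^model\\.layers\\.([345][0-9])\\."        # layers 30-59
  replace:
    class: "default"
    kwargs:
      generate_device: "cuda:1"
      prefill_device: "cuda:1"
```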