We use [img2img-turbo](https://github.com/GaParmar/img2img-turbo) to train the sketch-to-image LoRA.
Nunchaku is also inspired by many open-source libraries, including (but not limited to) [TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM), [vLLM](https://github.com/vllm-project/vllm), [QServe](https://github.com/mit-han-lab/qserve), [AWQ](https://github.com/mit-han-lab/llm-awq), [FlashAttention-2](https://github.com/Dao-AILab/flash-attention), and [Atom](https://github.com/efeslab/Atom).
## Star History
[](https://www.star-history.com/#mit-han-lab/nunchaku&Date)