"vllm_flash_attn/flash_attn_interface.py" did not exist on "197f2083a2f0953af9319cf4ce32d0bf2aae4bd8"
-
pppppM authored
* add calculation of qparams
* support offload inference
* add collect functions (mod, weight)
* stats KV scales
* update init
* add user guide
* fix hints
* fix comments & support TurboMind format
* update user guide
* fix slice KV cache error & support pileval dataset (used in llm-awq)
* fix wrong num heads slice
* update default dataset
* fix conflict
* fix hints
* add .gitignore
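The "calculate qparams" and "stats KV scales" items above suggest a calibration pass that records activation statistics and turns them into quantization parameters. A minimal sketch of that idea follows; it is illustrative only, not the PR's actual code, and the function and layer names (`calibrate_kv_scales`, `layers.0.attn.k`) are hypothetical:

```python
# Illustrative sketch (not the PR's code): derive symmetric int8
# quantization scales for KV-cache tensors from collected
# absolute-max statistics, as a "calculate qparams" step might do.

def calibrate_kv_scales(kv_absmax, n_bits=8):
    """Map each layer's observed absmax to a per-layer scale.

    kv_absmax: dict mapping layer name -> observed abs-max value (float).
    Returns a dict mapping layer name -> scale such that
    round(x / scale) fits in a signed n_bits integer.
    """
    qmax = 2 ** (n_bits - 1) - 1  # 127 for int8
    return {name: absmax / qmax for name, absmax in kv_absmax.items()}

# Hypothetical statistics gathered during a calibration forward pass.
stats = {"layers.0.attn.k": 6.35, "layers.0.attn.v": 2.54}
scales = calibrate_kv_scales(stats)
```

In a real pipeline the absmax values would be accumulated by forward hooks over a calibration dataset (such as pileval, mentioned above), and the resulting scales exported alongside the model weights.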
3fff964d