Merge pull request #75 from InfiniTensor/issue/74
Issue/74: Adapt the Llama model to InfiniCore::nn::module
Showing changed files:
- .gitmodules (new file, mode 100644)
- csrc/cache/kv_cache.hpp (new file, mode 100644)
- csrc/models/llama/llama.hpp (new file, mode 100644)