# <div align="center"><strong>KTransformers</strong></div>
## Introduction
KTransformers provides thousands of pretrained models for text classification, information extraction, question answering, summarization, translation, and text generation in over 100 languages. Its goal is to make state-of-the-art NLP technology accessible to everyone.

KTransformers provides APIs for quickly downloading and using pretrained models: apply a model to a given text, fine-tune it on your own dataset, and share it with the community via the [model hub](https://huggingface.co/models). At the same time, each Python module is fully self-contained, making it easy to modify for rapid research experiments.

KTransformers supports and integrates seamlessly with the three most popular deep learning libraries: [Jax](https://jax.readthedocs.io/en/latest/), [PyTorch](https://pytorch.org/), and [TensorFlow](https://www.tensorflow.org/). You can train a model in one framework, then load it and run inference in another.

## Installation
Supported component combinations

   | PyTorch version | fastpt version | KTransformers version | DTK version | Python version   | Recommended build      |
   | --------------- | -------------- | --------------------- | ----------- | ---------------- | ---------------------- |
   | 2.4.1           | 2.0.1          | 0.2.3                 | >= 25.04    | 3.8, 3.10, 3.11  | fastpt non-transcoding |

+ For PyTorch 2.4.1 or later together with DTK 25.04 or later, the fastpt non-transcoding build is recommended.
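As a purely illustrative sketch (not part of the KTransformers tooling; both helper names are hypothetical), the recommendation above can be expressed as a version check:

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '2.4.1' or '25.04' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def recommended_build(torch_version: str, dtk_version: str) -> str:
    """Return the recommended build mode per the compatibility table above."""
    if parse_version(torch_version) >= (2, 4, 1) and parse_version(dtk_version) >= (25, 4):
        return "fastpt non-transcoding"
    return "other"

print(recommended_build("2.4.1", "25.04"))  # fastpt non-transcoding
```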

### 1. Install via pip
Download the KTransformers whl package from the [光合开发者社区](https://download.sourcefind.cn:65024/4/main), choosing the package that matches your PyTorch and Python versions.
```shell
pip install torch*                      # the downloaded torch whl package
pip install fastpt* --no-deps           # the downloaded fastpt whl package
source /usr/local/bin/fastpt -E
pip install ktransformers* --no-deps    # the downloaded ktransformers-fastpt whl package
```
### 2. Build and install from source

#### Build environment preparation
The fastpt non-transcoding build is supported:

1. Using the 光源 PyTorch base image: download an image from the [光合开发者社区](https://sourcefind.cn/#/image/dcu/pytorch), choosing the version that matches your PyTorch, Python, DTK, and operating system.

2. Using an existing Python environment: install PyTorch. The fastpt whl packages are available from the [光合开发者社区](https://sourcefind.cn/#/image/dcu/pytorch); download the torch whl that matches your Python and DTK versions. Install as follows:
```shell
pip install cpufeature
pip install torch*              # the downloaded torch whl package
pip install fastpt* --no-deps   # the downloaded fastpt whl package; install torch first, then fastpt
pip install setuptools==59.5.0 wheel
```

#### Build and install from source
- Download the code
```shell
git clone https://developer.sourcefind.cn/codes/OpenDAS/ktransformers.git  # switch branches as needed for your build
```
- Source build steps (run inside the ktransformers directory):
```shell
# 1. Set the non-transcoding build environment variable
source /usr/local/bin/fastpt -C
# 2. Build the whl package and install it
bash install_dcu.sh
pip3 install dist/ktransformers*.whl --no-deps
```
#### Notes
+ If pip install downloads are slow, add the Tsinghua PyPI mirror: `-i https://pypi.tuna.tsinghua.edu.cn/simple/`
+ `ROCM_PATH` is the DTK installation path; it defaults to `/opt/dtk`.
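A minimal illustration of the fallback described in the note above (the variable name `ROCM_PATH` and the `/opt/dtk` default come from the note; the snippet itself is only a sketch):

```python
import os

# Resolve the DTK path: honor ROCM_PATH if it is set, otherwise fall back to /opt/dtk
rocm_path = os.environ.get("ROCM_PATH", "/opt/dtk")
print(rocm_path)
```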

## Verification
```
python3
Python 3.10.12 (main, Feb  4 2025, 14:57:36) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import ktransformers
>>> ktransformers.__version__
'0.2.3post1'
>>>
```
The version number tracks the upstream release; querying it should report the package version, for example 0.2.3post1.
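To compare a reported version against the table programmatically, a minimal sketch (assuming the version string follows the `X.Y.Z[postN]` pattern shown above; the helper name is hypothetical):

```python
import re

def base_version(v: str) -> tuple:
    """Strip a trailing 'postN' suffix and return the numeric X.Y.Z tuple."""
    core = re.sub(r"post\d*$", "", v)
    return tuple(int(part) for part in core.split("."))

# '0.2.3post1' and '0.2.3' share the same base version
print(base_version("0.2.3post1") == base_version("0.2.3"))  # True
```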

## Known Issues
-

## References
- [README_ORIGIN](README_ORIGIN.md)
- [README_zh-CN](README_zh-CN.md)
- [https://github.com/kvcache-ai/ktransformers](https://github.com/kvcache-ai/ktransformers)