jerrrrry / infinicore · Commits · 11d8d0bb
Commit 11d8d0bb authored Mar 05, 2026 by wooway777
issue/1033 - update readme
parent d790f7b4
Showing 1 changed file (README.md) with 15 additions and 0 deletions.
...
@@ -107,6 +107,7 @@ python scripts/install.py [XMAKE_CONFIG_FLAGS]
| `--ali-ppu=[y\|n]` | Whether to build the Alibaba PPU interface implementation | n |
| `--ninetoothed=[y\|n]` | Whether to build the NineToothed implementation | n |
| `--ccl=[y\|n]` | Whether to build the InfiniCCL communication library interface implementation | n |
| `--graph=[y\|n]` | Whether to build the CUDA graph interface implementation | n |
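Taken together with the install script shown in the hunk header, passing a couple of these switches might look like the following sketch (the flag values are illustrative; every switch in the table defaults to n):

```shell
# Illustrative only: enable the InfiniCCL and CUDA graph interfaces.
# All flags default to 'n' per the table above.
FLAGS="--ccl=y --graph=y"
CMD="python scripts/install.py $FLAGS"
echo "$CMD"
```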
##### Manually installing the underlying libraries
...
@@ -154,6 +155,20 @@ python scripts/install.py [XMAKE_CONFIG_FLAGS]
xmake f --ascend-npu=true -cv
```
##### Experimental feature -- using operators from the flash attention library

```shell
(1) Fetch the cutlass and flash-attn source code into the third_party directory (--recursive is not needed)
(2) Set the CUTLASS_ROOT environment variable to the cutlass path from step (1)
(3) During xmake configuration, additionally enable the --aten switch and set the --flash-attn library location, e.g.:
    xmake f --nv-gpu=y --ccl=y --cuda=$CUDA_HOME --aten=y --flash-attn=<path-to>/InfiniCore/third_party/flash-attention -cv
(4) The flash attention library is built and installed together with infinicore_cpp_api
```
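Steps (1) and (2) above can be sketched as follows; note that the clone URLs and the `INFINICORE_ROOT` variable are assumptions for illustration, not part of the project's documented workflow:

```shell
# Sketch of steps (1)-(2). INFINICORE_ROOT is a hypothetical variable
# pointing at your InfiniCore checkout; the clone URLs are assumptions.
INFINICORE_ROOT="${INFINICORE_ROOT:-$PWD}"
mkdir -p "$INFINICORE_ROOT/third_party"
cd "$INFINICORE_ROOT/third_party"
# git clone https://github.com/NVIDIA/cutlass.git            # --recursive not needed
# git clone https://github.com/Dao-AILab/flash-attention.git
export CUTLASS_ROOT="$INFINICORE_ROOT/third_party/cutlass"   # step (2)
echo "$CUTLASS_ROOT"
```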
2. Build and install

The default installation path is `$HOME/.infini`.
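As a quick sketch of where the build lands, assuming the default prefix stated above (the `INFINI_ROOT` variable name is hypothetical, used here only as an override hook):

```shell
# Default installation prefix, per the README; INFINI_ROOT is a
# hypothetical override variable for illustration.
INFINI_ROOT="${INFINI_ROOT:-$HOME/.infini}"
echo "Libraries and headers will be installed under $INFINI_ROOT"
```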
...