gaoqiong / flash-attention

Commit 733117ea, authored Apr 22, 2024 by Woosuk Kwon
Parent: cb02853e

    Add build script

Showing 1 changed file with 14 additions and 0 deletions.

build.sh (new file, mode 100755) +14 −0
#!/bin/bash
# Make `conda activate` available in this non-interactive shell.
eval "$(conda shell.bash hook)"

PYTORCH_VERSION="2.2.1"

# Build a wheel against each supported Python version (3.8–3.11).
for PYTHON_VERSION in 38 39 310 311; do
    source ~/.bashrc
    conda activate vllm-flash-py${PYTHON_VERSION}
    conda env list
    pip install packaging ninja
    pip install torch==${PYTORCH_VERSION}
    time python setup.py bdist_wheel --dist-dir=dist
done
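The loop above assumes that conda environments named vllm-flash-py38 through vllm-flash-py311 already exist before the script runs. A minimal sketch of that naming convention, using a hypothetical helper `env_name` (not part of the commit) that maps a dotted Python version to the env-name suffix used in the loop:

```shell
#!/bin/bash
# Hypothetical helper: derive the env name build.sh activates from a
# dotted Python version, e.g. 3.10 -> vllm-flash-py310.
env_name() {
    local ver="$1"
    # ${ver//./} strips every dot from the version string.
    echo "vllm-flash-py${ver//./}"
}

# One env per Python version the build loop iterates over.
for v in 3.8 3.9 3.10 3.11; do
    env_name "$v"
done
```

Creating the environments themselves would then be something like `conda create -n "$(env_name 3.10)" python=3.10`, though the commit does not show that setup step.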