gaoqiong / flash-attention

Commit 9af165c3 authored Jun 07, 2023 by Pierce Freeman

Clean setup.py imports

parent eb812c20
Showing 1 changed file with 1 addition and 3 deletions
setup.py @ 9af165c3

@@ -9,9 +9,7 @@ from packaging.version import parse, Version
 import platform
 from setuptools import setup, find_packages
-from setuptools.command.build import build
 import subprocess
-from setuptools.command.bdist_egg import bdist_egg
 import urllib.request
 import urllib.error
@@ -214,7 +212,7 @@ class CachedWheelsCommand(_bdist_wheel):
     """
     def run(self):
         if FORCE_BUILD:
-            return build.run(self)
+            return super().run()
         raise_if_cuda_home_none("flash_attn")
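The substance of the change: when a rebuild is forced, CachedWheelsCommand.run previously called the setuptools build command class's run directly on a bdist_wheel instance; super().run() instead dispatches to the class's real parent, _bdist_wheel, which also leaves the build and bdist_egg imports unused, hence their removal. Below is a minimal, self-contained sketch of the resulting pattern, not the project's full setup.py: the FLASH_ATTENTION_FORCE_BUILD variable and the wheel.bdist_wheel import follow the 2023-era flash-attention layout, and the cached-wheel download path is elided.

    import os

    from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

    # Flag mirroring the FORCE_BUILD name referenced in the diff.
    FORCE_BUILD = os.getenv("FLASH_ATTENTION_FORCE_BUILD", "FALSE") == "TRUE"

    class CachedWheelsCommand(_bdist_wheel):
        """Prefer a pre-built wheel when available; build from source otherwise."""

        def run(self):
            if FORCE_BUILD:
                # super().run() runs the standard bdist_wheel pipeline on
                # this instance; the removed build.run(self) invoked the
                # build command class's run on a bdist_wheel object.
                return super().run()
            # Otherwise look up and install a cached wheel (elided here).

Wired up via something like setup(cmdclass={"bdist_wheel": CachedWheelsCommand}, ...), pip's wheel build then flows through this run method.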