OpenDAS / ColossalAI · Commit 38b792aa (Unverified)

[coloattention] fix import error (#4380)

fixed an import error

Authored Aug 04, 2023 by flybird1111; committed by GitHub on Aug 04, 2023.
Parent: 25c57b9f
Showing 2 changed files with 4 additions and 1 deletion:

  colossalai/kernel/cuda_native/mha/__init__.py   +3 -0
  tests/test_utils/test_flash_attention.py        +1 -1
colossalai/kernel/cuda_native/mha/__init__.py (new file, 0 → 100644)

+from .mha import ColoAttention
+
+__all__ = ['ColoAttention']
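The new `__init__.py` follows the standard package re-export pattern: a submodule defines the class, and the package's `__init__.py` re-imports it so callers can import from the package path instead of the submodule. A minimal, self-contained sketch of the same pattern, using a throwaway package name (`mypkg`, `impl`, and `Attention` are illustrative, not ColossalAI names):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Build a throwaway package on disk that mirrors the pattern: a submodule
# defines a class, and the package __init__.py re-exports it.
root = Path(tempfile.mkdtemp())
pkg = root / "mypkg"  # hypothetical package name
pkg.mkdir()
(pkg / "impl.py").write_text("class Attention:\n    pass\n")
(pkg / "__init__.py").write_text(
    "from .impl import Attention\n"
    "__all__ = ['Attention']\n"
)

sys.path.insert(0, str(root))
mypkg = importlib.import_module("mypkg")

# Both import paths now resolve to the same class object, so call sites
# can use the shorter package-level import.
assert mypkg.Attention is importlib.import_module("mypkg.impl").Attention
print(mypkg.__all__)  # ['Attention']
```

This is why the test below can switch from `colossalai.kernel.cuda_native.mha.mha` to a shorter package-level import path.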
tests/test_utils/test_flash_attention.py

@@ -9,7 +9,7 @@ from colossalai.kernel.cuda_native.mha.mem_eff_attn import HAS_MEM_EFF_ATTN
 from colossalai.testing import clear_cache_before_run, parameterize

 if HAS_MEM_EFF_ATTN or HAS_FLASH_ATTN:
-    from colossalai.kernel.cuda_native.mha.mha import ColoAttention
+    from colossalai.kernel.cuda_native import ColoAttention
     from colossalai.kernel.cuda_native.scaled_softmax import AttnMaskType

 DTYPE = [torch.float16, torch.bfloat16, torch.float32]
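The test gates its `ColoAttention` import behind `HAS_MEM_EFF_ATTN` / `HAS_FLASH_ATTN` capability flags, so it can be collected even when the optional CUDA backends are absent. A minimal sketch of how such flags are commonly computed, using `importlib.util.find_spec` to probe availability without importing; the module names in the comments are illustrative, not ColossalAI's actual probe logic:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` is importable in this environment."""
    return importlib.util.find_spec(name) is not None

# Capability flags in the same HAS_* style as the test above; "flash_attn"
# and "xformers" are plausible backend packages, used here as examples.
HAS_FLASH_ATTN = has_module("flash_attn")
HAS_MEM_EFF_ATTN = has_module("xformers")

if HAS_FLASH_ATTN or HAS_MEM_EFF_ATTN:
    # Only pull in the attention wrapper when at least one backend exists,
    # e.g.: from colossalai.kernel.cuda_native import ColoAttention
    pass

print(has_module("sys"))             # True: stdlib is always importable
print(has_module("no_such_pkg_xyz")) # False
```

Probing with `find_spec` avoids the cost (and possible side effects) of actually importing a heavy CUDA extension just to learn whether it is present.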