OpenDAS / torch-harmonics · Commit 584e1bd6 (unverified)

Authored Jun 13, 2025 by Thorsten Kurth; committed by GitHub on Jun 13, 2025

Merge pull request #78 from NVIDIA/tkurth/attention-perf-test-fix

fixing attention perf test attempt 1

Parents: 26ce5cb5, 47beb41a
Showing 1 changed file with 1 addition and 1 deletion.
tests/test_attention.py (view file @ 584e1bd6)

...
@@ -214,7 +214,6 @@ class TestNeighborhoodAttentionS2(unittest.TestCase):
         self.assertTrue(torch.allclose(grad, grad_ref, atol=atol, rtol=rtol), f"Parameter gradient mismatch")
-    @unittest.skipUnless((torch.cuda.is_available() and _cuda_extension_available), "skipping performance test because CUDA is not available")
     @parameterized.expand(
         [
             # self attention
...
@@ -223,6 +222,7 @@ class TestNeighborhoodAttentionS2(unittest.TestCase):
         ],
         skip_on_empty=True,
     )
+    @unittest.skipUnless((torch.cuda.is_available() and _cuda_extension_available), "skipping performance test because CUDA is not available")
     def test_perf(self, batch_size, channels, heads, in_shape, out_shape, grid_in, grid_out, atol, rtol, verbose=True):
         # extract some parameters
...
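The one-line move matters because of how stacked decorators compose. The commit message does not spell this out, but the likely issue is that decorators apply bottom-up: with @unittest.skipUnless placed above @parameterized.expand, the skip only marks the placeholder that expand leaves behind, while the expanded test_perf_* methods still run on machines without CUDA. Placing the skip directly above def test_perf lets every expanded case inherit it. A minimal, self-contained sketch of the pattern (the class, method, and HAVE_GPU flag are illustrative stand-ins, not taken from torch-harmonics):

import unittest
from parameterized import parameterized

# Illustrative stand-in for "torch.cuda.is_available() and _cuda_extension_available".
HAVE_GPU = False


class ExampleTest(unittest.TestCase):

    # Decorators apply bottom-up: skipUnless wraps the plain function first,
    # then parameterized.expand generates one test per tuple from that wrapped
    # function, so every generated case is skipped when HAVE_GPU is False.
    @parameterized.expand([(1,), (2,)], skip_on_empty=True)
    @unittest.skipUnless(HAVE_GPU, "skipping performance test because CUDA is not available")
    def test_perf_like(self, n):
        self.assertGreater(n, 0)


if __name__ == "__main__":
    unittest.main()

With the order reversed (skip above expand), running this file would execute both generated cases instead of reporting them as skipped, which is the behavior the commit appears to correct.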