OpenDAS / torch-scatter
Commit 8e0a7b60, authored Apr 23, 2020 by rusty1s
Parent: dc82ebfb

    clean up [skip ci]

3 changed files with 6 additions and 5 deletions (+6 −5)
Files changed:
  README.md                  +2 −2
  torch_scatter/__init__.py  +0 −1
  torch_scatter/scatter.py   +4 −2
README.md

@@ -40,10 +40,10 @@ All included operations are broadcastable, work on varying data types, are imple
 ### Binaries
 
-#### PyTorch 1.5.0
-
 We provide pip wheels for all major OS/PyTorch/CUDA combinations, see
 [here](https://pytorch-geometric.com/whl).
 
+#### PyTorch 1.5.0
+
 To install the binaries for PyTorch 1.5.0, simply run
...
torch_scatter/__init__.py

@@ -5,7 +5,6 @@ import os.path as osp
 import torch
 
 __version__ = '2.0.4'
-expected_torch_version = (1, 4)
 
 try:
     for library in ['_version', '_scatter', '_segment_csr', '_segment_coo']:
...
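The removed `expected_torch_version = (1, 4)` tuple presumably backed a minimum-version check against `torch.__version__` in a part of the file not shown in this diff. As a hypothetical sketch of how such a tuple-based check typically works (the helper names and parsing logic below are assumptions, not code from the repository):

```python
def parse_major_minor(version):
    # '1.5.0' or '1.5.0+cu101' -> (1, 5); hypothetical helper, not from the repo.
    return tuple(int(part) for part in version.split('+')[0].split('.')[:2])

def check_torch_version(version, expected=(1, 4)):
    # Compares the installed version against a minimum, the way a tuple like
    # the removed `expected_torch_version` would be used. Tuple comparison is
    # lexicographic, so (1, 5) >= (1, 4) but (1, 3) < (1, 4).
    if parse_major_minor(version) < expected:
        raise RuntimeError(
            f'Expected PyTorch >= {expected[0]}.{expected[1]}, got {version}')

check_torch_version('1.5.0+cu101')  # passes for PyTorch 1.5.0
```

Dropping the tuple is consistent with the prebuilt-wheel distribution model in the README change above: the wheel filename already pins the PyTorch version, so a runtime tuple check becomes redundant.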
torch_scatter/scatter.py

@@ -50,9 +50,11 @@ def scatter_mean(src: torch.Tensor, index: torch.Tensor, dim: int = -1,
     count.clamp_(1)
     count = broadcast(count, out, dim)
     if torch.is_floating_point(out):
-        out.true_divide_(count)
+        out.div_(count)
+        # out.true_divide_(count)
     else:
-        out.floor_divide_(count)
+        out.div_(count)
+        # out.floor_divide_(count)
     return out
...
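The hunk above swaps `true_divide_`/`floor_divide_` for `div_`, keeping the old calls as comments. For intuition about what this step computes, here is a minimal pure-Python sketch of `scatter_mean` semantics on 1-D inputs, a hypothetical reference only, not the library implementation: sum each bucket, clamp counts to at least 1 (mirroring `count.clamp_(1)`), then divide, with true division for floats and floor division for integers as in the replaced calls.

```python
def scatter_mean_ref(src, index):
    # Hypothetical 1-D reference for scatter_mean semantics (not library code).
    size = max(index) + 1
    out = [0] * size
    count = [0] * size
    for value, bucket in zip(src, index):
        out[bucket] += value      # scatter-add into the target bucket
        count[bucket] += 1        # how many elements landed in each bucket
    # Mirrors count.clamp_(1): empty buckets divide by 1, not 0.
    count = [max(c, 1) for c in count]
    if any(isinstance(v, float) for v in src):
        return [o / c for o, c in zip(out, count)]   # true division (floats)
    return [o // c for o, c in zip(out, count)]      # floor division (ints)

print(scatter_mean_ref([1.0, 3.0, 5.0], [0, 0, 1]))  # -> [2.0, 5.0]
```

Bucket 0 receives 1.0 and 3.0 (mean 2.0), bucket 1 receives only 5.0, which is why the float/int branch matters: integer inputs would yield truncated bucket means.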