Commit 0799bc08 authored by limm

support v2.1.0

parent 50e05e1e
version: 2

build:
  image: latest

python:
  version: 3.8
  system_packages: true
  install:
    - requirements: docs/requirements.txt
    - method: setuptools
      path: .

formats: []
[metadata]
long_description=file: README.md
long_description_content_type=text/markdown
classifiers =
    Development Status :: 5 - Production/Stable
    License :: OSI Approved :: MIT License
    Programming Language :: Python
    Programming Language :: Python :: 3.7
    Programming Language :: Python :: 3.8
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3 :: Only

[aliases]
test = pytest

[tool:pytest]
addopts = --capture=no

[egg_info]
tag_build =
tag_date = 0
import torch
from torch_scatter import scatter_logsumexp


def test_logsumexp():
    inputs = torch.tensor([
        0.5, 0.5, 0.0, -2.1, 3.2, 7.0, -1.0, -100.0,
        float('-inf'),
        float('-inf'), 0.0
    ])
    inputs.requires_grad_()
    index = torch.tensor([0, 0, 1, 1, 1, 2, 4, 4, 5, 6, 6])
    splits = [2, 3, 1, 0, 2, 1, 2]

    outputs = scatter_logsumexp(inputs, index)

    # Compare each group against a plain torch.logsumexp over that group.
    for src, out in zip(inputs.split(splits), outputs.unbind()):
        assert out.tolist() == torch.logsumexp(src, dim=0).tolist()

    outputs.backward(torch.randn_like(outputs))

    # The op must also work under TorchScript.
    jit = torch.jit.script(scatter_logsumexp)
    assert jit(inputs, index).tolist() == outputs.tolist()
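As an aside (not part of this commit), a minimal sketch of what scatter_logsumexp computes, assuming torch and torch_scatter are importable; the tensors here are made up for illustration:

import torch
from torch_scatter import scatter_logsumexp

# Hypothetical example: three source values scattered into two groups.
# Each output slot holds the logsumexp over the values mapped to it.
src = torch.tensor([0.5, 0.5, 1.0])
index = torch.tensor([0, 0, 1])
out = scatter_logsumexp(src, index)
assert torch.allclose(out[0], torch.logsumexp(torch.tensor([0.5, 0.5]), dim=0))
assert torch.allclose(out[1], torch.logsumexp(torch.tensor([1.0]), dim=0))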
import torch
from torch_scatter import scatter_log_softmax, scatter_softmax


def test_softmax():
    src = torch.tensor([0.2, 0, 0.2, -2.1, 3.2, 7, -1, float('-inf')])
    src.requires_grad_()
    index = torch.tensor([0, 1, 0, 1, 1, 2, 4, 4])

    out = scatter_softmax(src, index)

    # Reference values: plain softmax over each index group.
    out0 = torch.softmax(torch.tensor([0.2, 0.2]), dim=-1)
    out1 = torch.softmax(torch.tensor([0, -2.1, 3.2]), dim=-1)
    out2 = torch.softmax(torch.tensor([7], dtype=torch.float), dim=-1)
    out4 = torch.softmax(torch.tensor([-1, float('-inf')]), dim=-1)

    expected = torch.stack([
        out0[0], out1[0], out0[1], out1[1], out1[2], out2[0], out4[0], out4[1]
    ], dim=0)

    assert torch.allclose(out, expected)

    out.backward(torch.randn_like(out))

    jit = torch.jit.script(scatter_softmax)
    assert jit(src, index).tolist() == out.tolist()


def test_log_softmax():
    src = torch.tensor([0.2, 0, 0.2, -2.1, 3.2, 7, -1, float('-inf')])
    src.requires_grad_()
    index = torch.tensor([0, 1, 0, 1, 1, 2, 4, 4])

    out = scatter_log_softmax(src, index)

    # Reference values: plain log_softmax over each index group.
    out0 = torch.log_softmax(torch.tensor([0.2, 0.2]), dim=-1)
    out1 = torch.log_softmax(torch.tensor([0, -2.1, 3.2]), dim=-1)
    out2 = torch.log_softmax(torch.tensor([7], dtype=torch.float), dim=-1)
    out4 = torch.log_softmax(torch.tensor([-1, float('-inf')]), dim=-1)

    expected = torch.stack([
        out0[0], out1[0], out0[1], out1[1], out1[2], out2[0], out4[0], out4[1]
    ], dim=0)

    assert torch.allclose(out, expected)

    out.backward(torch.randn_like(out))

    jit = torch.jit.script(scatter_log_softmax)
    assert jit(src, index).tolist() == out.tolist()
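Likewise, only as an illustration and not part of the commit: scatter_softmax normalizes within each index group, so the per-group sums come out as one. A quick sketch, assuming torch_scatter's scatter_add, with made-up tensors:

import torch
from torch_scatter import scatter_add, scatter_softmax

# Hypothetical example: softmax is computed independently per group,
# so summing the result back per group yields ones.
src = torch.tensor([0.2, 0.0, 0.2, -2.1, 3.2])
index = torch.tensor([0, 1, 0, 1, 1])
out = scatter_softmax(src, index)
group_sums = scatter_add(out, index)
assert torch.allclose(group_sums, torch.ones(2))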
[test]
pytest
pytest-cov