Fix dtype propagation (#1141)
Summary: Previously, dtypes were not propagated correctly in composed transforms, resulting in errors when different dtypes were mixed. Specifying a dtype in the constructor did not fix this, nor did specifying the dtype on each composition function invocation (e.g. as a kwarg in `rotate_axis_angle`).

With this change, I also had to modify the default dtype of `RotateAxisAngle` from `torch.float64` to `torch.float32`, matching all other transforms. This was required because the propagation fix broke some tests due to dtype mismatches. The new default dtype in turn broke two tests due to precision changes (calculations previously done in `torch.float64` were now done in `torch.float32`), so I relaxed the precision tolerances, choosing the lowest power of ten that passed the tests.

Pull Request resolved: https://github.com/facebookresearch/pytorch3d/pull/1141

Reviewed By: patricklabatut

Differential Revision: D35192970

Pulled By: bottler

fbshipit-source-id: ba0293e8b3595dfc94b3cf8048e50b7a5e5ed7cf
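The idea behind the fix can be sketched in isolation. This is a minimal, hypothetical stand-in (NumPy instead of PyTorch, and a made-up `Transform` class, not the pytorch3d API): when two transforms with different dtypes are composed, both matrices are promoted to a common dtype before multiplication, so mixed-dtype compositions neither error out nor silently truncate precision.

```python
import numpy as np


class Transform:
    """Illustrative stand-in for a composable transform; not pytorch3d's API."""

    def __init__(self, matrix):
        self.matrix = np.asarray(matrix)

    def compose(self, other):
        # Propagate dtypes: promote both operands to a common dtype
        # before multiplying, analogous to the fix described above.
        dtype = np.result_type(self.matrix.dtype, other.matrix.dtype)
        product = self.matrix.astype(dtype) @ other.matrix.astype(dtype)
        return Transform(product)


a = Transform(np.eye(3, dtype=np.float32))
b = Transform(np.eye(3, dtype=np.float64))
c = a.compose(b)
print(c.matrix.dtype)  # float64 -- the promoted common dtype
```

Without the promotion step, the result's dtype would depend on operand order (or the operation could fail outright in stricter frameworks), which is the class of error this commit addresses.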