- 25 Nov, 2022 1 commit
-
-
David Novotny authored
Summary: Addresses the following issue: https://github.com/facebookresearch/pytorch3d/issues/1345#issuecomment-1272881244 I.e., when installed from conda, `pytorch3d_implicitron_visualizer` crashes since it invokes `main()` while `main` requires a single positional arg `argv`. Reviewed By: shapovalov Differential Revision: D41533497 fbshipit-source-id: e53a923eb8b2f0f9c0e92e9c0866d9cb310c4799
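One way this kind of crash is typically fixed (a hypothetical sketch, not the actual diff: either pass sys.argv from the console entry point or give `argv` a default):
```
import sys

def main(argv):
    # parse argv and launch the visualizer ...
    ...

def entry_point():
    # The console script must pass command-line arguments explicitly,
    # since main() takes a required positional argument.
    main(sys.argv[1:])
```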
-
- 08 Nov, 2022 1 commit
-
-
Roman Shapovalov authored
Summary: Enum fields cause the following to crash since they are loaded as strings:
```
config = OmegaConf.load(autodumped_cfg_file)
Experiment(**config)
```
It would be good to come up with a general solution, but for now just fixing the visualisation script. Reviewed By: bottler Differential Revision: D41140426 fbshipit-source-id: 71c1c6b1fffe3b5ab1ca0114cfa3f0d81160278f
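A hedged sketch of the workaround pattern for a single field (the enum class and helper are hypothetical, not the visualisation-script code):
```
from enum import Enum

class EvaluationMode(Enum):  # hypothetical enum field, for illustration only
    TRAINING = "training"
    EVALUATION = "evaluation"

def coerce_enum(value, enum_cls):
    """Turn a YAML round-tripped string back into the matching enum member."""
    if isinstance(value, enum_cls):
        return value
    try:
        return enum_cls[value]   # match by member name, e.g. "EVALUATION"
    except KeyError:
        return enum_cls(value)   # fall back to matching by value, e.g. "evaluation"
```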
-
- 07 Nov, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Allow a module's param_group member to specify overrides to the param groups of its members or their members. Also adds logging for param group assignments. This allows defining `params.basis_matrix` in the param_groups of a voxel_grid. Reviewed By: shapovalov Differential Revision: D41080667 fbshipit-source-id: 49f3b0e5b36e496f78701db0699cbb8a7e20c51e
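For instance, such an override might look like the following (a sketch based on the description above; the group names are made up):
```
# Hypothetical param_groups dict on a voxel_grid-owning module: the nested
# parameter "params.basis_matrix" is routed to its own group, while everything
# else inherits the module-level group.
param_groups = {"self": "voxel_grid", "params.basis_matrix": "basis"}
```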
-
- 02 Nov, 2022 1 commit
-
-
David Novotny authored
Summary: Allows loading of multiple categories, provided as a comma-separated list of category names. Reviewed By: bottler, shapovalov Differential Revision: D40803297 fbshipit-source-id: 863938be3aa6ffefe9e563aede4a2e9e66aeeaa8
-
- 31 Oct, 2022 1 commit
-
-
David Novotny authored
Summary: see title Reviewed By: shapovalov Differential Revision: D40803670 fbshipit-source-id: 211189167837af577d6502a698e2f3fb3aec3e30
-
- 23 Oct, 2022 3 commits
-
-
Jeremy Reizenstein authored
Summary: Yaml bool case fix Reviewed By: shapovalov Differential Revision: D40623031 fbshipit-source-id: 29b2fba171c2cbebfa03834e38b614d07275c997
-
Jeremy Reizenstein authored
Reviewed By: shapovalov Differential Revision: D40622304 fbshipit-source-id: 277515a55c46d9b8300058b439526539a7fe00a0
-
Jeremy Reizenstein authored
Summary: Add option to flat-pad the last delta. Might help when training on RGB only. Reviewed By: shapovalov Differential Revision: D40587475 fbshipit-source-id: c763fa38948600ea532c730538dc4ff29d2c3e0a
-
- 22 Oct, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Make Implicitron run without visdom installed. Reviewed By: shapovalov Differential Revision: D40587974 fbshipit-source-id: dc319596c7a4d10a4c54c556dabc89ad9d25c2fb
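The usual pattern for making a dependency optional looks roughly like this (a sketch, not the exact code in this diff; the helper name is made up):
```
try:
    import visdom
except ImportError:
    visdom = None  # visualisation features are disabled when visdom is absent

def get_visdom_connection(server: str = "http://localhost", port: int = 8097):
    if visdom is None:
        return None
    return visdom.Visdom(server=server, port=port)
```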
-
- 18 Oct, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Adds the ability to have different learning rates for different parts of the model. The trainable parts of Implicitron have a new member `param_groups`: a dictionary whose keys are names of individual parameters or of a module's members, and whose values are the parameter group the parameter/member will be sorted into. The "self" key denotes the parameter group at the module level. Keys, including "self", do not have to be defined. By default all parameters are put into the "default" parameter group and have the learning rate defined in the optimizer; this can be overridden at the:
- module level, with the "self" key: all of the module's parameters and its child modules' parameters are put into that parameter group;
- member level, which is equivalent to the member's own `param_groups` having key="self" with that parameter group as its value. This is useful for members that do not have `param_groups`, for example torch.nn.Linear;
- parameter level: the parameter with the same name as the key is put into that parameter group.
In the optimizer factory, parameters and their learning rates are then gathered recursively. Reviewed By: shapovalov Differential Revision: D40145802 fbshipit-source-id: 631c02b8d79ee1c0eb4c31e6e42dbd3d2882078a
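A minimal sketch of the idea (the class, helper, and group names are illustrative, not Implicitron's actual implementation):
```
import torch

class MyDecoder(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.trunk = torch.nn.Linear(8, 8)
        self.color_head = torch.nn.Linear(8, 3)
        # "self" sets the module-level group; "color_head" overrides it for
        # that member; anything unlisted would fall back to "default".
        self.param_groups = {"self": "decoder", "color_head": "color"}

def gather_param_groups(module, inherited="default", out=None):
    """Recursively sort parameters into named groups."""
    out = {} if out is None else out
    overrides = getattr(module, "param_groups", {})
    group = overrides.get("self", inherited)
    for name, param in module.named_parameters(recurse=False):
        out.setdefault(overrides.get(name, group), []).append(param)
    for child_name, child in module.named_children():
        gather_param_groups(child, overrides.get(child_name, group), out)
    return out

groups = gather_param_groups(MyDecoder())
# e.g. {"decoder": [trunk.weight, trunk.bias], "color": [color_head.weight, ...]}
lrs = {"color": 1e-2}  # hypothetical per-group learning rates
optimizer = torch.optim.Adam(
    [{"params": p, "lr": lrs.get(name, 1e-3)} for name, p in groups.items()]
)
```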
-
- 03 Oct, 2022 2 commits
-
-
Darijan Gudelj authored
Summary: Loads the whole dataset, moves it to the device, and sends it for sampling, to enable full-dataset heterogeneous raysampling. Reviewed By: bottler Differential Revision: D39263009 fbshipit-source-id: c527537dfc5f50116849656c9e171e868f6845b1
-
Darijan Gudelj authored
Summary: Changed ray_sampler and metrics to be able to use mixed-frame raysampling. Ray_sampler now has a new member which it passes to the pytorch3d raysampler. If the raybundle is heterogeneous, metrics now samples images by padding xys first. This reduces memory consumption. Reviewed By: bottler, kjchalup Differential Revision: D39542221 fbshipit-source-id: a6fec23838d3049ae5c2fd2e1f641c46c7c927e3
-
- 22 Sep, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Allow using the new `foreach` option on optimizers. Reviewed By: shapovalov Differential Revision: D39694843 fbshipit-source-id: 97109c245b669bc6edff0f246893f95b7ae71f90
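For reference, this is what the option looks like on a plain PyTorch optimizer (a sketch outside Implicitron's config system):
```
import torch

model = torch.nn.Linear(4, 4)
# Recent PyTorch versions accept `foreach=True` on the built-in optimizers to
# use the multi-tensor implementation of the update step.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, foreach=True)
```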
-
- 08 Sep, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Various fixes to get visualize_reconstruction running, and an interactive test for it. Reviewed By: kjchalup Differential Revision: D39286691 fbshipit-source-id: 88735034cc01736b24735bcb024577e6ab7ed336
-
- 07 Sep, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: Workaround for oddity with new hydra. Reviewed By: davnov134 Differential Revision: D39280639 fbshipit-source-id: 76e91947f633589945446db93cf2dbc259642f8a
-
- 06 Sep, 2022 1 commit
-
-
David Novotny authored
Summary: Move the flyaround rendering function into core Implicitron. This unblocks an example in the facebookresearch/co3d repo. Reviewed By: bottler Differential Revision: D39257801 fbshipit-source-id: 6841a88a43d4aa364dd86ba83ca2d4c3cf0435a4
-
- 01 Sep, 2022 1 commit
-
-
Pyre Bot Jr authored
Reviewed By: kjchalup Differential Revision: D39198333 fbshipit-source-id: 3f4ebcf625215f21d165073837578ff69b05f72d
-
- 31 Aug, 2022 1 commit
-
-
Sergii Dymchenko authored
Summary: torch.symeig has been deprecated for a long time and is being removed by https://github.com/pytorch/pytorch/pull/70988. Created from CodeHub with https://fburl.com/edit-in-codehub Reviewed By: bottler Differential Revision: D39153103 fbshipit-source-id: 3a1397b6d86fb3e45e4777e06a4da3ee76591b32
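The migration pattern, shown on a generic symmetric matrix (not the exact call site changed in this diff):
```
import torch

A = torch.randn(5, 5)
A = A @ A.T  # symmetric matrix

# Old, removed API:
#   eigenvalues, eigenvectors = torch.symeig(A, eigenvectors=True)
# Replacement (eigenvalues returned in ascending order):
eigenvalues, eigenvectors = torch.linalg.eigh(A)
```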
-
- 30 Aug, 2022 1 commit
-
-
David Novotny authored
Summary: Adds yaml configs to train selected methods on CO3Dv2. A few more updates: 1) moved some fields to base classes so that we can check is_multisequence in experiment.py; 2) skip loading all train cameras for multisequence datasets (without this, co3d-fewview is untrainable); 3) fix a bug in the json index dataset provider v2. Reviewed By: kjchalup Differential Revision: D38952755 fbshipit-source-id: 3edac6fc8e20775aa70400bd73a0e6d52b091e0c
-
- 19 Aug, 2022 1 commit
-
-
David Novotny authored
Summary: Fixes the blender synthetic configs. Reviewed By: kjchalup Differential Revision: D38786095 fbshipit-source-id: 6d0784ced41a3f2904f074221108cdb56bd20e7f
-
- 18 Aug, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: generic_model_args no longer exists. Update some references to it, mostly in docs. This fixes the testing of all the yaml files in test_forward_pass. Reviewed By: shapovalov Differential Revision: D38789202 fbshipit-source-id: f11417efe772d7f86368b3598aa66c52b1309dbf
-
- 15 Aug, 2022 1 commit
-
-
David Novotny authored
Summary: Adds additional source views to the eval batches for evaluating many-view models in the CO3D Challenge. Reviewed By: bottler Differential Revision: D38705904 fbshipit-source-id: cf7d00dc7db926fbd1656dd97a729674e9ff5adb
-
- 11 Aug, 2022 1 commit
-
-
Luca Di Grazia authored
Summary: **"filename"**: "projects/nerf/nerf/implicit_function.py" **"warning_type"**: "Incompatible variable type [9]", **"warning_message"**: " input_skips is declared to have type `Tuple[int]` but is used as type `Tuple[]`.", **"warning_line"**: 256, **"fix"**: input_skips: Tuple[int,...] = () Pull Request resolved: https://github.com/facebookresearch/pytorch3d/pull/1288 Reviewed By: kjchalup Differential Revision: D38615188 Pulled By: bottler fbshipit-source-id: a014344dd6cf2125f564f948a3c905ceb84cf994
-
- 10 Aug, 2022 3 commits
-
-
Jeremy Reizenstein authored
Summary: add link in main readme Reviewed By: kjchalup Differential Revision: D38560053 fbshipit-source-id: 0814febb67d0580394cfa2664e49e31ff7254bd4
-
Jeremy Reizenstein authored
Summary: Updates for recent replaceables. Reviewed By: kjchalup Differential Revision: D38437370 fbshipit-source-id: 00d600aa451e5849ba48107cd7a4319e9fc8549f
-
Jeremy Reizenstein authored
Summary: Linear followed by exponential LR progression. Needed for making Blender scenes converge. Reviewed By: kjchalup Differential Revision: D38557007 fbshipit-source-id: ad630dbc5b8fabcb33eeb5bdeed5e4f31360bac2
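A hand-rolled sketch of such a schedule with a plain LambdaLR (the warmup length and decay rate are made up, and this is not necessarily how Implicitron's scheduler is configured):
```
import torch

def linear_then_exponential(step, warmup_steps=1000, gamma=0.999):
    """Linear ramp-up for warmup_steps, then exponential decay."""
    if step < warmup_steps:
        return (step + 1) / warmup_steps
    return gamma ** (step - warmup_steps)

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=linear_then_exponential)
```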
-
- 09 Aug, 2022 1 commit
-
-
Krzysztof Chalupka authored
Summary: LLFF (and most/all non-synthetic datasets) will have no background/foreground distinction. Add support for data with no fg mask. Also, we had a bug in stats loading, like this:
* Load stats.
* One of the stats has a history of length 0.
* That's fine, e.g. maybe it's fg_error but the dataset has no notion of fg/bg, so leave it at length 0.
* Check whether all the stats have the same history length as an arbitrarily chosen "reference-stat".
* Oops, the reference-stat happened to be the stat with length 0.
* assert (legit_stat_len == reference_stat_len (=0)) ---> failed assert.
Also some minor fixes (from Jeremy's other diff) to support LLFF. Reviewed By: davnov134 Differential Revision: D38475272 fbshipit-source-id: 5b35ac86d1d5239759f537621f41a3aa4eb3bd68
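The gist of the fix, on made-up stat histories (illustrative only): take the reference length from stats with non-empty histories before checking consistency.
```
histories = {"loss": [0.9, 0.7], "psnr": [20.1, 21.4], "fg_error": []}

non_empty_lengths = {len(h) for h in histories.values() if len(h) > 0}
assert len(non_empty_lengths) <= 1, "stats have inconsistent history lengths"
```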
-
- 05 Aug, 2022 1 commit
-
-
Jeremy Reizenstein authored
Summary: remove n_instances==0 special case, standardise args for GlobalEncoderBase's forward. Reviewed By: shapovalov Differential Revision: D37817340 fbshipit-source-id: 0aac5fbc7c336d09be9d412cffff5712bda27290
-
- 03 Aug, 2022 3 commits
-
-
Jeremy Reizenstein authored
Summary: continued - avoid duplicate inputs Reviewed By: davnov134 Differential Revision: D38248827 fbshipit-source-id: 91ed398e304496a936f66e7a70ab3d189eeb5c70
-
Jeremy Reizenstein authored
Summary: continued - don't duplicate inputs Reviewed By: kjchalup Differential Revision: D38248829 fbshipit-source-id: 2d56418ecbec9cc597c3cf0c122199e274661516
-
Jeremy Reizenstein authored
Summary: Don't copy from one part of config to another, rather do the copy within GenericModel. Reviewed By: davnov134 Differential Revision: D38248828 fbshipit-source-id: ff8af985c37ea1f7df9e0aa0a45a58df34c3f893
-
- 02 Aug, 2022 7 commits
-
-
David Novotny authored
Summary: Stats are logically connected to the training loop, not to the model; hence, moving them to the training loop. Also removing resume_epoch from OptimizerFactory in favor of a single place, ModelFactory. This removes the need for config consistency checks etc. Reviewed By: kjchalup Differential Revision: D38313475 fbshipit-source-id: a1d188a63e28459df381ff98ad8acdcdb14887b7
-
Krzysztof Chalupka authored
Summary: Blender data doesn't have depths or crops. Reviewed By: shapovalov Differential Revision: D38345583 fbshipit-source-id: a19300daf666bbfd799d0038aeefa14641c559d7
-
Jeremy Reizenstein authored
Summary: Simple DataLoaderMapProvider instance Reviewed By: davnov134 Differential Revision: D38326719 fbshipit-source-id: 58556833e76fae5790d25a59bea0aac4ce046bf1
-
Krzysztof Chalupka authored
Summary: Before this diff, train_stats.py would not be created by default, EXCEPT when resuming training. This makes it appear from the start. Reviewed By: shapovalov Differential Revision: D38320341 fbshipit-source-id: 8ea5b99ec81c377ae129f58e78dc2eaff94821ad
-
Jeremy Reizenstein authored
Summary: Remove the dataset's need to provide the task type. Reviewed By: davnov134, kjchalup Differential Revision: D38314000 fbshipit-source-id: 3805d885b5d4528abdc78c0da03247edb9abf3f7
-
Darijan Gudelj authored
Summary: Added _NEED_CONTROL to JsonIndexDatasetMapProviderV2 and made dataset_tweak_args use it. Reviewed By: bottler Differential Revision: D38313914 fbshipit-source-id: 529847571065dfba995b609a66737bd91e002cfe
-
Jeremy Reizenstein authored
Summary: Only import it if you ask for it. Reviewed By: kjchalup Differential Revision: D38327167 fbshipit-source-id: 3f05231f26eda582a63afc71b669996342b0c6f9
-
- 01 Aug, 2022 2 commits
-
-
David Novotny authored
Summary: Currently, seeds are set only inside the train loop. But this does not ensure that the model weights are initialized the same way everywhere, which makes all experiments irreproducible. This diff fixes it. Reviewed By: bottler Differential Revision: D38315840 fbshipit-source-id: 3d2ecebbc36072c2b68dd3cd8c5e30708e7dd808
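A minimal seeding helper of the kind this implies (illustrative, not Implicitron's actual code), called before the model is built:
```
import random

import numpy as np
import torch

def seed_all(seed: int) -> None:
    """Seed the common RNGs so that model weight initialization is reproducible."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)

seed_all(42)
# model = build_model(cfg)  # hypothetical; weights now initialize identically across runs
```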
-
Jeremy Reizenstein authored
Summary: Make a dummy single-scene dataset using the code from generate_cow_renders (used in existing NeRF tutorials) Reviewed By: kjchalup Differential Revision: D38116910 fbshipit-source-id: 8db6df7098aa221c81d392e5cd21b0e67f65bd70
-