- 03 Aug, 2021 1 commit
Hang Zhang authored
Summary:
Pull Request resolved: https://github.com/facebookresearch/d2go/pull/105

Exploring deformable attention in the transformer.

Reviewed By: bichenwu09
Differential Revision: D29093714
fbshipit-source-id: dd691754d9e439661e2eddecb3a1e7cefc8fe568
- 01 Aug, 2021 1 commit
Zhicheng Yan authored
Summary:
Deformable DETR training can be unstable due to iterative box refinement in the transformer decoder. To stabilize training, introduce two changes:
- Remove the unnecessary use of inverse sigmoid. Its use can be avoided entirely when box refinement is turned on.
- In the `DeformableTransformer` class, detach `init_reference_out` before passing it into the decoder to update memory and compute per-decoder-layer reference points.

Reviewed By: zhanghang1989
Differential Revision: D29903599
fbshipit-source-id: a374ba161be0d7bcdfb42553044c4c6700e92623
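The detach described in the second change above can be sketched as follows. This is a minimal illustration of the technique, not the actual d2go code; the tensor shapes and the `reference_head` module are assumptions for the sketch:

```python
import torch

# Minimal sketch (assumed names/shapes, not the actual d2go implementation):
# the initial reference points predicted from the encoder output are
# detached before being handed to the decoder, so iterative box refinement
# does not back-propagate through the initial prediction.
encoder_memory = torch.randn(2, 100, 256)
reference_head = torch.nn.Linear(256, 2)  # hypothetical reference-point head

init_reference_out = reference_head(encoder_memory).sigmoid()
decoder_reference = init_reference_out.detach()  # gradients stop here

# The head still trains through other paths, but not through the decoder's
# per-layer reference-point updates.
```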
- 29 Jul, 2021 1 commit
Hang Zhang authored
Summary:
Add a new backbone. Experimental results: https://fburl.com/7fyecmrc

Reviewed By: bichenwu09
Differential Revision: D26877909
fbshipit-source-id: ba3f97a1e4d84bec22d6a345f1fca06c741010cc
- 08 Jul, 2021 1 commit
Zhicheng Yan authored
Summary:
Pull Request resolved: https://github.com/facebookresearch/d2go/pull/96

In `DETRRunner`, the method `build_optimizer` customized the following logic, which is redundant with the parent-class implementation and can be removed:
- Discounting the LR for certain modules, such as those with name `reference_points`, `backbone`, and `sampling_offsets`.
  - This can be achieved via `SOLVER.LR_MULTIPLIER_OVERWRITE` after we update `get_default_optimizer_params` in `mobile-vision/d2go/d2go/optimizer/build.py`.
- Full-model gradient clipping.
  - This is also implemented in `mobile-vision/d2go/d2go/optimizer/build.py`.

The custom implementation also has a minor issue: it ignores `SOLVER.WEIGHT_DECAY_NORM`, which can set a different weight decay for affine parameters in the norm modules.

Reviewed By: zhanghang1989
Differential Revision: D29420642
fbshipit-source-id: deeb9348c9d282231c540dde6161acedd8e3a119
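The per-module LR discounting delegated to `SOLVER.LR_MULTIPLIER_OVERWRITE` amounts to substring-matched parameter grouping. A minimal sketch, assuming a simple name-contains matching rule; the function name and matching semantics here are illustrative, not the actual `get_default_optimizer_params` implementation:

```python
# Hypothetical sketch of per-module LR overrides via parameter-name matching.
# Not the actual d2go implementation.
def build_param_groups(named_params, base_lr, lr_multiplier_overwrite):
    """Group parameters, scaling the LR for names matching a pattern."""
    groups = []
    for name, param in named_params:
        lr = base_lr
        for pattern, multiplier in lr_multiplier_overwrite.items():
            if pattern in name:
                lr = base_lr * multiplier
        groups.append({"params": [param], "lr": lr, "name": name})
    return groups

# Dummy parameter objects stand in for real tensors.
params = [
    ("backbone.conv1.weight", object()),
    ("decoder.sampling_offsets.weight", object()),
    ("head.weight", object()),
]
groups = build_param_groups(
    params,
    base_lr=1e-4,
    lr_multiplier_overwrite={"backbone": 0.1, "sampling_offsets": 0.1},
)
```

Groups in this shape can be passed directly to a `torch.optim` optimizer, which reads the per-group `lr` key.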
- 02 Jul, 2021 1 commit
Zhicheng Yan authored
Summary:
In D29048363 (https://github.com/facebookresearch/d2go/commit/c480d4e4e213a850cced7758f7b62c20caad8820) we moved the detaching of `reference_points` earlier in the hope of allowing more gradient flow to update weights in `self.bbox_embed`. In this diff, we revert that change because i) it does not improve box AP and ii) it may potentially cause unstable optimization when iterative box refinement is turned on.

Reviewed By: zhanghang1989
Differential Revision: D29530735
fbshipit-source-id: 3217c863343836e129d53e07c0eedb2db8164fe6
- 30 Jun, 2021 1 commit
Zhicheng Yan authored
Summary:
Pull Request resolved: https://github.com/facebookresearch/d2go/pull/97

Major changes:
- Fix a bug within the `inference()` function.
- Refactor code to remove redundancy between `SetCriterion` and `FocalLossSetCriterion`.

Reviewed By: zhanghang1989
Differential Revision: D29481067
fbshipit-source-id: 64788f1ff331177db964eb36d380430799d1d2f2
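For context, the per-element loss behind a `FocalLossSetCriterion`-style criterion is the standard sigmoid focal loss (Lin et al., https://arxiv.org/abs/1708.02002). A scalar sketch of the formula, not the d2go implementation (which operates on batched tensors):

```python
import math

def sigmoid_focal_loss(logit, target, alpha=0.25, gamma=2.0):
    """Scalar sigmoid focal loss: down-weights easy, well-classified examples.

    target is 1 for the positive class, 0 otherwise.
    """
    p = 1.0 / (1.0 + math.exp(-logit))            # predicted probability
    p_t = p if target == 1 else 1.0 - p           # probability of true class
    alpha_t = alpha if target == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

easy = sigmoid_focal_loss(4.0, 1)   # confident and correct -> tiny loss
hard = sigmoid_focal_loss(-4.0, 1)  # confident and wrong -> large loss
```

The `(1 - p_t) ** gamma` modulating factor is what lets such a criterion train against dense, mostly-easy negatives without the loss being dominated by them.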
- 24 Jun, 2021 1 commit
Zhicheng Yan authored
Summary:
Major changes:
- As described in detail in appendix A.4 of the Deformable DETR paper (https://arxiv.org/abs/2010.04159), gradient back-propagation is blocked at inverse_sigmoid(bounding box x/y/w/h from the last decoder layer). This can be implemented by detaching the tensor from the compute graph in PyTorch. However, we currently detach an incorrect tensor, preventing updates to the layers that predict delta x/y/w/h. Fix this bug.
- Add more comments to annotate data types and tensor shapes in the code. This should NOT affect the actual implementation.

Reviewed By: zhanghang1989
Differential Revision: D29048363
fbshipit-source-id: c5b5e89793c86d530b077a7b999769881f441b69
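The `inverse_sigmoid` transform referenced above maps a box coordinate in (0, 1) back to logit space so a predicted delta can be added before re-applying the sigmoid. A minimal scalar sketch of that refinement step, assuming the standard clamped formulation (the real code operates on tensors):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def inverse_sigmoid(x, eps=1e-5):
    """Logit transform with clamping to avoid log(0); scalar sketch."""
    x = min(max(x, eps), 1.0 - eps)
    return math.log(x / (1.0 - x))

# One iterative box-refinement step: a predicted positive delta is added in
# logit space, then mapped back to (0, 1).
box, delta = 0.3, 0.25
refined = sigmoid(inverse_sigmoid(box) + delta)
```

In the fix above, the detach must be applied to the previous layer's box *before* this transform, so gradients still reach the layers that predict `delta`.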
- 20 Jun, 2021 1 commit
Albert Pumarola authored
Summary:
Add create and train unit tests to the OSS runner.

Reviewed By: zhanghang1989
Differential Revision: D29254417
fbshipit-source-id: f7c52b90b2bc7afa83a204895be149664c675e52
- 12 Jun, 2021 1 commit
Zhicheng Yan authored
Summary:
Major changes:
- Add a new runner `EgoDETRRunner` which inherits from the existing `DETRRunner` in the d2go repo (https://github.com/facebookresearch/d2go/commit/62c21f252ad314961cf0157ee8f37cc4f7835e1d).
- Add a new data mapper `EgoDETRDatasetMapper` which has a custom crop transform generator and supports generic data augmentation.

Reviewed By: zhanghang1989
Differential Revision: D28895225
fbshipit-source-id: 4181ff8fce81df22a01d355fdff7e81e83d69e64
- 06 Apr, 2021 1 commit
Hang Zhang authored
Summary:
TorchVision recently upgraded to version 0.10.0, which breaks the version check in DETR.

Reviewed By: wat3rBro
Differential Revision: D27575085
fbshipit-source-id: 75f459fe7a711161e908609fcf2f2d28a01a6c74
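Version checks that compare version strings lexically break exactly at upgrades like 0.9 → 0.10, because "0.10" sorts before "0.9" as a string; a robust check compares numeric tuples instead. A minimal sketch of the general pitfall, independent of the actual check in DETR:

```python
def parse_version(version):
    """Turn '0.10.0' into (0, 10, 0) for correct numeric comparison.

    Local build suffixes (e.g. '0.10.0+cu111') are stripped; non-numeric
    components are skipped in this simplified sketch.
    """
    return tuple(
        int(part)
        for part in version.split("+")[0].split(".")
        if part.isdigit()
    )

# Lexical string comparison gets the ordering wrong:
string_says_older = "0.10.0" < "0.9.0"          # True, incorrectly
# Numeric tuple comparison gets it right:
tuple_says_newer = parse_version("0.10.0") > parse_version("0.9.0")
```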
- 03 Mar, 2021 1 commit
facebook-github-bot authored
fbshipit-source-id: f4a8ba78691d8cf46e003ef0bd2e95f170932778