- 28 Nov, 2021 1 commit
-
-
Hang Zhang authored
Summary: Experimental models from Xiaoliang [D31749820]. Pretrained weights: fbnet_vit_tiny_v3_lepe: n/a; fbnet_deit_v0: f298782311. Reviewed By: XiaoliangDai Differential Revision: D32054949 fbshipit-source-id: 7c2aa0679a545ed814ba1db421408a5f9a59a2c8
-
- 25 Nov, 2021 1 commit
-
-
Yuxin Wu authored
Summary: make it an option Differential Revision: D32601981 fbshipit-source-id: 308a0c49939531d840914aa8e256aae6db463929
-
- 21 Nov, 2021 1 commit
-
-
Hang Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/mobile-vision/pull/56 Reviewed By: ppwwyyxx Differential Revision: D32576986 fbshipit-source-id: 1b20d1927a36ac80e33b51ff971b54767f647d43
-
- 20 Nov, 2021 1 commit
-
-
Haroun Habeeb authored
Summary: For synthetic data, we want to enable having different transforms for different dataloaders. To do that, we need to be able to construct different kinds of transforms, which means that using the cfg's hard-coded location isn't convenient: we'd have to edit the cfg at runtime and call the build function multiple times Differential Revision: D32486576 fbshipit-source-id: 767b63c5c787e31a67dbf8710ab9bab84a0651db
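A minimal sketch of the pattern this enables, assuming a hypothetical `build_transforms` helper (not the actual d2go API): transform specs are passed in explicitly, so each dataloader can be built with its own pipeline instead of editing the cfg at runtime.

```python
# Hypothetical helper: takes transform specs as an argument rather than
# reading them from a hard-coded cfg location.
def build_transforms(transform_cfgs):
    """transform_cfgs: list of (name, kwargs) pairs."""
    registry = {
        # stand-ins for real transform constructors
        "RandomFlip": lambda **kw: ("flip", kw),
        "ResizeShortestEdge": lambda **kw: ("resize", kw),
    }
    return [registry[name](**kwargs) for name, kwargs in transform_cfgs]

# Two dataloaders, two different pipelines -- no runtime cfg edits needed.
train_tfms = build_transforms([("RandomFlip", {}), ("ResizeShortestEdge", {"short_edge": 800})])
synth_tfms = build_transforms([("ResizeShortestEdge", {"short_edge": 512})])
```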
-
- 18 Nov, 2021 1 commit
-
-
Ananth Subramaniam authored
Summary: ### New commit log messages fa0ed17f8 remove deprecated train_loop (#10482) Reviewed By: kandluis Differential Revision: D32454980 fbshipit-source-id: a35237dde06cc9ddac5373b75992ce88a6771c76
-
- 12 Nov, 2021 1 commit
-
-
Yanghan Wang authored
Reviewed By: newstzpz Differential Revision: D32301322 fbshipit-source-id: a9e951b9de600012125b8b94c0c1ace929b491b8
-
- 09 Nov, 2021 4 commits
-
-
Sam Tsai authored
Summary: The fvcore flops calculator throws this error: KeyError: 'Only support flattening dictionaries if keys are str.' Setting flops to some value so it doesn't enter pdb mode. Reviewed By: stephenyan1231 Differential Revision: D32144492 fbshipit-source-id: 604cd4660cea9ffbfb3f1da35d32e06ccf607a50
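A sketch of the guarding pattern described, assuming the failure surfaces from fvcore's `FlopCountAnalysis` (the exact call site in the diff may differ):

```python
from fvcore.nn import FlopCountAnalysis

def safe_flops(model, inputs):
    try:
        return FlopCountAnalysis(model, inputs).total()
    except KeyError:
        # fvcore can only flatten dict outputs whose keys are str;
        # fall back to a sentinel instead of dropping into pdb.
        return -1
```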
-
Yuxin Wu authored
Reviewed By: newstzpz Differential Revision: D31209906 fbshipit-source-id: 0be4e3c1db623e3c1fba8ba4259840d34192a77e
-
Albert Pumarola authored
Summary: Extended Pix2Pix to allow for extra input data Reviewed By: tax313 Differential Revision: D31469054 fbshipit-source-id: 790543f214ea9fa0158e509acb27193916bf17ce
-
CodemodService Bot authored
Reviewed By: zertosh Differential Revision: D32270982 fbshipit-source-id: 8767b469fe5404a882257c0c5209b34ed0c327dc
-
- 08 Nov, 2021 3 commits
-
-
Yanghan Wang authored
Summary: This code was kept for the short term to support loading old training jobs from the period when the default config was polluted; now it should be safe to remove this BC support and dead code Differential Revision: D32218217 fbshipit-source-id: 3772477653151ccbcb4ae7098b9414853b581ad1
-
Yanghan Wang authored
Reviewed By: sstsai-adl Differential Revision: D32216605 fbshipit-source-id: bebee1edae85e940c7dcc6a64dbe341a2fde36a2
-
Tim Hatch authored
Reviewed By: jreese, ppwwyyxx Differential Revision: D32191010 fbshipit-source-id: 1e40b7a090be3a0e25b930fb908ec177719fce50
-
- 04 Nov, 2021 1 commit
-
-
Tsahi Glik authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/134 Update the `all_steps_qat` example config to use the learnable QAT method, and add logic in `GeneralizedRCNNPatch.prepare_for_quant` to call the new `d2go.utils.qat_utils.get_qat_qconfig` to properly support QAT in the D2Go training workflow Differential Revision: D32147216 fbshipit-source-id: 32831c6156bc5c0775196ad8edc890a5292d204f
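A hedged sketch of the described hook; the class, method, and helper names come from the summary, while the signature and body are assumptions:

```python
from d2go.utils.qat_utils import get_qat_qconfig

class GeneralizedRCNNPatch:
    def prepare_for_quant(self, cfg):
        # Returns a qconfig matching cfg.QUANTIZATION.QAT.FAKE_QUANT_METHOD
        # ("default" or "learnable"); exact signature assumed here.
        self.qconfig = get_qat_qconfig(cfg)
        return self
```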
-
- 29 Oct, 2021 1 commit
-
-
Owen Wang authored
Summary: Allow reading `.npy`-format binary masks shaped (H, W) in addition to `.png` image masks shaped (H, W, C). Reviewed By: wat3rBro Differential Revision: D30136542 fbshipit-source-id: 56df5a766ab15b6808a1327815857e5d38eac910
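An illustrative loader (not the exact d2go code) for the two mask formats:

```python
import numpy as np
from PIL import Image

def load_mask(path):
    if path.endswith(".npy"):
        return np.load(path)            # (H, W) binary mask
    img = np.asarray(Image.open(path))  # (H, W, C) image mask
    return img[..., 0]                  # assumption: mask lives in one channel
```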
-
- 28 Oct, 2021 1 commit
-
-
Kai Zhang authored
Summary: In the quantization callback, we prepare the model with the FX quantization API and only use the prepared model in training. However, when training with DDP, the parameters in the original model still require grad, causing an unused-parameters RuntimeError. Previously, the Lightning trainer trained the model with the find_unused_parameters flag, but if users manually disable it, they get the runtime error. In this diff, the parameters in the original model are frozen. We could consider deleting the original model after preparation to save memory, but we might have to make some assumptions about the Lightning module structure, for example that `.model` is the original model, so that we could `delattr(pl_module, "model")`. Reviewed By: wat3rBro Differential Revision: D31902368 fbshipit-source-id: 56eabb6b2296278529dd2b94d6aa4c9ec9e9ca6b
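A minimal sketch of the fix, assuming the Lightning module keeps the unprepared model at `.model` (as the summary's side note suggests):

```python
def freeze_original_model(pl_module):
    # Only the FX-prepared copy is trained; freezing the original's
    # parameters keeps DDP from raising the unused-parameters RuntimeError.
    for p in pl_module.model.parameters():
        p.requires_grad = False
```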
-
- 26 Oct, 2021 4 commits
-
-
Yanghan Wang authored
Summary: as title Reviewed By: Cysu Differential Revision: D31901433 fbshipit-source-id: 1749527c04c392c830e1a49bca8313ddf903d7b1
-
Yanghan Wang authored
Summary: FCOS is registered only as a side effect of an import in `get_default_cfg`; if users don't call it (e.g. when using their own runner), they might find that the meta-arch is not registered. Reviewed By: ppwwyyxx Differential Revision: D31920026 fbshipit-source-id: 59eeeb3d1bf30d6b08463c2814930b1cadd7d549
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/130 We want to make sure that after importing `d2go.modeling` all the meta-archs are registered. Reviewed By: Maninae Differential Revision: D31904303 fbshipit-source-id: 3f32b65b764b2458e2fb9c4e0bbd99824b37ecfc
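A generic sketch of the register-on-import pattern being enforced (the registry and names are illustrative, not the actual d2go objects):

```python
META_ARCH_REGISTRY = {}

def register(cls):
    META_ARCH_REGISTRY[cls.__name__] = cls
    return cls

# The decorator runs at import time, so having d2go.modeling's __init__
# import this module is enough to guarantee registration.
@register
class FCOS:
    pass
```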
-
Binh Tang authored
Summary: ### New commit log messages 1f7bd6650 Mark accelerator connector as protected (#10032) Reviewed By: yifuwang Differential Revision: D31905981 fbshipit-source-id: a7f0f03033b02b603d28203ae2c8e8df4933fb23
-
- 22 Oct, 2021 3 commits
-
-
Yanghan Wang authored
Reviewed By: sstsai-adl Differential Revision: D31806054 fbshipit-source-id: 4ea98405e1f94176cb77ca69077adf9f4d22e77e
-
Binh Tang authored
Summary: ### New commit log messages 0aa220b46 Remove deprecated `distributed_backend` from `Trainer` (#10017) Reviewed By: kandluis Differential Revision: D31788128 fbshipit-source-id: 4d9394e119b3122014fc9681a5c56aac8df49141
-
Yuxin Wu authored
Summary: this utility function was added in D30272112 (https://github.com/facebookresearch/d2go/commit/737d099b0a8b0fb1f548435e73f95e1252442827) and is useful to all D2 users as well Differential Revision: D31833523 fbshipit-source-id: 0adfc612adb8b448fa7f3dbec1b1278c309554c5
-
- 21 Oct, 2021 1 commit
-
-
Yanghan Wang authored
Summary: see bottom diff Reviewed By: newstzpz Differential Revision: D31781835 fbshipit-source-id: 501b51e7bf92cf3505060a62822fa36f1ed3a7d4
-
- 20 Oct, 2021 5 commits
-
-
Yuxin Wu authored
Summary: helps debugging Reviewed By: zhanghang1989 Differential Revision: D31806396 fbshipit-source-id: 870308990c4c0c71453d107628b8adcb9edcf391
-
Yanghan Wang authored
Summary: Add a toy example to illustrate the Turing workflow.
- Modify the model building and add a converting-to-Helios step. Note that we need to hide this from OSS, so create an FB version of the runner in order to modify `build_model` and `get_default_cfg`.
- Bring the `D2GoCompatibleMNISTRunner` up to date, and use the "tutorial" meta-arch for the unit test since it's the simplest model. Note that even though `TutorialNet` is very simple, there's still a constraint: the FC has to run on a 4D tensor with 1x1 spatial dimensions because it's mapped to a 1x1 Conv by Helios; modify `TutorialNet` to make it compatible (see the check after this entry).
Reviewed By: newstzpz Differential Revision: D31705305 fbshipit-source-id: 77949dfbf08252be5495e9273210274c8ad86abb
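The 1x1-Conv constraint follows from the fact that a Linear layer applied to a (N, C, 1, 1) tensor is exactly a 1x1 Conv2d; a quick self-contained check:

```python
import torch
import torch.nn as nn

x = torch.randn(8, 64, 1, 1)            # (N, C, 1, 1), e.g. after global pooling
fc = nn.Linear(64, 10)
conv = nn.Conv2d(64, 10, kernel_size=1)
conv.weight.data = fc.weight.data.view(10, 64, 1, 1)
conv.bias.data = fc.bias.data.clone()

# identical outputs, so the FC can be lowered to a 1x1 Conv
assert torch.allclose(fc(x.flatten(1)), conv(x).flatten(1), atol=1e-6)
```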
-
Yanghan Wang authored
Summary: see bottom diff Reviewed By: newstzpz Differential Revision: D31780235 fbshipit-source-id: ec1285c4c5457a631e1eb88bebd47c9f41b47e12
-
Peizhao Zhang authored
Summary: Supported learnable QAT.
* Added a config key `QUANTIZATION.QAT.FAKE_QUANT_METHOD` to specify the QAT method (`default` or `learnable`).
* Added a config key `QUANTIZATION.QAT.ENABLE_LEARNABLE_OBSERVER_ITER` to specify the start iteration for learnable observers (before that, static observers are used).
* Custom quantization code needs to call `d2go.utils.qat_utils.get_qat_qconfig()` to get the proper qconfig for learnable QAT. An exception is raised if the QAT method is learnable but no learnable observers are used in the model.
* Automatically set the weight decay for scale/zero_point to 0 in the optimizer.
* The way to use learnable QAT: enable static observers -> enable fake quant -> enable learnable observers -> freeze bn.
Differential Revision: D31370822 fbshipit-source-id: a5a5044a539d0d7fe1cc6b36e6821fc411ce752a
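Illustrative usage of the two new keys (the key names come from the summary; the values here are made up):

```python
cfg.QUANTIZATION.QAT.FAKE_QUANT_METHOD = "learnable"        # or "default"
cfg.QUANTIZATION.QAT.ENABLE_LEARNABLE_OBSERVER_ITER = 2000  # static observers before this iter
```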
-
Peizhao Zhang authored
Summary: Refactored QAT-related code.
* Moved `_prepare_model_for_qat`-related code into a function.
* Moved `_setup_non_qat_to_qat_state_dict_map`-related code into a function.
* Moved QATHook-related code to the quantization file and implemented it as a class.
Differential Revision: D31370819 fbshipit-source-id: 836550b2c8d68cd93a84d5877ad9cef6f0f0eb39
-
- 16 Oct, 2021 1 commit
-
-
Yuxin Wu authored
Summary: D2 does not add new yacs configs for new models, but this simple wrapper with a configs->arguments mapping is enough to make such a model work with yacs config. Reviewed By: zhanghang1989 Differential Revision: D30980180 fbshipit-source-id: 75a0cc66051800a3e9d553bb650ca5c900c0ffa3
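A generic sketch of such a configs->arguments wrapper (the model and cfg keys are hypothetical):

```python
def build_my_model(cfg):
    from mylib import MyModel  # hypothetical config-free model
    # translate yacs cfg keys into plain constructor arguments
    return MyModel(
        depth=cfg.MODEL.MY_MODEL.DEPTH,
        width=cfg.MODEL.MY_MODEL.WIDTH,
    )
```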
-
- 15 Oct, 2021 2 commits
-
-
Peizhao Zhang authored
Summary: Supported specifying customized parameter groups from the model.
* Allow a model to specify customized parameter groups by implementing a function `model.get_optimizer_param_groups(cfg)`.
* Supported models wrapped in DDP.
Reviewed By: zhanghang1989 Differential Revision: D31289315 fbshipit-source-id: c91ba8014508e9fd5f172601b9c1c83c188338fd
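A sketch of the hook; the method name is from the summary, while the body (no weight decay on biases) is an assumed example:

```python
import torch.nn as nn

class MyModel(nn.Module):
    def get_optimizer_param_groups(self, cfg):
        decay, no_decay = [], []
        for name, p in self.named_parameters():
            (no_decay if name.endswith("bias") else decay).append(p)
        return [
            {"params": decay},
            {"params": no_decay, "weight_decay": 0.0},
        ]
```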
-
Peizhao Zhang authored
Summary: Refactor for get_optimizer_param_groups.
* Split `get_default_optimizer_params()` into multiple functions:
  * `get_optimizer_param_groups_default()`
  * `get_optimizer_param_groups_lr()`
  * `get_optimizer_param_groups_weight_decay()`
* Regroup the parameters to create the minimal number of groups.
* Print all parameter groups when the optimizer is created:
  Param group 0: {amsgrad: False, betas: (0.9, 0.999), eps: 1e-08, lr: 10.0, params: 1, weight_decay: 1.0}
  Param group 1: {amsgrad: False, betas: (0.9, 0.999), eps: 1e-08, lr: 1.0, params: 1, weight_decay: 1.0}
  Param group 2: {amsgrad: False, betas: (0.9, 0.999), eps: 1e-08, lr: 1.0, params: 2, weight_decay: 0.0}
* Add some unit tests.
Reviewed By: zhanghang1989 Differential Revision: D31287783 fbshipit-source-id: e87df0ae0e67343bb2130db945d8faced44d7411
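A minimal sketch of the regrouping idea, under the assumption that overrides are per-parameter dicts: parameters sharing identical overrides are merged into one group.

```python
from collections import defaultdict

def regroup(param_overrides):
    """param_overrides: iterable of (param, overrides_dict) pairs."""
    groups = defaultdict(list)
    for p, overrides in param_overrides:
        # identical (lr, weight_decay, ...) overrides -> same group
        groups[tuple(sorted(overrides.items()))].append(p)
    return [{"params": ps, **dict(key)} for key, ps in groups.items()]
```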
-
- 14 Oct, 2021 1 commit
-
-
Yuxin Wu authored
Summary: Also modify launch() because it should not assume it's always called with a CfgNode object. Differential Revision: D31494215 fbshipit-source-id: 8f07e9cb64969f8a14641956f7ef7c7160748bd9
-
- 13 Oct, 2021 2 commits
-
-
Daniel Haziza authored
Summary: The assert just below fails because `backend = "NCCL"` and we don't have a GPU Reviewed By: ppwwyyxx Differential Revision: D31506095 fbshipit-source-id: c1199eeb732d098c02fe5cd40efb85284deaa3b9
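The shape of a guard for this failure mode, sketched (the actual fix may differ):

```python
import torch

# fall back to GLOO when no GPU is available instead of defaulting to NCCL
backend = "NCCL" if torch.cuda.is_available() else "GLOO"
```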
-
Yanghan Wang authored
Summary: No usage: https://www.internalfb.com/code/search?q=filepath%3Ad2go%2F%20repo%3Afbcode%20_mock_func Differential Revision: D31591868 fbshipit-source-id: 3fc6103c40713fa7bf278fd57a3e8fb4436a0902
-
- 09 Oct, 2021 1 commit
-
-
Tao Xu authored
Summary: Fix a failure bug in real-image driving generation Reviewed By: yc-fb Differential Revision: D31362721 fbshipit-source-id: b222745aada1bd6680ca931d49a70d8b428828a6
-
- 07 Oct, 2021 2 commits
-
-
Yanghan Wang authored
Summary: EMA is only applicable when testing non-predictor-based models; this diff simply adds a check so it won't evaluate EMA models. Side note: `do_test` should probably just handle a single model; in the case of EMA, we could let `do_train` return two models, with and without EMA, and call `do_test` on each of them. Then the temporary fix in this diff would not be needed at all. Reviewed By: wrlife Differential Revision: D31450572 fbshipit-source-id: 8696922a9fd194f91315d2f3480dc8bfd8f36a3d
-
Yuxin Wu authored
Summary: the LR scheduler is cosine, so this config has no effect. Remove it to avoid confusion. Reviewed By: sstsai-adl Differential Revision: D31444047 fbshipit-source-id: b40e0d7d923c3b55dfe23353050ea0238b3afd16
-
- 06 Oct, 2021 1 commit
-
-
Supriya Rao authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/124 Update callsites from torch.quantization to torch.ao.quantization Reviewed By: z-a-f, jerryzh168 Differential Revision: D31286125 fbshipit-source-id: ef24ca87d8db398c65bb5b89f035afe0423a5685
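The mechanical change described, sketched with one real symbol from the namespace:

```python
# before: from torch.quantization import get_default_qconfig
from torch.ao.quantization import get_default_qconfig
```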
-
- 01 Oct, 2021 1 commit
-
-
Hang Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/116 Reviewed By: newstzpz Differential Revision: D30860098 fbshipit-source-id: 5c9422dd91d305193f9b43869f12423660217010
-