- 03 Oct, 2022 1 commit
Francisc Bungiu authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/378 Some hooks need access to cfg to be initialized correctly. Pass cfg down to the hook registration method. Reviewed By: ertrue, miqueljubert Differential Revision: D39303862 fbshipit-source-id: 931c356c7045f95fc0af5b20c7782ea4d1aff138
-
- 01 Oct, 2022 2 commits
Peizhao Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/376 Make sure the return result is consistent for ResultCache:
* When gather = True, it will always return a list.
Reviewed By: itomatik Differential Revision: D39874274 fbshipit-source-id: ce042261432cdce5544d10b2d4b88f5e2d0d1b68
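The shape guarantee described above can be illustrated with a minimal sketch (the function name and signature are hypothetical; the actual ResultCache API is not shown in this log):

```python
def load_results(cached, gather: bool):
    """Return cached results with a consistent shape (sketch).

    When gather is True the caller always receives a list, even when the
    cache holds a single result, so downstream code never has to
    special-case the non-distributed path.
    """
    if gather:
        return cached if isinstance(cached, list) else [cached]
    return cached
```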
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/379 This diff ~~~prototypes~~~ implements replacing the `custom_convert_fx` API with a callback. Reviewed By: LiamZhuuu Differential Revision: D39859228 fbshipit-source-id: 34719d1758c4afa7e47930c12d3443813d3f4546
-
- 29 Sep, 2022 1 commit
Peizhao Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/377 Only automatically rescale the lr for SGD optimizers.
* It seems that only SGD needs lr scaling, so we no longer scale the lr automatically by default. This works better for newly added optimizers (like Adam).
Reviewed By: itomatik, lg-zhang Differential Revision: D39899434 fbshipit-source-id: d6eebc5b07d4489b401c1fc3cea00f5a060fe19d
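The behavior change amounts to gating the linear lr scaling rule on the optimizer type. A minimal sketch of that gating (hypothetical function name and parameters, not the actual d2go code):

```python
def maybe_rescale_lr(base_lr: float, optimizer_name: str, scale: float) -> float:
    """Rescale the learning rate only for SGD (sketch).

    Other optimizers (e.g. Adam) keep the configured lr untouched,
    matching the new default described in the commit.
    """
    if optimizer_name.lower() == "sgd":
        return base_lr * scale
    return base_lr
```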
-
- 28 Sep, 2022 4 commits
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/375 `patch_d2_meta_arch` had already been removed in D37246483 (https://github.com/facebookresearch/d2go/commit/d1518ff66205089115df56010e8eafcf94efb08d), so the `beginner.ipynb` also needs updating. Reviewed By: itomatik Differential Revision: D39861148 fbshipit-source-id: bac80efbff1f99a023d604a66c3667bc94f8c6f4
-
Artsiom Sanakoyeu authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/374 AMP (mixed-precision) training is implemented for the native d2go Runner, but not for Lightning Tasks. Now we pass the SOLVER.AMP* and SOLVER.CLIP_GRADIENTS* params to the Lightning Trainer as well. Reviewed By: wat3rBro Differential Revision: D39798007 fbshipit-source-id: e48560a91d37c21c56d953eed141876d8c759329
-
Matthew Yu authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/371 In a previous iteration of this diff, we specified the teacher model in the same config as the student model, something like:
```
# config.py
MODEL:
  FBNET_V2:
    ...
DISTILLATION:
  TEACHER:
    MODEL:
      FBNET_V2:
        ...
    WEIGHTS: /path/to/teacher/weights
    ...
```
This leads to some oddities in the code, such as needing a default config that adds all the required keys for the distillation teacher model. In this diff, we just let the user supply a teacher config (and, optionally, runner_name and overwrite opts) and use the supplied runner to build the model:
```
# new_config.py
MODEL:
  FBNET_V2:
    ...
DISTILLATION:
  TEACHER:
    CONFIG_FNAME: /path/to/teacher/config
    RUNNER_NAME: ...
```
This should make it very easy to specify the teacher, as the user can potentially just reuse the trained_config generated in d2go. Reviewed By: newstzpz Differential Revision: D37640041 fbshipit-source-id: 088a636c96f98279c9a04e32d1674f703451aec3
-
Zhanibek Datbayev authored
Summary: X-link: https://github.com/facebookresearch/mobile-vision/pull/108 Pull Request resolved: https://github.com/facebookresearch/d2go/pull/373 Assert that methods replaced by fb_overwritable are also annotated with the corresponding decorator. Reviewed By: wat3rBro Differential Revision: D39674347 fbshipit-source-id: a4cf007419760aa07d929ab1cf819ba54b11b9da
-
- 22 Sep, 2022 1 commit
Dark Knight authored
Summary: This diff reverts D38436675 (https://github.com/facebookresearch/d2go/commit/d98f74aa467a6576c9905accbf4a2b6279599f9c), which has been identified as causing the following test or build failures:
Tests affected:
- https://www.internalfb.com/intern/test/844425001950025/
- https://www.internalfb.com/intern/test/844425001950027/
Multisect link: https://www.internalfb.com/intern/testinfra/multisect/1259258
Relevant tasks: T120995919: 51 tests started failing for oncall d2go in the last 2 weeks.
We're generating a revert to back out the changes in this diff; note the backout may land if someone accepts it. Reviewed By: wat3rBro Differential Revision: D39594147 fbshipit-source-id: 56c489bb9feea2d60a2a5f0e89941ed7c0f3f675
-
- 21 Sep, 2022 1 commit
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/372 Reviewed By: itomatik Differential Revision: D39628884 fbshipit-source-id: bb1d5d77eeb965dff675c17a8fbc36e4da4e25cd
-
- 15 Sep, 2022 2 commits
Amy Kolesnichenko authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/369 Pull Request resolved: https://github.com/facebookresearch/d2go/pull/368 Add image subsampling in evaluation to avoid saving sequential frames to TensorBoard for video-like data. Reviewed By: wat3rBro Differential Revision: D39233843 fbshipit-source-id: a49bbfe1b9114c11565ed04db1f2f186675ea9ed
-
Jiaxu Zhu authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/364 As the title says: when `QUANTIZATION.BACKEND` is set to `turing`, call `odai.transforms` APIs instead of OSS quantization. Also updated `image_classification` as an example of getting a Turing-quantized model via `custom_prepare_fx/custom_convert_fx`. Reviewed By: wat3rBro Differential Revision: D38436675 fbshipit-source-id: e4f0e02290512bce8b18c2369a67ed9b0f116825
-
- 10 Sep, 2022 1 commit
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/367 EZ Reviewed By: xiecong Differential Revision: D39407416 fbshipit-source-id: d0e6fa09ff926780e98c210bfce955e6b8eec7f6
-
- 08 Sep, 2022 1 commit
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/366 Users oftentimes just use the default value without knowing they need to change this config, and get low accuracy after PTQ. Therefore, increase the default value. Reviewed By: miqueljubert Differential Revision: D39331680 fbshipit-source-id: 6b05773c3c4cd48e6298d341d77a0e373d5439f6
-
- 31 Aug, 2022 2 commits
Peizhao Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/355 Switch to using `inference_on_dataset_with_checkpointing` in the default runner. Reviewed By: HarounH Differential Revision: D37215292 fbshipit-source-id: c006784ce0b31700bcbb1f79c303fd791f1561ff
-
Peizhao Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/354 Allow skipping inference when running evaluation.
* `inference_on_dataset_with_checkpointing` works similarly to `inference_on_dataset` in d2 but allows skipping the inference step if the evaluator has cached the results.
* If the evaluator has a function `could_skip_process` that returns True, inference is skipped and only `evaluator.reset()` and `evaluator.evaluate()` are called.
Reviewed By: wat3rBro Differential Revision: D37213004 fbshipit-source-id: d12cc480589ff04fd8dbb42b22633ab34bc4bf63
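The control flow described in the commit can be sketched as follows (the function and method names follow the commit message, but the body is a hypothetical illustration, not the actual d2go implementation):

```python
def inference_on_dataset_with_checkpointing(model, data_loader, evaluator):
    """Run evaluation, skipping inference when results are cached (sketch)."""
    skip = getattr(evaluator, "could_skip_process", None)
    if skip is not None and skip():
        # Evaluator has cached results: skip the forward passes entirely
        # and only reset + evaluate, as described in the commit.
        evaluator.reset()
        return evaluator.evaluate()
    # Normal path: iterate the dataset and feed outputs to the evaluator.
    evaluator.reset()
    for inputs in data_loader:
        evaluator.process(inputs, model(inputs))
    return evaluator.evaluate()
```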
-
- 23 Aug, 2022 1 commit
Simon Hollis authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/362 X-link: https://github.com/facebookresearch/detectron2/pull/4491 Recently landed D35518556 (https://github.com/facebookresearch/d2go/commit/1ffc801bf5a1d4fe926b815ba93f21632f0980f9) / GitHub: 36a65a0907d90ed591479b2ebaa8b61cfa0b4ef0 throws an exception with older versions of PyTorch due to importing a missing library. This has been reported by multiple members of the PyTorch community at https://github.com/facebookresearch/detectron2/commit/36a65a0907d90ed591479b2ebaa8b61cfa0b4ef0 This change uses `try/except` to check for libraries and sets flags on their presence/absence, which later guard the code that would use them. Reviewed By: wat3rBro Differential Revision: D38879134 fbshipit-source-id: 72f5a7a8d350eb82be87567f006368bf207f5a74
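The `try/except` flag pattern the commit describes looks roughly like this (a generic sketch; `fictional_accel_lib` and the flag names are hypothetical, not the actual libraries guarded in the diff):

```python
# Flags are set once at import time; downstream code branches on them
# instead of letting an ImportError crash module import on older setups.
try:
    import json  # stands in for a dependency that is usually present
    HAS_JSON = True
except ImportError:
    HAS_JSON = False

try:
    import fictional_accel_lib  # hypothetical optional dependency
    HAS_ACCEL = True
except ImportError:
    HAS_ACCEL = False

def dumps(obj):
    # Use the optional library only when the flag says it is available.
    if HAS_JSON:
        return json.dumps(obj)
    return repr(obj)
```

The key point is that the import attempt happens exactly once, and every later use site checks the flag rather than re-importing.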
-
- 20 Aug, 2022 1 commit
Xiaofang Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/358 Avoid calling scheduler.step() after the last training iteration is done Reviewed By: wat3rBro Differential Revision: D38605135 fbshipit-source-id: 87a55309bf6d1f7e598b567cc2372b00b8885c7c
-
- 18 Aug, 2022 1 commit
Simon Hollis authored
Enable torch tracing by changing assertions in d2go forwards to allow for the torch.fx.proxy.Proxy type. Summary: X-link: https://github.com/facebookresearch/detectron2/pull/4227 Pull Request resolved: https://github.com/facebookresearch/d2go/pull/241 Torch FX tracing propagates values of type `torch.fx.proxy.Proxy` through the graph. Existing type assertions in the d2go code base trigger during torch FX tracing, causing tracing to fail. This diff adds a check for FX tracing in progress and a helper function `assert_fx_safe()` that can be used in place of a standard assertion. The function only applies the assertion when not tracing, keeping d2go's assertion tests compatible with FX tracing. Reviewed By: wat3rBro Differential Revision: D35518556 fbshipit-source-id: a9b5d3d580518ca74948544973ae89f8b9de3282
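The idea behind such a tracing-safe assertion can be sketched without torch as follows (`Proxy` here is a stand-in class for `torch.fx.proxy.Proxy`; the real `assert_fx_safe` in d2go checks whether FX tracing is in progress rather than the value's type):

```python
class Proxy:
    """Stand-in for torch.fx.proxy.Proxy, the symbolic value that flows
    through a model during FX tracing instead of concrete tensors."""


def assert_fx_safe(condition, message=""):
    # A Proxy has no concrete boolean value, so evaluating it in a plain
    # `assert` would abort tracing; skip the check in that case and only
    # assert on concrete (non-traced) values.
    if isinstance(condition, Proxy):
        return
    assert condition, message
```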
-
- 12 Aug, 2022 1 commit
Pascual Martinez Gomez authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/359 Currently, D2Go (https://github.com/facebookresearch/d2go/commit/87374efb134e539090e0b5c476809dc35bf6aedb) is missing the Adam optimizer. This diff addresses the gap. Reviewed By: tglik, asanakoy Differential Revision: D38492151 fbshipit-source-id: 27791c23c73942b7a466f2ca91f6b3631733ba16
-
- 10 Aug, 2022 1 commit
Xiaoliang Dai authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/352 Reviewed By: newstzpz Differential Revision: D37872639 fbshipit-source-id: 61acdaa669bc541dcb715af1172926efb53c0b2b
-
- 09 Aug, 2022 2 commits
Mik Vyatskov authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/357 This change makes it possible to unpickle TrainNetOutput, which currently cannot be unpickled because it is part of the main module, and the main module can differ for the binary that unpickles this dataclass. Reviewed By: miqueljubert Differential Revision: D38536040 fbshipit-source-id: 856594251b2eca7630d69c7917bc4746859dab9f
-
Mik Vyatskov authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/356 Attaching PDB on failure does not work when running in a distributed environment. This change allows disabling this behavior via a command-line argument. Reviewed By: miqueljubert Differential Revision: D38514736 fbshipit-source-id: 2e0008d6fbc6a4518a605debe67d76f8354364fc
-
- 04 Aug, 2022 1 commit
Yanghan Wang authored
Summary: X-link: https://github.com/facebookresearch/detectron2/pull/4458 Pull Request resolved: https://github.com/facebookresearch/d2go/pull/353
- The QAT path was using old code prior to D36786902; update it to use the public API.
- Make `trainer:reset_data_loader` take a lazy lambda expression in order to delay the creation of the dataloader. We may not have enough RAM to hold two data loaders at the same time, so we need to delete the first one before creating the second.
Differential Revision: D38330148 fbshipit-source-id: aae28a48eabf211fe00cafe5d9ea8aeaf56e4e0c
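The lazy-factory idea can be sketched in a few lines (hypothetical attribute and parameter names; the actual `trainer:reset_data_loader` signature is internal to d2go):

```python
def reset_data_loader(trainer, data_loader_fn):
    """Swap in a new data loader built from a lazy factory (sketch).

    Deleting the old loader before calling the factory means the two
    loaders never coexist in RAM, which is the point of taking a lambda
    instead of an already-built loader.
    """
    if hasattr(trainer, "data_loader"):
        del trainer.data_loader
    # The factory runs only now, after the old loader has been released.
    trainer.data_loader = data_loader_fn()
```

Callers would pass something like `lambda: build_loader(cfg)` (a hypothetical builder) rather than `build_loader(cfg)` itself, deferring construction until the old loader is gone.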
-
- 28 Jul, 2022 1 commit
Mircea Cimpoi authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/349 This is to allow None, meaning model_configs is not used. Added tasks for the other TODO. Reviewed By: wat3rBro Differential Revision: D38199075 fbshipit-source-id: 774ca42a82a972b7e4c642cc4306aec39e2c2f7f
-
- 27 Jul, 2022 5 commits
Mircea Cimpoi authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/344 We need access to the modeling hooks in EMA, e.g. when building the trainer. Reviewed By: wat3rBro Differential Revision: D37997773 fbshipit-source-id: bf4372cd310605fa35aa70f0604b084b047001d8
-
Peizhao Zhang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/278 Allow skipping do_test after do_train. Reviewed By: wat3rBro Differential Revision: D36786790 fbshipit-source-id: 785556b5743ee9af2abfe6c0e9e78c7055697048
-
Mircea Cimpoi authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/348 Add testcase to ensure loading from config in eval_only is covered. Reviewed By: wat3rBro Differential Revision: D38001319 fbshipit-source-id: e6a2edb5001ae87606a3bf48e1355037aee0f9a0
-
Kevin Chih-Yao Ma authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/342 Add a cfg option to control the frequency of the writers. Currently, the default writers include:
```
writers = [
    CommonMetricPrinter(max_iter),
    JSONWriter(os.path.join(cfg.OUTPUT_DIR, "metrics.json")),
    tbx_writer,
]
```
Reviewed By: wat3rBro Differential Revision: D38065583 fbshipit-source-id: ebdc20aab71e03b4e18772af78b410f17ba4216d
-
Hongyu Fu authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/347 Land the FBNet architectures for the video logo detection project in RCNN, FCOS, and YOLO. Reviewed By: wat3rBro Differential Revision: D38139055 fbshipit-source-id: 6ba21f482ed067c52d438e0c217e523896c2131c
-
- 26 Jul, 2022 2 commits
Vasilis Vryniotis authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/322 TorchVision has recently added the AugMix augmentation. This diff adds support for this transform to D2Go (https://github.com/facebookresearch/d2go/commit/87374efb134e539090e0b5c476809dc35bf6aedb). Reviewed By: newstzpz Differential Revision: D37578243 fbshipit-source-id: b793715ccb24a3bd999a40c51d8c9a75f22110a3
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/345 Reviewed By: xiecong Differential Revision: D38086885 fbshipit-source-id: 808e104ee50c8870ae091533ac67b440e1bb8351
-
- 25 Jul, 2022 1 commit
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/343 Reviewed By: miqueljubert Differential Revision: D38077850 fbshipit-source-id: a79541d899ce2b49a30c7f2a81a616f76321026f
-
- 22 Jul, 2022 1 commit
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/340 Reviewed By: miqueljubert Differential Revision: D37968017 fbshipit-source-id: a3953fdbb2c48ceaffcf94df081c0b3253d247d5
-
- 19 Jul, 2022 2 commits
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/339 add is_qat to lightning codepath Reviewed By: jerryzh168 Differential Revision: D37937336 fbshipit-source-id: 68debe57c7f7dcf8647fad6ab9e34eff2aaa851c
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/338 Now that we've separated all the `prepare_for_quant` implementations for eager and FX mode, we can remove this branch. Reviewed By: jerryzh168 Differential Revision: D37865628 fbshipit-source-id: cd8f3aa7c90201f44bcfdbd65eb2edf5eded0e0c
-
- 14 Jul, 2022 4 commits
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/336 Reviewed By: jerryzh168 Differential Revision: D37860495 fbshipit-source-id: 1ce0bc7bc8071d3bfbe53cd61ed180da62e29327
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/335 Manually remove `example_input` from eager-mode-only `prepare_for_quant`. Reviewed By: jerryzh168 Differential Revision: D37838155 fbshipit-source-id: 2d98e0264fc0c40dcf1b6f28f7fc635c52acd75e
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/333 Follow D36916149. Reviewed By: jerryzh168 Differential Revision: D37830568 fbshipit-source-id: dbeb204ccf96dd2e90a6509f24a2864503083f60
-
Yanghan Wang authored
Summary: Pull Request resolved: https://github.com/facebookresearch/d2go/pull/332 Solve: "we have decoupled qat with `.training` in quantization, maybe we should use some flags in `cfg` instead of checking this attribute here as well" Reviewed By: jerryzh168 Differential Revision: D37801241 fbshipit-source-id: ed9884d7b462da195ed2e07c42634acfe5beefb2
-