1. 19 Jul, 2022 1 commit
  2. 13 Jul, 2022 1 commit
  3. 08 Jul, 2022 1 commit
    • prepare_for_quant_convert -> custom_convert_fx · 97904ba4
      Yanghan Wang authored
      Summary:
      Pull Request resolved: https://github.com/facebookresearch/d2go/pull/325
      
      `prepare_for_quant_convert` is a confusing name because it only does `convert`; there is no "prepare" step in it. It is also FX-only: eager mode always calls `torch.quantization.convert` and offers no way to customize the conversion. So just call it `custom_convert_fx`. The new name is also unique in fbcode, which makes a later codemod easy.
      
      This diff simply does the renaming by biggrep + replace.
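      
      A rough sketch of the dispatch the new name describes (the helper and wiring here are illustrative, not d2go's actual code):
      
      ```python
      import torch
      from torch.quantization.quantize_fx import convert_fx

      def convert_after_training(model, prepared, eager_mode):
          if eager_mode:
              # Eager mode always goes through the stock converter and
              # cannot be customized.
              return torch.quantization.convert(prepared)
          if hasattr(model, "custom_convert_fx"):
              # FX graph mode is the only path with a customization hook,
              # hence the "_fx" suffix in the new name.
              return model.custom_convert_fx(prepared)
          return convert_fx(prepared)
      ```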
      
      Reviewed By: jerryzh168
      
      Differential Revision: D37676717
      
      fbshipit-source-id: e7d05eaafddc383dd432986267c945c8ebf94df4
  4. 02 Jul, 2022 1 commit
  5. 21 May, 2022 1 commit
  6. 15 May, 2022 1 commit
    • apply import merging for fbcode (7 of 11) · b3a9204c
      John Reese authored
      Summary:
      Applies new import merging and sorting from µsort v1.0.
      
      When merging imports, µsort will make a best-effort to move associated
      comments to match merged elements, but there are known limitations due to
      the dynamic nature of Python and developer tooling. These changes should
      not produce any dangerous runtime changes, but may require touch-ups to
      satisfy linters and other tooling.
      
      Note that µsort uses case-insensitive, lexicographical sorting, which
      results in a different ordering compared to isort. This provides a more
      consistent sorting order, matching the case-insensitive order used when
      sorting import statements by module name, and ensures that "frog", "FROG",
      and "Frog" always sort next to each other.
      
      For details on µsort's sorting and merging semantics, see the user guide:
      https://usort.readthedocs.io/en/stable/guide.html#sorting
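      
      A small plain-Python illustration of the ordering difference (this is not µsort's code):
      
      ```python
      names = ["frog", "Banana", "FROG", "apple", "Frog"]

      # Case-sensitive (isort-style) sort: uppercase sorts first, so the
      # "frog" variants are split apart.
      print(sorted(names))
      # ['Banana', 'FROG', 'Frog', 'apple', 'frog']

      # Case-insensitive (µsort-style) sort: the "frog" variants stay adjacent.
      print(sorted(names, key=str.casefold))
      # ['apple', 'Banana', 'frog', 'FROG', 'Frog']
      ```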
      
      Reviewed By: lisroach
      
      Differential Revision: D36402205
      
      fbshipit-source-id: a4efc688d02da80c6e96685aa8eb00411615a366
  7. 28 Feb, 2022 1 commit
  8. 28 Oct, 2021 1 commit
    • Fix unused param in QAT training · 8b03f9aa
      Kai Zhang authored
      Summary:
      In the quantization callback, we prepare the model with the FX quantization API and use only the prepared model in training.
      However, when training with DDP, the parameters of the original model still require grad, causing an unused-parameters RuntimeError.
      Previously, the Lightning trainer trained the model with find_unused_parameters enabled, but if a user manually disabled it, they would hit the runtime error.
      
      In this diff, the parameters of the original model are frozen. We could consider deleting the original model after preparation to save memory, but that would require assumptions about the LightningModule's structure, for example that `.model` is the original model, so that we could `delattr(pl_module, "model")`.
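      
      A simplified sketch of the fix (callback wiring and attribute names are illustrative, and the qconfig API shown is the FX API of this PyTorch era, not d2go's exact code):
      
      ```python
      import copy

      from torch.quantization import get_default_qat_qconfig
      from torch.quantization.quantize_fx import prepare_qat_fx

      def setup_qat(pl_module):
          qconfig_dict = {"": get_default_qat_qconfig("fbgemm")}
          # Only this prepared copy is trained from here on.
          prepared = prepare_qat_fx(copy.deepcopy(pl_module.model).train(), qconfig_dict)
          # Freeze the original parameters so DDP does not report them as
          # unused when find_unused_parameters=False.
          for param in pl_module.model.parameters():
              param.requires_grad_(False)
          pl_module.prepared_model = prepared  # hypothetical attribute name
      ```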
      
      Reviewed By: wat3rBro
      
      Differential Revision: D31902368
      
      fbshipit-source-id: 56eabb6b2296278529dd2b94d6aa4c9ec9e9ca6b
  9. 06 Oct, 2021 1 commit
  10. 09 Sep, 2021 1 commit
  11. 30 Jun, 2021 1 commit
    • Fix typo in quantization callback · e830629a
      Kai Zhang authored
      Summary: "fb" -> "fn"
      
      Reviewed By: ananthsub
      
      Differential Revision: D29480559
      
      fbshipit-source-id: 78a0cd3ddd25df2c877514d4a5c0c29c248267a2
  12. 26 Jun, 2021 1 commit
    • Fix quantization test failure · 1894f8a3
      Kai Zhang authored
      Summary:
      # Context
      In the post-training quantization callback, we make a deepcopy of the Lightning module before validation starts and prepare the copy with the FX quantization API. The callback keeps the prepared model inside it.
      
      # The problem
      By the second time we run the validation epoch, we try to make a copy of the Lightning module, which holds a reference to the trainer, which holds a reference to the quantization callback, which holds the prepared model, which is not deep-copiable.
      
      # Mitigation
      Delete the trainer reference before making the deepcopy.
      We already do this in stl/callbacks/quantization, but the changes were not ported into D2Go (https://github.com/facebookresearch/d2go/commit/4169abc18ec539a24081b179fcbbc5a5754d102b).
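      
      A simplified sketch of the mitigation (assuming `trainer` is a plain attribute on the module, as in the Lightning of this era):
      
      ```python
      import copy

      def deepcopy_without_trainer(pl_module):
          # The trainer back-reference leads to the quantization callback,
          # whose FX-prepared model cannot be deep-copied, so drop it
          # temporarily and restore it afterwards.
          trainer, pl_module.trainer = pl_module.trainer, None
          try:
              return copy.deepcopy(pl_module)
          finally:
              pl_module.trainer = trainer
      ```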
      
      Reviewed By: zhanghang1989
      
      Differential Revision: D29409085
      
      fbshipit-source-id: 24550124181673b2e567b2a04563bcdfb440e145
  13. 17 Apr, 2021 2 commits
    • Delegate to model's customization · aeb24a92
      Kai Zhang authored
      Summary: Delegate the FX quantization callback's customization to the model.
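      
      Roughly, the callback now defers to a hook on the model when one exists; the hook name below is illustrative, not necessarily d2go's:
      
      ```python
      from torch.quantization.quantize_fx import prepare_qat_fx

      def prepare(model, qconfig_dict):
          # Let the model customize its own preparation if it defines a
          # hook; otherwise fall back to the default FX preparation.
          if hasattr(model, "prepare_for_quant"):
              return model.prepare_for_quant()
          return prepare_qat_fx(model.train(), qconfig_dict)
      ```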
      
      Reviewed By: wat3rBro
      
      Differential Revision: D27669212
      
      fbshipit-source-id: 2715546cf03134896da6f95ecddaf8503ff95d0b
    • E2E QAT Workflow on Lightning · 845d0b2c
      Kai Zhang authored
      Summary:
      As per title: sanity-test the E2E QAT workflow on the Lightning Trainer.
      
      - Add `post_training_opts`. This is required to use `all_steps_qat.json` with Lightning. We don't actually support `post_training_opts` in this diff; that part is left to T83437359.
      - Update the .yaml to specify the quantizable modules.
      - Update `lightning_train_net.py` to use the QuantizationAwareTraining callback (see the sketch below).
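      
      A minimal sketch of that wiring (the import path and config key are assumptions, not verified against the diff):
      
      ```python
      import pytorch_lightning as pl
      from d2go.runner.callbacks.quantization import QuantizationAwareTraining

      def build_trainer(cfg):
          callbacks = []
          if cfg.QUANTIZATION.QAT.ENABLED:  # assumed config key
              callbacks.append(QuantizationAwareTraining())
          # Hand the callback to the Lightning Trainer used by
          # lightning_train_net.py.
          return pl.Trainer(max_steps=cfg.SOLVER.MAX_ITER, callbacks=callbacks)
      ```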
      
      Reviewed By: kandluis
      
      Differential Revision: D26304879
      
      fbshipit-source-id: 948bef4817d385d8a0969e4990d7f17ecd6994b7
  14. 31 Mar, 2021 1 commit
  15. 03 Mar, 2021 1 commit
    • Copy quantization callback to D2go · 5d8068d8
      Kai Zhang authored
      Summary: As titled. Make a copy of the quantization callback to unblock D2Go OSS.
      
      Reviewed By: zhanghang1989
      
      Differential Revision: D26735525
      
      fbshipit-source-id: 12b77f04cfa1361e856b26ea218a262da1fadd88