1. 15 May, 2022 1 commit
    • [codemod][usort] apply import merging for fbcode (8 of 11) · d62875cc
      John Reese authored
      Summary:
      Applies new import merging and sorting from µsort v1.0.
      
      When merging imports, µsort will make a best-effort to move associated
      comments to match merged elements, but there are known limitations due to
      the dynamic nature of Python and developer tooling. These changes should
      not produce any dangerous runtime changes, but may require touch-ups to
      satisfy linters and other tooling.
      
      Note that µsort uses case-insensitive, lexicographical sorting, which
      results in a different ordering compared to isort. This provides a more
      consistent sorting order, matching the case-insensitive order used when
      sorting import statements by module name, and ensures that "frog", "FROG",
      and "Frog" always sort next to each other.
      
      For details on µsort's sorting and merging semantics, see the user guide:
      https://usort.readthedocs.io/en/stable/guide.html#sorting
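      
      To illustrate the case-insensitive ordering, here is a minimal Python
      sketch (not µsort's actual implementation):
      
      ```python
      # Case-insensitive, lexicographical sort: casefolding the key makes
      # "frog", "FROG", and "Frog" compare equal; Python's sort is stable,
      # so equal keys keep their original relative order.
      modules = ["zlib", "FROG", "frog", "Frog", "argparse"]
      print(sorted(modules, key=str.casefold))
      # -> ['argparse', 'FROG', 'frog', 'Frog', 'zlib']
      ```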
      
      Reviewed By: lisroach
      
      Differential Revision: D36402214
      
      fbshipit-source-id: b641bfa9d46242188524d4ae2c44998922a62b4c
  2. 12 May, 2022 1 commit
  3. 11 May, 2022 1 commit
    • Refactor LibriSpeech Conformer RNN-T recipe (#2366) · 69467ea5
      hwangjeff authored
      Summary:
      Modifies the example LibriSpeech Conformer RNN-T recipe as follows:
      - Moves data loading and transforms logic from the lightning module to the data module (improves the generalizability and reusability of both modules).
      - Moves transforms logic from the dataloader collator function to the dataset (resolves dataloader multiprocessing issues on certain platforms).
      - Replaces lambda functions with `functools.partial` equivalents (resolves pickling issues in certain runtime environments; see the sketch after this list).
      - Modifies the training script to allow specifying the path to a model checkpoint from which to restart training.
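      
      For the `partial` change, a minimal sketch of why it matters (the
      `collate_fn` here is hypothetical, not the recipe's actual code):
      
      ```python
      import pickle
      from functools import partial
      
      def collate_fn(batch, pad_value):
          # Hypothetical collator standing in for the recipe's real transforms.
          return [(item, pad_value) for item in batch]
      
      good = partial(collate_fn, pad_value=0)
      bad = lambda batch: collate_fn(batch, pad_value=0)
      
      pickle.dumps(good)    # works: partial of a top-level function is picklable
      # pickle.dumps(bad)   # raises PicklingError: lambdas cannot be pickled,
      #                     # which breaks spawned dataloader worker processes
      ```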
      
      Pull Request resolved: https://github.com/pytorch/audio/pull/2366
      
      Reviewed By: mthrok
      
      Differential Revision: D36305028
      
      Pulled By: hwangjeff
      
      fbshipit-source-id: 0b768da5d5909136c55418bf0a3c2ddd0c5683ba
  4. 21 Apr, 2022 1 commit
    • Change underlying implementation of RNN-T hypothesis to tuple (#2339) · 6b242c29
      hwangjeff authored
      Summary:
      PyTorch Lite, which is becoming a standard for mobile PyTorch usage, does not support containers containing custom classes. Consequently, because TorchAudio's RNN-T decoder currently returns and accepts lists of `Hypothesis` namedtuples, it is not compatible with PyTorch Lite. This PR resolves said incompatibility by changing the underlying implementation of `Hypothesis` to tuple.
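      
      As a sketch of the pattern (the field layout here is illustrative, not
      TorchAudio's exact definition), a namedtuple can be replaced by a plain
      tuple type alias plus small accessor functions:
      
      ```python
      from typing import List, Tuple
      
      # Illustrative two-field layout; the real Hypothesis carries more state.
      Hypothesis = Tuple[List[int], float]  # (token_ids, score)
      
      def hypo_tokens(h: Hypothesis) -> List[int]:
          return h[0]
      
      def hypo_score(h: Hypothesis) -> float:
          return h[1]
      
      h: Hypothesis = ([1, 2, 3], -0.5)
      assert hypo_tokens(h) == [1, 2, 3] and hypo_score(h) == -0.5
      ```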
      
      Pull Request resolved: https://github.com/pytorch/audio/pull/2339
      
      Reviewed By: nateanl
      
      Differential Revision: D35806529
      
      Pulled By: hwangjeff
      
      fbshipit-source-id: 9cbae5504722390511d35e7f9966af2519ccede5
  5. 13 Apr, 2022 1 commit
    • Add Conformer RNN-T LibriSpeech training recipe (#2329) · c262758b
      hwangjeff authored
      Summary:
      Adds Conformer RNN-T LibriSpeech training recipe to examples directory.
      
      Produces a 30M-parameter model that achieves the following word error rates (WER):
      
       | Subset              |          WER |
       |:-------------------:|-------------:|
      | test-clean          |       0.0310 |
      | test-other          |       0.0805 |
      | dev-clean           |       0.0314 |
      | dev-other           |       0.0827 |
      
      Pull Request resolved: https://github.com/pytorch/audio/pull/2329
      
      Reviewed By: xiaohui-zhang
      
      Differential Revision: D35578727
      
      Pulled By: hwangjeff
      
      fbshipit-source-id: afa9146c5b647727b8605d104d928110a1d3976d
  6. 04 Apr, 2022 2 commits
  7. 17 Feb, 2022 1 commit
  8. 16 Feb, 2022 6 commits
  9. 11 Feb, 2022 5 commits
  10. 10 Feb, 2022 1 commit
  11. 04 Feb, 2022 1 commit
  12. 03 Feb, 2022 2 commits
  13. 02 Feb, 2022 1 commit
  14. 01 Feb, 2022 3 commits
  15. 27 Jan, 2022 2 commits
  16. 18 Jan, 2022 1 commit
  17. 05 Jan, 2022 1 commit
  18. 30 Dec, 2021 1 commit
  19. 29 Dec, 2021 3 commits
  20. 23 Dec, 2021 1 commit
  21. 03 Dec, 2021 1 commit