"torchvision/models/vscode:/vscode.git/clone" did not exist on "413b71032e20df1a1a99b0a63ee7b227a89b5683"
  1. 30 Jul, 2019 1 commit
  2. 26 Nov, 2018 1 commit
    • Refactor BacktranslationDataset to be more reusable (#354) · 3c19878f
      Myle Ott authored
      Summary:
      - generalize AppendEosDataset -> TransformEosDataset
      - remove EOS logic from BacktranslationDataset (use TransformEosDataset instead)
      - BacktranslationDataset takes a backtranslation_fn instead of building the SequenceGenerator itself (a usage sketch follows this entry)
      Pull Request resolved: https://github.com/pytorch/fairseq/pull/354
      
      Reviewed By: liezl200
      
      Differential Revision: D12970233
      
      Pulled By: myleott
      
      fbshipit-source-id: d5c5b0e0a75eca1bd3a50382ac24621f35c32f36
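
      A minimal usage sketch of the new wiring, assuming keyword names
      (tgt_dataset, src_dict, backtranslation_fn) taken from later fairseq
      versions; the dataset, dictionary, and generator objects below are
      placeholders, not part of this commit. The point is that the caller
      builds the generator and passes a plain callable, instead of
      BacktranslationDataset constructing a SequenceGenerator internally.

      from fairseq.data import BacktranslationDataset

      def backtranslation_fn(sample):
          # Placeholder: run a backward model you built yourself (for example
          # via a SequenceGenerator) on a collated batch of target sentences
          # and return the generated "source" hypotheses.
          return my_backward_generator.generate(sample)

      bt_dataset = BacktranslationDataset(
          tgt_dataset=monolingual_target_dataset,  # placeholder dataset
          src_dict=joint_dictionary,               # placeholder dictionary
          backtranslation_fn=backtranslation_fn,
      )
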
  3. 18 Nov, 2018 1 commit
  4. 07 Nov, 2018 1 commit
    • Support BPE end of word marker suffix in fairseq noising module · 2b13f3c0
      Liezl Puzon authored
      Summary:
      There are 2 ways to implement BPE:
      1. use a continuation marker suffix to indicate that there is at least one more subtoken left in the word
      2. use an end of word marker suffix to indicate that there are no more subtokens left in the word
      
      This diff adds logic to account for either kind of BPE marker suffix, along with a corresponding test; the two conventions are illustrated in the sketch after this entry. I also refactored the test setup to reduce the number of boolean args used when setting up test data.
      
      Reviewed By: xianxl
      
      Differential Revision: D12919428
      
      fbshipit-source-id: 405e9f346dce6e736c1305288721dfc7b63e872a
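
      To make the two marker conventions concrete, here is a small
      self-contained illustration in plain Python (not the fairseq noising
      code; the is_word_end helper and the "@@" / "</w>" suffixes are only
      examples):

      continuation_style = ["ma@@", "chine", "trans@@", "la@@", "tion"]
      end_of_word_style = ["ma", "chine</w>", "trans", "la", "tion</w>"]

      def is_word_end(token, bpe_cont_marker=None, bpe_end_marker=None):
          # True if this subtoken is the last subtoken of its word under either
          # convention; with no marker given, every token is a full word.
          if bpe_cont_marker is not None:
              return not token.endswith(bpe_cont_marker)
          if bpe_end_marker is not None:
              return token.endswith(bpe_end_marker)
          return True

      assert [is_word_end(t, bpe_cont_marker="@@") for t in continuation_style] == [False, True, False, False, True]
      assert [is_word_end(t, bpe_end_marker="</w>") for t in end_of_word_style] == [False, True, False, False, True]
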
  5. 02 Nov, 2018 2 commits
  6. 01 Nov, 2018 1 commit
  7. 27 Oct, 2018 1 commit
    • Extend WordShuffle noising function to apply to non-bpe tokens · 90c01b3a
      Xian Li authored
      Summary:
      We'd like to reuse the noising functions and DenoisingDataset in
      adversarial training. However, the current noising functions assume the
      input consists of subword tokens. The goal of this diff is to extend them
      so that noising can also be applied to word tokens. Since we're mostly
      interested in word shuffle noising, I only modified the WordShuffle class
      (a standalone sketch follows this entry).
      
      Reviewed By: liezl200
      
      Differential Revision: D10523177
      
      fbshipit-source-id: 1e5d27362850675010e73cd38850c890d42652ab
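
      A standalone sketch of the word shuffle idea (not the fairseq WordShuffle
      implementation): add uniform noise in [0, k) to each token's index and
      sort by the noisy index, so no token ends up more than k positions from
      where it started. Nothing in the sketch depends on whether the tokens are
      words or BPE subword units.

      import numpy as np

      def word_shuffle(tokens, max_shuffle_distance=3, seed=0):
          # Small noise keeps the permutation local: a token at index i can
          # only trade places with tokens less than max_shuffle_distance away.
          rng = np.random.RandomState(seed)
          noise = rng.uniform(0, max_shuffle_distance, size=len(tokens))
          order = np.argsort(np.arange(len(tokens)) + noise)
          return [tokens[i] for i in order]

      print(word_shuffle(["we", "would", "like", "to", "reuse", "the", "noising", "functions"]))
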
  8. 06 Oct, 2018 2 commits
  9. 30 Sep, 2018 1 commit