"examples/movement-pruning/masked_run_squad.py" did not exist on "631be27078fe394fdd8f98b9475ca87026f8044d"
  1. 02 Mar, 2020 1 commit
      • Lysandre Debut
      Pipeline doc (#3055) · d3eb7d23
      Lysandre Debut authored
      * Pipeline doc initial commit
      
      * pipeline abstraction
      
      * Remove modelcard argument from pipeline
      
      * Task-specific pipelines can be instantiated with no model or tokenizer
      
      * All pipelines doc
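      The "instantiated with no model or tokenizer" behavior above can be sketched roughly as follows; `DEFAULT_MODELS` and `make_pipeline` are hypothetical illustrations, not the actual transformers internals:

```python
# Hypothetical sketch: task-specific pipelines falling back to a default
# model/tokenizer identifier when the caller supplies none.
DEFAULT_MODELS = {
    "sentiment-analysis": "distilbert-base-uncased-finetuned-sst-2-english",
    "feature-extraction": "distilbert-base-cased",
}

def make_pipeline(task, model=None, tokenizer=None):
    """Resolve a missing model/tokenizer to the task's default checkpoint."""
    model = model or DEFAULT_MODELS[task]
    tokenizer = tokenizer or model  # tokenizer defaults to the model id
    return {"task": task, "model": model, "tokenizer": tokenizer}
```

      With this resolution order, `make_pipeline("sentiment-analysis")` needs no arguments at all, while an explicit `model=` still wins over the default.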
  2. 19 Feb, 2020 1 commit
  3. 18 Feb, 2020 1 commit
  4. 13 Feb, 2020 1 commit
      • Joe Davison
      Preserve spaces in GPT-2 tokenizers (#2778) · f1e8a51f
      Joe Davison authored
      * Preserve spaces in GPT-2 tokenizers
      
      Preserves spaces after special tokens in GPT-2 and inherited (RoBERTa)
      tokenizers, enabling correct BPE encoding. Automatically inserts a space
      in front of the first token in the encode function when adding special tokens.
      
      * Add tokenization preprocessing method
      
      * Add framework argument to pipeline factory
      
      Also fixes a pipeline test issue: each test input is now treated as a
      distinct sequence.
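      The space-insertion behavior described above can be sketched in isolation; `preprocess_for_bpe` and its parameters are hypothetical names for illustration, not the actual tokenizer code:

```python
# Hypothetical sketch of the space-handling fix: prepend a space to the
# text before byte-level BPE encoding when special tokens are added, so
# the first word receives the same space-prefixed ("Ġ"-style) BPE token
# it would receive mid-sentence.
def preprocess_for_bpe(text, add_special_tokens=True, add_prefix_space=True):
    if add_special_tokens and add_prefix_space and not text.startswith(" "):
        text = " " + text
    return text
```

      Under this sketch, `preprocess_for_bpe("Hello world")` yields `" Hello world"`, while text that already starts with a space is left unchanged.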
  5. 07 Feb, 2020 2 commits
  6. 30 Jan, 2020 1 commit
      • Julien Chaumond
      fill_mask helper (#2576) · 9fa836a7
      Julien Chaumond authored
      * fill_mask helper
      
      * [poc] FillMaskPipeline
      
      * Revert "[poc] FillMaskPipeline"
      
      This reverts commit 67eeea55b0f97b46c2b828de0f4ee97d87338335.
      
      * Revert "fill_mask helper"
      
      This reverts commit cacc17b884e14bb6b07989110ffe884ad9e36eaa.
      
      * README: clarify that Pipelines can also do text-classification
      
      cf. question at the AI&ML meetup last week, @mfuntowicz
      
      * Fix test: test feature-extraction pipeline
      
      * Test tweaks
      
      * Slight refactor of existing pipeline (in preparation of new FillMaskPipeline)
      
      * Extraneous doc
      
      * More robust way of doing this
      
      @mfuntowicz as we don't rely on the model name anymore (see AutoConfig)
      
      * Also add RobertaConfig as a quickfix for wrong token_type_ids
      
      * cs
      
      * [BIG] FillMaskPipeline
  7. 15 Jan, 2020 2 commits
  8. 06 Jan, 2020 2 commits
  9. 22 Dec, 2019 4 commits
  10. 21 Dec, 2019 1 commit
      • Aymeric Augustin
      Reformat source code with black. · fa84ae26
      Aymeric Augustin authored
      This is the result of:
      
          $ black --line-length 119 examples templates transformers utils hubconf.py setup.py
      
      There are a lot of fairly long lines in the project. As a consequence, I'm
      picking the longest widely accepted line length, 119 characters.
      
      This is also Thomas' preference, because it allows for explicit variable
      names, making the code easier to understand.
  11. 20 Dec, 2019 3 commits
  12. 10 Dec, 2019 2 commits