1. 11 Jan, 2022 1 commit
    • Adds IBERT to models exportable with ONNX (#14868) · c4fa908f
      Virus authored
      * Add IBertOnnxConfig and tests
      
      * add all the supported features for IBERT and remove outputs in IbertOnnxConfig
      
      * use OnnxConfig
      
      * fix codestyle
      
      * remove serialization.rst
      
      * codestyle
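
      For context, the OnnxConfig pattern this entry adds for IBERT boils down to declaring the model's input tensors and their dynamic axes; the task-specific outputs come from the base class, which is presumably why the explicit outputs were removed from IbertOnnxConfig. Below is a minimal, hedged sketch of that pattern: only OnnxConfig itself is the real transformers.onnx base class; the subclass name is illustrative, and the actual IBertOnnxConfig in the library may differ.

      ```python
      # Hedged sketch of an OnnxConfig subclass for a BERT-like encoder such as IBERT.
      # The class name is hypothetical; only `OnnxConfig` comes from transformers.onnx.
      from collections import OrderedDict
      from typing import Mapping

      from transformers.onnx import OnnxConfig


      class IBertLikeOnnxConfig(OnnxConfig):
          @property
          def inputs(self) -> Mapping[str, Mapping[int, str]]:
              # Declare dynamic axes so the exported graph accepts any batch size
              # and sequence length at inference time.
              return OrderedDict(
                  [
                      ("input_ids", {0: "batch", 1: "sequence"}),
                      ("attention_mask", {0: "batch", 1: "sequence"}),
                  ]
              )
      ```

      Once such a config is registered for the model type, the export is typically driven from the command line, roughly `python -m transformers.onnx --model=kssteven/ibert-roberta-base onnx/ibert/` (the checkpoint name here is only an example).
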
  2. 07 Jan, 2022 1 commit
  3. 28 Dec, 2021 1 commit
    • Doc styler examples (#14953) · b5e2b183
      Sylvain Gugger authored
      * Fix bad examples
      
      * Add black formatting to style_doc
      
      * Use first nonempty line
      
      * Put it at the right place
      
      * Don't add spaces to empty lines
      
      * Better templates
      
      * Deal with triple quotes in docstrings
      
      * Result of style_doc
      
      * Enable mdx treatment and fix code examples in MDXs
      
      * Result of doc styler on doc source files
      
      * Last fixes
      
      * Break copy from
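
      As a rough illustration of the "Add black formatting to style_doc" step above, the snippet below runs black on a code-example string the way a docstring styler might. It is a standalone sketch, not the actual style_doc implementation; the 119-character line length mirrors the repository's black setting at the time and is assumed here.

      ```python
      # Hedged sketch: reformat a docstring code example in memory with black.
      import black

      example = 'from transformers import pipeline\nclassifier=pipeline( "sentiment-analysis" )\n'

      # black.format_str reformats a source string without touching any file.
      formatted = black.format_str(example, mode=black.Mode(line_length=119))
      print(formatted)
      ```
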
  4. 23 Dec, 2021 1 commit
    • Add ONNX support for MarianMT models (#14586) · 6b655cc6
      lewtun authored
      * First commit to add MarianMT to ONNX
      
      * Now MarianModel.forward() automatically generates decoder_input_ids, like BartModel.forward()
      
      * Adjusted MarianOnnxConfig.inputs and outputs to work with seq2seq-lm feature
      
      * Style fix
      
      * Added support for other features for already supported models
      
      * Partial support for causal and seq2seq models
      
      * Partial support for causal and seq2seq models
      
      * Add default task for MarianMT ONNX
      
      * Remove automatic creation of decoder_input_ids
      
      * Extend inputs and outputs for MarianMT ONNX config
      
      * Add MarianMT to ONNX unit tests
      
      * Refactor
      
      * OnnxSeq2SeqConfigWithPast to support seq2seq models
      
      * Parameterized the onnx tests
      
      * Restored run_mlm.py
      
      * Restored run_mlm.py
      
      * [WIP] BART update
      
      * BART and MBART
      
      * Add past_key_values and fix dummy decoder inputs
      
      Using a sequence length of 1 in generate_dummy_inputs() produces large discrepancies, presumably due to some hidden optimisations.
      
      * Refactor MarianOnnxConfig to remove custom past_key_values logic
      
      * Fix quality
      
      * Revert "Revert "Added support for other features for already supported models (#14358)" (#14679)"
      
      This reverts commit 0f4e39c5.
      
      * is_torch_available test to avoid failing imports
      
      * sorting parameterize parameters to solve ERROR gw0 gw1
      
      * tests fix
      
      * tests fix
      
      * GPT2 with past fix
      
      * Fixed stateful class attribute change that was breaking things when converting multiple models sequentially
      
      * Removed onnx file
      
      * Refactor Marian export to account for base changes
      
      * Fix copies
      
      * Implemented suggestions
      
      * Extend support for causal LM
      
      * Revert "Revert "Added support for other features for already supported models (#14358)" (#14679)"
      
      This reverts commit 0f4e39c5.
      
      * is_torch_available test to avoid failing imports
      
      * sorting parameterize parameters to solve ERROR gw0 gw1
      
      * tests fix
      
      * tests fix
      
      * GPT2 with past fix
      
      * Fixed stateful class attribute change that was breaking things when converting multiple models sequentially
      
      * Removed onnx file
      
      * Implemented suggestions
      
      * Fixed __init__ to resolve conflict with master
      
      * Revert "Revert "Added support for other features for already supported models (#14358)" (#14679)"
      
      This reverts commit 0f4e39c5.
      
      * is_torch_available test to avoid failing imports
      
      * sorting parameterize parameters to solve ERROR gw0 gw1
      
      * tests fix
      
      * tests fix
      
      * GPT2 with past fix
      
      * Fixed stateful class attribute change that was breaking things when converting multiple models sequentially
      
      * Removed onnx file
      
      * Implemented suggestions
      
      * Fixed __init__ to resolve conflict with master
      
      * Remove commented import
      
      * Remove ONNX model
      
      * Remove redundant class method
      
      * Tidy up imports
      
      * Fix quality
      
      * Refactor dummy input function
      
      * Add copied from statements to Marian config functions
      
      * Remove false copied from comments
      
      * Fix copy from comment
      Co-authored-by: Massimiliano Bruni <massimiliano.bruni@hcl.com>
      Co-authored-by: Michael Benayoun <mickbenayoun@gmail.com>
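
      The entry above introduces the seq2seq-lm feature and OnnxSeq2SeqConfigWithPast for encoder-decoder models. Below is a hedged sketch of a programmatic MarianMT export with the transformers.onnx package of that era; the checkpoint name is only an example, and exact argument names may differ slightly between releases.

      ```python
      # Hedged sketch: export a MarianMT checkpoint to ONNX via transformers.onnx.
      from pathlib import Path

      from transformers import AutoTokenizer, MarianMTModel
      from transformers.onnx import FeaturesManager, export

      model_name = "Helsinki-NLP/opus-mt-en-de"  # example checkpoint
      tokenizer = AutoTokenizer.from_pretrained(model_name)
      model = MarianMTModel.from_pretrained(model_name)

      # Look up the ONNX config registered for Marian and the seq2seq-lm feature.
      _, config_ctor = FeaturesManager.check_supported_model_or_raise(
          model, feature="seq2seq-lm"
      )
      onnx_config = config_ctor(model.config)

      # Export the model; `export` returns the matched input and output names.
      onnx_path = Path("onnx/marian/model.onnx")
      onnx_path.parent.mkdir(parents=True, exist_ok=True)
      onnx_inputs, onnx_outputs = export(
          tokenizer, model, onnx_config, onnx_config.default_onnx_opset, onnx_path
      )
      ```

      Roughly the same export is available from the CLI, e.g. `python -m transformers.onnx --model=Helsinki-NLP/opus-mt-en-de --feature=seq2seq-lm onnx/marian/`.
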
  5. 22 Dec, 2021 1 commit