1. 17 Aug, 2023 1 commit
  2. 23 Jun, 2023 1 commit
  3. 06 Jun, 2023 1 commit
    • Move TF building to an actual build() method (#23760) · 4a55e478 (sketch below)
      Matt authored
      * A fun new PR where I break the entire codebase again
      
      * A fun new PR where I break the entire codebase again
      
      * Handle cross-attention
      
      * Move calls to model(model.dummy_inputs) to the new build() method
      
      * Seeing what fails with the build context thing
      
      * make fix-copies
      
      * Let's see what fails with new build methods
      
      * Fix the pytorch crossload build calls
      
      * Fix the overridden build methods in vision_text_dual_encoder
      
      * Make sure all our build methods set self.built or call super().build(), which also sets it
      
      * make fix-copies
      
      * Remove finished TODO
      
      * Tentatively remove unneeded (?) line
      
      * Transpose b in deberta correctly and remove unused threading local
      
      * Get rid of build_with_dummies and all it stands for
      
      * Rollback some changes to TF-PT crossloading
      
      * Correctly call super().build()
      4a55e478
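      A reading aid, not part of the commit message: a minimal, generic Keras sketch of the build() pattern this commit moves TF models to (weights created in build() rather than by running dummy inputs, with super().build() setting self.built). The layer and names are illustrative, not the transformers implementation.

      ```python
      import tensorflow as tf

      class ToyDense(tf.keras.layers.Layer):
          """Creates its weights lazily in build() instead of via dummy inputs."""

          def __init__(self, units, **kwargs):
              super().__init__(**kwargs)
              self.units = units

          def build(self, input_shape):
              if self.built:  # building twice would create duplicate weights
                  return
              self.kernel = self.add_weight(name="kernel", shape=(input_shape[-1], self.units))
              super().build(input_shape)  # sets self.built = True

          def call(self, inputs):
              return tf.matmul(inputs, self.kernel)

      layer = ToyDense(4)
      print(layer(tf.ones((2, 3))).shape)  # build() runs on first call, then prints (2, 4)
      ```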
  4. 24 May, 2023 1 commit
    • Better TF docstring types (#23477) · f8b25744 (sketch below)
      Matt authored
      * Rework TF type hints to use | None instead of Optional[] for tf.Tensor
      
      * Rework TF type hints to use | None instead of Optional[] for tf.Tensor
      
      * Don't forget the imports
      
      * Add the imports to tests too
      
      * make fixup
      
      * Refactor tests that depended on get_type_hints
      
      * Better test refactor
      
      * Fix an old hidden bug in the test_keras_fit input creation code
      
      * Fix for the Deit tests
      f8b25744
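      A reading aid, not part of the commit message: a hedged sketch of the hint style the PR switches to, `tf.Tensor | None` instead of `Optional[tf.Tensor]`. The function is made up; `from __future__ import annotations` is one way to keep `X | None` annotations valid on Python versions before 3.10.

      ```python
      from __future__ import annotations  # allows `tf.Tensor | None` in annotations pre-3.10

      import tensorflow as tf

      def toy_call(
          input_ids: tf.Tensor,
          attention_mask: tf.Tensor | None = None,  # previously Optional[tf.Tensor]
      ) -> tf.Tensor:
          if attention_mask is None:
              attention_mask = tf.ones_like(input_ids)
          return tf.cast(input_ids * attention_mask, tf.float32)

      print(toy_call(tf.constant([[1, 2, 3]])))
      ```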
  5. 11 May, 2023 1 commit
  6. 18 Apr, 2023 1 commit
    • Generate: Add assisted generation (#22211) · 78cda46f (sketch below)
      Joao Gante authored
      * working mvp
      
      * remove breakpoint
      
      * fix commit
      
      * standardize outputs
      
      * tmp commit
      
      * tests almost ready
      
      * tmp commit
      
      * skip a few models
      
      * Add streaming; Docs and examples
      
      * document limitations
      
      * PR commits
      
      * Amy PR comments
      78cda46f
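      A hedged usage sketch of assisted generation (a smaller draft model proposes tokens that the main model verifies); it assumes the `assistant_model` argument to `generate()`, and the checkpoint names are placeholders that share one tokenizer.

      ```python
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("gpt2-large")
      model = AutoModelForCausalLM.from_pretrained("gpt2-large")   # main model
      assistant = AutoModelForCausalLM.from_pretrained("gpt2")     # smaller model drafts candidate tokens

      inputs = tokenizer("Assisted generation speeds up decoding by", return_tensors="pt")
      outputs = model.generate(**inputs, assistant_model=assistant, max_new_tokens=20)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
      ```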
  7. 28 Feb, 2023 1 commit
    • 🔥Rework pipeline testing by removing `PipelineTestCaseMeta` 🚀 (#21516) · 871c31a6 (sketch below)
      Yih-Dar authored
      
      
      * Add PipelineTesterMixin
      
      * remove class PipelineTestCaseMeta
      
      * move validate_test_components
      
      * Add for ViT
      
      * Add to SPECIAL_MODULE_TO_TEST_MAP
      
      * style and quality
      
      * Add feature-extraction
      
      * update
      
      * raise instead of skip
      
      * add tiny_model_summary.json
      
      * more explicit
      
      * skip tasks not in mapping
      
      * add availability check
      
      * Add Copyright
      
      * A way to disable irrelevant tests
      
      * update with main
      
      * remove disable_irrelevant_tests
      
      * skip tests
      
      * better skip message
      
      * better skip message
      
      * Add all pipeline task tests
      
      * revert
      
      * Import PipelineTesterMixin
      
      * subclass test classes with PipelineTesterMixin
      
      * Add pipeline_model_mapping

      * Fix import after adding pipeline_model_mapping

      * Fix style and quality after adding pipeline_model_mapping

      * Fix one more import after adding pipeline_model_mapping

      * Fix style and quality after adding pipeline_model_mapping
      
      * Fix test issues
      
      * Fix import requirements
      
      * Fix mapping for MobileViTModelTest
      
      * Update
      
      * Better skip message
      
      * pipeline_model_mapping cannot be None
      
      * Remove some PipelineTesterMixin
      
      * Fix typo
      
      * revert tests_fetcher.py
      
      * update
      
      * rename
      
      * revert
      
      * Remove PipelineTestCaseMeta from ZeroShotAudioClassificationPipelineTests
      
      * style and quality
      
      * test fetcher for all pipeline/model tests
      
      ---------
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
      871c31a6
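      A reading aid: a self-contained, hypothetical sketch of the mixin-plus-mapping pattern this PR replaces the `PipelineTestCaseMeta` metaclass with. The class names and mapping are illustrative, not the actual transformers test code.

      ```python
      import unittest

      class PipelineTesterMixin:
          # each model test class declares which pipeline tasks it covers
          pipeline_model_mapping = None

          def test_pipeline_mapping_is_declared(self):
              self.assertIsInstance(self.pipeline_model_mapping, dict)

      class ToyModel:
          pass

      class ToyModelTest(PipelineTesterMixin, unittest.TestCase):
          # explicit per-class mapping instead of metaclass-generated tests
          pipeline_model_mapping = {"feature-extraction": ToyModel}

      if __name__ == "__main__":
          unittest.main()
      ```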
  8. 21 Feb, 2023 1 commit
  9. 13 Feb, 2023 1 commit
  10. 06 Feb, 2023 1 commit
    • Update quality tooling for formatting (#21480) · 6f79d264
      Sylvain Gugger authored
      * Result of black 23.1
      
      * Update target to Python 3.7
      
      * Switch flake8 to ruff
      
      * Configure isort
      
      * Configure isort
      
      * Apply isort with line limit
      
      * Put the right black version
      
      * adapt black in check copies
      
      * Fix copies
      6f79d264
  11. 25 Jan, 2023 1 commit
    • [WHISPER] Small patch (#21307) · 6f3faf38 (sketch below)
      Arthur authored
      * add small patch
      
      * update tests: forced decoder ids do not take priority over the generation config
      
      * fix two new tests
      6f3faf38
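      A hedged sketch of the interaction the patch touches: decoder prompt ids passed at call time versus values stored on the generation config. The checkpoint name is a placeholder and zeros stand in for real log-mel features.

      ```python
      import torch
      from transformers import WhisperForConditionalGeneration, WhisperProcessor

      processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
      model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

      # Dummy log-mel features standing in for real audio (80 mel bins x 3000 frames).
      input_features = torch.zeros(1, 80, 3000)

      # Prompt ids pinning language/task; the patch concerns how these call-time ids
      # interact with the ids already stored on model.generation_config.
      forced_ids = processor.get_decoder_prompt_ids(language="french", task="transcribe")
      generated = model.generate(input_features, forced_decoder_ids=forced_ids, max_new_tokens=10)
      print(processor.batch_decode(generated, skip_special_tokens=True))
      ```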
  12. 30 Nov, 2022 1 commit
  13. 15 Nov, 2022 1 commit
  14. 21 Oct, 2022 1 commit
  15. 18 Oct, 2022 1 commit
  16. 10 Oct, 2022 1 commit
    • Add TF whisper (#19378) · e3f028f3 (sketch below)
      amyeroberts authored
      
      
      * simplify loop
      
      * add feature extractor
      
      * add model
      
      * start conversion
      
      * add dropout
      
      * initial commit of test files
      
      * conversion for all models

      * update processor for correct padding

      * update feature extraction

      * update integration test logits match

      * fmt: off for the logits
      
      * on the fly mel bank
      
      * small nit
      
      * update test
      
      * update tokenizer
      
      * nit feature extraction
      
      * update
      
      * update tokenizer test
      
      * adds logit processor and update tokenizer to get suppress tokens
      
      * style
      
      * clean convert
      
      * revert to original modeling tf utils
      
      * Update
      
      * update
      
      * nit
      
      * clean convert file
      
      * update tests and nits
      
      * quality
      
      * slow generation test
      
      * ffn_dim to allow customization
      
      * update readme
      
      * add to toctree
      
      * start fixing integration tests
      
      * update tests and code
      
      * fix feature extractor
      
      * fix config tests common
      
      * update code to fix tests
      
      * fix feature extractor
      
      * nit feature extraction
      
      * update test for new feature extractor
      
      * style
      
      * add abstract

      * large logits with custom decoder input ids

      * wrap around is torch available
      
      * fix feature extractor
      
      * correct logits for whisper small.en
      
      * nit
      
      * fix encoder_attention_mask
      
      * some fixes
      
      * remove unnecessary inputs
      
      * nits
      
      * add normalizer file
      
      * update test tokenization
      
      * fix attention mask not defined
      
      * fix generate
      
      * remove useless encoder attention mask

      * update test modeling whisper

      * update config to add second non suppress tokens

      * nits on feature extractor

      * nit for test tokenizers

      * update tests
      
      * update tests
      
      * update tokenization test
      
      * fixup
      
      * invalidated hf token. Clean convert openai to whisper
      
      * fix logit tests
      
      * fixup
      
      * Add model to README
      
      * Fix doc tests
      
      * clean merge
      
      * revert toc_tree changes
      
      * remove useless LogitProcessor
      
      * Update whisper .mdx
      
      * update config file doc
      
      * update configuration docstring
      
      * update test tokenization
      
      * update test tokenization
      
      * update tokenization whisper
      Added copied from where needed
      
      * update feature extraction
      
      * nit test name
      
      * style
      
      * quality
      
      * remove get suppress tokens and update non_speech tokens global variables
      
      * Update src/transformers/models/whisper/feature_extraction_whisper.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * clean modeling whisper and test
      Removed the attention mask arguments that are deprecated
      
      * fix large test
      
      * Add multilingual audio test, and translate test
      
      * style
      
      * fix large multilingual test
      
      * nits
      
      * add copied from for attention layer
      
      * remove attention masks in doc
      
      * add english normalizer
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * update tokenization test
      
      * remove copied from in whisper attention : no bias in k_proj only
      
      * wrap around dependencies in english normalizer
      
      * style
      
      * correct import generation logits
      
      * for now, wrap feature extractor with torch
      
      * remove torch dependencies for feature extraction and style
      
      * Update src/transformers/models/whisper/convert_openai_whisper_to_tfms.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * fixup
      
      * nit
      
      * update logits
      
      * style
      
      * nit
      
      * nits and fix final tests
      
      * add `is_more_itertools_available` to utils
      
      * quality
      
      * add begin suppress tokens, suppress tokens to generate args and config

      * clean suppressTokensLogitProcessor in generation logits

      * Nit naming

      * add suppressTokensAtBegin

      * update tests, suppress tokens to None or correct values
      
      * nit and style
      
      * update RAG to fit test and generate_logit
      
      * add copy-pasted statement on english normalizer
      
      * add arguments to config_common_kwargs
      
      * Update src/transformers/generation_utils.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/generation_logits_process.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * revert changes based on reviews
      
      * update doc and nits
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * more nits
      
      * last nits
      
      * update test configuration common
      
      * add BART name in decoder attention mask documentation
      
      * Update src/transformers/models/whisper/modeling_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * style
      
      * nit
      
      * nit
      
      * add english.json file to git
      
      * nits on documentation
      
      * nit
      
      * nits
      
      * last styling
      
      * add main toctree file
      
      * remove sentence piece dependency
      
      * clean init file
      
      * fix tokenizer that has no dependencies on sentencepiece
      
      * update whisper init file, nit
      
      * remove english.json file
      
      * add get decoder prompt id
      
      * All weights loading
      
      * Remove hanging pdb
      
      * Fixup and tidy up
      
      * Use same copied from as PT model
      
      * Remove whitespace changes
      
      * Remove torch references
      
      * Tie embeddings
      
      * Remove logits processor input to generate
      
      * Update logit values
      
      * revert changes and add forced logit processor
      
      * nit
      
      * clean normalizer
      
      * remove protected
      
      * Add logit processors and update generation code & tests
      
      * Some tidy up
      
      * Update docstring
      
      * update
      
      * update based on review
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * Update to reflect changes on the PT model branch
      
      * Tidy up
      
      * Remove extra whitespace
      
      * Fix test - make input ids small enough we can append
      
      * Include upstream changes on main
      
      * PR comments - add batch tests, remove comments & defaults
      
      * Fix model output imports
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/generation_tf_logits_process.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update tests/models/whisper/test_modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update docstring example
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      
      * Remove changes to adjust_logits_during_generation function
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * Tidy up imports that don't require TF
      
      * Update tests - skip and no more skip
      
      * Update tests/generation/test_generation_tf_logits_process.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      
      * Add training flags
      
      * Add (skipped) XLA generation tests
      
      * Add embedding correctness test
      
      * Add constant ids for generation tests
      
      * Make logits finding a bit tidier
      
      * Remove unused args
      
      * xla generation enabled
      
      * Don't skip XLA tests anymore
      
      * Fix tests - add position ids to expected signature and update rag generation
      
      * Undo method reorder
      
      * Remove added whitespace
      
      * Remove copy-paste gradient checkpoint ref
      
      * Remove
      
      * Trigger CI - (issue with refs when pulling)
      Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <niels.rogge1@gmail.com>
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      Co-authored-by: Joao Gante <joao@huggingface.co>
      e3f028f3
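      A hedged usage sketch of the TF port added here; zeros stand in for a real 16 kHz recording and the checkpoint name should be treated as a placeholder.

      ```python
      import numpy as np
      from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

      processor = WhisperProcessor.from_pretrained("openai/whisper-tiny.en")
      model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny.en")

      audio = np.zeros(16000, dtype=np.float32)  # one second of silence at 16 kHz
      inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

      generated_ids = model.generate(inputs.input_features, max_new_tokens=20)
      print(processor.batch_decode(generated_ids, skip_special_tokens=True))
      ```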
  17. 05 Oct, 2022 1 commit
    • Add WhisperModel to transformers (#19166) · 45e14038 (sketch below)
      Arthur authored
      
      
      * simplify loop
      
      * add feature extractor
      
      * add model
      
      * start conversion
      
      * add dropout
      
      * initial commit of test files
      
      * conversion for all models

      * update processor for correct padding

      * update feature extraction

      * update integration test logits match

      * fmt: off for the logits
      
      * on the fly mel bank
      
      * small nit
      
      * update test
      
      * update tokenizer
      
      * nit feature extraction
      
      * update
      
      * update tokenizer test
      
      * adds logit processor and update tokenizer to get suppress tokens
      
      * style
      
      * clean convert
      
      * revert to original modeling tf utils
      
      * Update
      
      * update
      
      * nit
      
      * clean convert file
      
      * update tests and nits
      
      * quality
      
      * slow generation test
      
      * ffn_dim to allow customization
      
      * update readme
      
      * add to toctree
      
      * start fixing integration tests
      
      * update tests and code
      
      * fix feature extractor
      
      * fix config tests common
      
      * update code to fix tests
      
      * fix feature extractor
      
      * nit feature extraction
      
      * update test for new feature extractor
      
      * style
      
      * add abstract

      * large logits with custom decoder input ids

      * wrap around is torch available
      
      * fix feature extractor
      
      * correct logits for whisper small.en
      
      * nit
      
      * fix encoder_attention_mask
      
      * some fixes
      
      * remove unnecessary inputs
      
      * nits
      
      * add normalizer file
      
      * update test tokenization
      
      * fix attention mask not defined
      
      * Add model to README
      
      * Fix doc tests
      
      * fix generate
      
      * remove useless encoder attention mask

      * update test modeling whisper

      * update config to add second non suppress tokens

      * nits on feature extractor

      * nit for test tokenizers

      * update tests
      
      * update tests
      
      * update tokenization test
      
      * fixup
      
      * invalidated hf token. Clean convert openai to whisper
      
      * fix logit tests
      
      * fixup
      
      * clean merge
      
      * revert toc_tree changes
      
      * remove useless LogitProcessor
      
      * Update whisper .mdx
      
      * update config file doc
      
      * update configuration docstring
      
      * update test tokenization
      
      * update test tokenization
      
      * update tokenization whisper
      Added copied from where needed
      
      * update feature extraction
      
      * nit test name
      
      * style
      
      * quality
      
      * remove get suppress tokens and update non_speech tokens global variables
      
      * Update src/transformers/models/whisper/feature_extraction_whisper.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * clean modeling whisper and test
      Removed the attention mask arguments that are deprecated
      
      * fix large test
      
      * Add multilingual audio test, and translate test
      
      * style
      
      * fix large multilingual test
      
      * nits
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * add copied from for attention layer
      
      * remove attention masks in doc
      
      * add english normalizer
      
      * update tokenization test
      
      * remove copied from in whisper attention : no bias in k_proj only
      
      * wrap around dependencies in english normalizer
      
      * style
      
      * correct import generation logits
      
      * for now, wrap feature extractor with torch
      
      * Update src/transformers/models/whisper/convert_openai_whisper_to_tfms.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * remove torch dependencies for feature extraction and style
      
      * fixup
      
      * nit
      
      * update logits
      
      * style
      
      * nit
      
      * nits and fix final tests
      
      * add `is_more_itertools_available` to utils
      
      * quality
      
      * add begin suppress tokens, suppress tokens to generate args and config

      * clean suppressTokensLogitProcessor in generation logits

      * Nit naming

      * add suppressTokensAtBegin

      * update tests, suppress tokens to None or correct values
      
      * nit and style
      
      * update RAG to fit test and generate_logit
      
      * add copy-pasted statement on english normalizer
      
      * add arguments to config_common_kwargs
      
      * Update src/transformers/generation_utils.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/generation_logits_process.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * revert changes based on reviews
      
      * update doc and nits
      
      * more nits
      
      * last nits
      
      * update test configuration common
      
      * add BART name in decoder attention mask documentation
      
      * Update src/transformers/models/whisper/modeling_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * style
      
      * nit
      
      * nit
      
      * add english.json file to git
      
      * nits on documentation
      
      * nit
      
      * nits
      
      * last styling
      
      * add main toctree file
      
      * remove sentence piece dependency
      
      * clean init file
      
      * fix tokenizer that has no dependencies on sentencepiece
      
      * update whisper init file, nit
      
      * remove english.json file
      
      * add get decoder prompt id
      
      * revert changes and add forced logit processor
      
      * nit
      
      * clean normalizer
      
      * remove protected
      
      * update
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * update based on review
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * add batched tests
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <niels.rogge1@gmail.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      45e14038
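      A hedged PyTorch usage sketch of the model and processor this PR adds; zeros stand in for real audio and the checkpoint name should be treated as a placeholder.

      ```python
      import numpy as np
      import torch
      from transformers import WhisperForConditionalGeneration, WhisperProcessor

      processor = WhisperProcessor.from_pretrained("openai/whisper-tiny.en")
      model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny.en")

      audio = np.zeros(16000, dtype=np.float32)  # one second of silence at 16 kHz
      inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

      with torch.no_grad():
          generated_ids = model.generate(inputs.input_features, max_new_tokens=20)
      print(processor.batch_decode(generated_ids, skip_special_tokens=True))
      ```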
  18. 07 Sep, 2022 1 commit
  19. 31 May, 2022 1 commit
  20. 12 May, 2022 1 commit
  21. 03 May, 2022 1 commit
    • Move test model folders (#17034) · 19420fd9
      Yih-Dar authored
      
      
      * move test model folders (TODO: fix imports and others)
      
      * fix (potentially partially) imports (in model test modules)
      
      * fix (potentially partially) imports (in tokenization test modules)
      
      * fix (potentially partially) imports (in feature extraction test modules)
      
      * fix import utils.test_modeling_tf_core
      
      * fix path ../fixtures/
      
      * fix imports about generation.test_generation_flax_utils
      
      * fix more imports
      
      * fix fixture path
      
      * fix get_test_dir
      
      * update module_to_test_file
      
      * fix get_tests_dir from wrong transformers.utils
      
      * update config.yml (CircleCI)
      
      * fix style
      
      * remove missing imports
      
      * update new model script
      
      * update check_repo
      
      * update SPECIAL_MODULE_TO_TEST_MAP
      
      * fix style
      
      * add __init__
      
      * update self-scheduled
      
      * fix add_new_model scripts
      
      * check one way to get location back
      
      * python setup.py build install
      
      * fix import in test auto
      
      * update self-scheduled.yml
      
      * update slack notification script
      
      * Add comments about artifact names
      
      * fix for yolos
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
      19420fd9
  22. 11 Apr, 2022 1 commit
  23. 23 Mar, 2022 1 commit
    • Reorganize file utils (#16264) · 4975002d (sketch below)
      Sylvain Gugger authored
      * Split file_utils in several submodules
      
      * Fixes
      
      * Add back more objects
      
      * More fixes
      
      * Who exactly decided to import that from there?
      
      * Second suggestion to code with code review
      
      * Revert wrong move
      
      * Fix imports
      
      * Adapt all imports
      
      * Adapt all imports everywhere
      
      * Revert this import, will fix in a separate commit
      4975002d
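      A reading aid sketching the kind of import this reorganization affects: helpers that used to live in the catch-all `file_utils` module are exposed from `transformers.utils` submodules. The exact symbols shown are an assumption, picked as common examples.

      ```python
      # Old style (single catch-all module):
      #   from transformers.file_utils import is_torch_available
      # After the split, the same helpers are reachable via transformers.utils:
      from transformers.utils import is_tf_available, is_torch_available

      print("torch available:", is_torch_available(), "| tf available:", is_tf_available())
      ```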
  24. 03 Mar, 2022 1 commit
  25. 23 Feb, 2022 1 commit
  26. 18 Oct, 2021 1 commit
  27. 14 Oct, 2021 1 commit
  28. 21 Sep, 2021 1 commit
  29. 01 Sep, 2021 1 commit
  30. 21 Jul, 2021 1 commit
  31. 23 Apr, 2021 1 commit
    • Fix cross-attention head mask for Torch encoder-decoder models (#10605) · e3ff165a (sketch below)
      Daniel Stancl authored
      * Fix cross-attention head mask for Torch BART models
      
      * Fix head masking for cross-attention module for the following
      models: BART, Blenderbot, Blenderbot_small, M2M_100, Marian, MBart,
      Pegasus
      
      * Enable test_headmasking for M2M_100 model
      
      * Fix cross_head_mask for FSMT, LED and T5
      
      * This commit fixes `head_mask` for cross-attention modules
      in the following models: FSMT, LED, T5
      
      * It also contains some smaller changes in doc so that
      it is perfectly clear that the shape of `cross_head_mask`
      is the same as that of `decoder_head_mask`
      
      * Update template
      
      * Fix template for BartForCausalLM
      
      * Fix cross_head_mask for Speech2Text models
      
      * Fix cross_head_mask in templates
      
      * Fix args order in BartForCausalLM template
      
      * Fix doc in BART templates
      
      * Make more explicit naming
      
      * `cross_head_mask` -> `cross_attn_head_mask`
      
      * `cross_layer_head_mask` -> `cross_attn_layer_head_mask`
      
      * Fix doc
      
      * make style quality
      
      * Fix speech2text docstring
      e3ff165a
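      A hedged sketch of the renamed argument on a PyTorch encoder-decoder model; the tiny random config is purely illustrative and the mask shape follows the usual (num_decoder_layers, num_heads) head-mask convention.

      ```python
      import torch
      from transformers import BartConfig, BartModel

      config = BartConfig(
          d_model=32, encoder_layers=2, decoder_layers=2,
          encoder_attention_heads=4, decoder_attention_heads=4,
          encoder_ffn_dim=64, decoder_ffn_dim=64, vocab_size=100,
      )
      model = BartModel(config)

      input_ids = torch.randint(0, 100, (1, 6))
      # One row per decoder layer, one entry per head; zeros disable a head.
      cross_attn_head_mask = torch.ones(config.decoder_layers, config.decoder_attention_heads)
      cross_attn_head_mask[0, 0] = 0.0  # mask the first cross-attention head of layer 0

      outputs = model(
          input_ids=input_ids,
          decoder_input_ids=input_ids,
          cross_attn_head_mask=cross_attn_head_mask,  # renamed from cross_head_mask in this PR
      )
      print(outputs.last_hidden_state.shape)
      ```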
  32. 10 Mar, 2021 1 commit
    • Speech2TextTransformer (#10175) · d26b37e7 (sketch below)
      Suraj Patil authored
      
      
      * s2t
      
      * fix config
      
      * conversion script
      
      * fix import
      
      * add tokenizer
      
      * fix tok init
      
      * fix tokenizer
      
      * first version working
      
      * fix embeds
      
      * fix lm head
      
      * remove extra heads
      
      * fix convert script
      
      * handle encoder attn mask
      
      * style
      
      * better enc attn mask
      
      * override _prepare_attention_mask_for_generation
      
      * handle attn_masks in encoder and decoder
      
      * input_ids => input_features
      
      * enable use_cache
      
      * remove old code
      
      * expand embeddings if needed
      
      * remove logits bias
      
      * masked_lm_loss => loss
      
      * hack tokenizer to support feature processing
      
      * fix model_input_names
      
      * style
      
      * fix error message
      
      * doc
      
      * remove inputs_embeds
      
      * remove input_embeds
      
      * remove unnecessary docstring
      
      * quality
      
      * SpeechToText => Speech2Text
      
      * style
      
      * remove shared_embeds
      
      * subsample => conv
      
      * remove Speech2TextTransformerDecoderWrapper
      
      * update output_lengths formula
      
      * fix table
      
      * remove max_position_embeddings
      
      * update conversion scripts
      
      * add possibility to do upper case for now
      
      * add FeatureExtractor and Processor
      
      * add tests for extractor
      
      * require_torch_audio => require_torchaudio
      
      * add processor test
      
      * update import
      
      * remove classification head
      
      * attention mask is now 1D
      
      * update docstrings
      
      * attention mask should be of type long
      
      * handle attention mask from generate
      
      * always return attention_mask
      
      * fix test
      
      * style
      
      * doc
      
      * Speech2TextTransformer => Speech2Text
      
      * Speech2TextTransformerConfig => Speech2TextConfig
      
      * remove dummy_inputs
      
      * nit
      
      * style
      
      * multilingual tok
      
      * fix tokenizer
      
      * add tgt_lang setter
      
      * save lang_codes
      
      * fix tokenizer
      
      * add forced_bos_token_id to tokenizer
      
      * apply review suggestions
      
      * add torchaudio to extra deps
      
      * add speech deps to CI
      
      * fix dep
      
      * add libsndfile to ci
      
      * libsndfile1
      
      * add speech to extras all
      
      * libsndfile1 -> libsndfile1
      
      * libsndfile
      
      * libsndfile1-dev
      
      * apt update
      
      * add sudo to install
      
      * update deps table
      
      * install libsndfile1-dev on CI
      
      * tuple to list
      
      * init conv layer
      
      * add model tests
      
      * quality
      
      * add integration tests
      
      * skip_special_tokens
      
      * add speech_to_text_transformer in toctree
      
      * fix tokenizer
      
      * fix fp16 tests
      
      * add tokenizer tests
      
      * fix copyright
      
      * input_values => input_features
      
      * doc
      
      * add model in readme
      
      * doc
      
      * change checkpoint names
      
      * fix copyright
      
      * fix code example
      
      * add max_model_input_sizes in tokenizer
      
      * fix integration tests
      
      * add do_lower_case to tokenizer
      
      * remove clamp trick
      
      * fix "Add modeling imports here"
      
      * fix copyrights
      
      * fix tests
      
      * SpeechToTextTransformer => SpeechToText
      
      * fix naming
      
      * fix table formatting
      
      * fix typo
      
      * style
      
      * fix typos
      
      * remove speech dep from extras[testing]
      
      * fix copies
      
      * rename doc file,
      
      * put imports under is_torch_available
      
      * run feat extract tests when torch is available
      
      * dummy objects for processor and extractor
      
      * fix imports in tests
      
      * fix import in modeling test
      
      * fix imports
      
      * fix torch import
      
      * fix imports again
      
      * fix positional embeddings
      
      * fix typo in import
      
      * adapt new extractor refactor
      
      * style
      
      * fix torchscript test
      
      * doc
      
      * doc
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * fix docs, copied from, style
      
      * fix docstring
      
      * handle imports
      
      * remove speech from all extra deps
      
      * remove s2t from seq2seq lm mapping
      
      * better names
      
      * skip training tests
      
      * add install instructions
      
      * List => Tuple
      
      * doc
      
      * fix conversion script
      
      * fix urls
      
      * add instruction for libsndfile
      
      * fix fp16 test
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      d26b37e7
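      A hedged usage sketch of the Speech2Text model added here; zeros stand in for a real waveform (real inputs need the torchaudio-backed feature extractor) and the checkpoint name should be treated as a placeholder.

      ```python
      import numpy as np
      import torch
      from transformers import Speech2TextForConditionalGeneration, Speech2TextProcessor

      processor = Speech2TextProcessor.from_pretrained("facebook/s2t-small-librispeech-asr")
      model = Speech2TextForConditionalGeneration.from_pretrained("facebook/s2t-small-librispeech-asr")

      audio = np.zeros(16000, dtype=np.float32)  # one second of silence at 16 kHz
      inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

      with torch.no_grad():
          generated_ids = model.generate(
              inputs.input_features, attention_mask=inputs.attention_mask, max_new_tokens=20
          )
      print(processor.batch_decode(generated_ids, skip_special_tokens=True))
      ```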