1. 31 Jan, 2023 1 commit
  2. 30 Jan, 2023 1 commit
  3. 24 Jan, 2023 1 commit
  4. 23 Jan, 2023 1 commit
  5. 20 Jan, 2023 1 commit
  6. 19 Jan, 2023 1 commit
    • Add hallucination filter (#18675) · b9403e95
      Karim Foda authored
      
      
      * Add hallucination penalty
      
      * Make quality changes
      
      * Inverse penalty
      
      * Fix imports & quality
      
      * Fix name spelling issue
      
      * set encoder_repetition_penalty and fix quality
      
      * Fix failing test
      
      * Add to config_common_kwargs
      
      * Fix modeling_rag error
      
      * Update src/transformers/generation_logits_process.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Remove breakpoint
      
      * Make style fixes
      
      * Update encoder_repetition_penalty default value
      
      * Merge latest main changes
      
      * Make fixup changes
      
      * Add EncoderRepetitionPenaltyLogitsProcessor to generation/__init__.py
      
      * Fix repo-inconsistency
      
      * Remove venv
      
      * Remove tensorflow-macos & add tests
      
      * Add documentation
      
      * Fix quality issues
      
      * move encoder_repetition_penalty to config
      
      * Update src/transformers/configuration_utils.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/generation/configuration_utils.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Remove encoder_repetition_penalty from tests
      
      * Fix type error
      
      * Fix format error
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      b9403e95
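
      A minimal usage sketch for the encoder_repetition_penalty argument this commit exposes through .generate() (my illustration, not code from the PR; the BART checkpoint and example text are assumptions). Values above 1.0 bias generation back toward tokens that appear in the encoder input, which is the hallucination-filtering idea described above:

      ```python
      # Hedged sketch: checkpoint and prompt are illustrative choices.
      from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
      model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")

      inputs = tokenizer(
          "The Eiffel Tower is 324 metres tall and was completed in 1889.",
          return_tensors="pt",
      )
      summary_ids = model.generate(
          **inputs,
          encoder_repetition_penalty=2.0,  # > 1.0 penalises tokens absent from the input
          max_new_tokens=30,
      )
      print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
      ```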
  7. 17 Jan, 2023 1 commit
    • Add Epsilon- and Eta-Sampling (#21121) · 865da84a
      Sherman Siu authored
      * Add epsilon- and eta-sampling.
      
      Add epsilon- and eta-sampling, following the official code from https://github.com/john-hewitt/truncation-sampling and adapting it to be more configurable, as required by Hugging Face Transformers.
      
      * Add unit tests for epsilon- and eta-sampling.
      
      * Black: fix code formatting.
      
      * Fix docstring spacing.
      
      * Clean up newlines.
      
      * Fix implementation bugs and their associated tests.
      
      * Remove epsilon- and eta-sampling parameters from PretrainedConfig.
      
      * Clarify and clean up the documentation.
      
      * Remove parameters for PretrainedConfig test.
      865da84a
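
      A hedged sketch of the epsilon-/eta-sampling options this commit adds, exposed as the epsilon_cutoff and eta_cutoff arguments of .generate() (the GPT-2 checkpoint, prompt, and cutoff values are my assumptions; the paper suggests small values roughly in the 3e-4 to 2e-3 range):

      ```python
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      inputs = tokenizer("Truncation sampling helps because", return_tensors="pt")
      out = model.generate(
          **inputs,
          do_sample=True,       # both cutoffs only apply when sampling
          epsilon_cutoff=3e-4,  # drop tokens whose probability falls below epsilon
          # eta_cutoff=2e-3,    # or use the entropy-adaptive eta rule instead
          max_new_tokens=20,
      )
      print(tokenizer.decode(out[0], skip_special_tokens=True))
      ```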
  8. 04 Jan, 2023 1 commit
  9. 03 Jan, 2023 2 commits
    • Add custom stop token ids for generation (#20727) · 45da7cec
      Motoki Wu authored
      * Add StopIdStoppingCriteria
      
      * add a working test for stop id criteria
      
      * add to global scope
      
      * add stop_ids to generate
      
      * add pipeline test
      
      * use tokenizer encode in test
      
      * add test to generation utils
      
      * reformat
      
      * fixup
      
      * make-fix-copies
      
      * rename to stop_token_id
      
      * use stop_tokens instead
      
      * add to text to text generation
      
      * make fixup
      
      * make repo-consistency
      
      * Add support for list of ints for eos_token_id inside generation/utils.py
      
      * Instead of having if elses, cast the eos_token_id into a List[int]
      
      * Add List[int] support for logits_process.py
      
      * add List[int] for beam_search.py
      
      * add List[int] for forced_eos_token_id
      
      * revert stop token id stopping criteria changes
      
      * make fixup
      
      * fix tests
      
      * add eos_token_id to generation/utils.py and add tests to test_utils.py
      
      * add eos_token_id type hints and fix for pad tokens
      
      * add comments
      
      * remove some prints and remove forced false test
      
      * fix
      
      * put back test_stop_sequence_stopping_criteria
      
      * remove unused import and make fixup
      
      * add a none check
      
      * update docstring
      
      * add more docstring for list ints
      
      * make fixup
      45da7cec
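
      After this change, eos_token_id may be a list of token ids, so generation stops on whichever appears first. A short sketch (the checkpoint, prompt, and the choice of a newline as an extra stop token are my own illustrative assumptions):

      ```python
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      inputs = tokenizer("Question: What is 2 + 2?\nAnswer:", return_tensors="pt")
      newline_id = tokenizer.encode("\n")[0]  # treat a newline as an extra stop token
      out = model.generate(
          **inputs,
          eos_token_id=[tokenizer.eos_token_id, newline_id],  # stop on either id
          max_new_tokens=30,
      )
      print(tokenizer.decode(out[0], skip_special_tokens=True))
      ```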
    • `MinNewTokensLengthLogitsProcessor` for `.generate` method #20814 (#20892) · 367fdf33
      Konstantin Kotik authored
      
      
      * feat: add min new length logit processor
      
      * test: add min new length logit processor
      
      * docs: add MinNewTokensLengthLogitsProcessor
      
      * feat: import MinNewTokensLengthLogitsProcessor
      
      * fix: update pytorch dummy objects
      
      * refactor & fix: rename attributes and var and get rid of dynamic attribute
      
      * tests: align test with new interface
      
      * docs: fix typo
      
      * docs: minor clarification
      
      * Empty-Commit
      
      * empty commit
      
      * run automated quality edits
      Co-authored-by: Joao Gante <joao@huggingface.co>
      367fdf33
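
      A sketch of driving the new MinNewTokensLengthLogitsProcessor directly through a LogitsProcessorList (the GPT-2 checkpoint and prompt are assumptions): it masks the EOS token until at least min_new_tokens tokens have been generated beyond the prompt.

      ```python
      from transformers import (
          AutoModelForCausalLM,
          AutoTokenizer,
          LogitsProcessorList,
          MinNewTokensLengthLogitsProcessor,
      )

      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      inputs = tokenizer("Hello there.", return_tensors="pt")
      processors = LogitsProcessorList(
          [
              MinNewTokensLengthLogitsProcessor(
                  prompt_length_to_skip=inputs["input_ids"].shape[-1],
                  min_new_tokens=10,  # forbid EOS for the first 10 generated tokens
                  eos_token_id=model.config.eos_token_id,
              )
          ]
      )
      out = model.generate(**inputs, logits_processor=processors, max_new_tokens=30)
      print(tokenizer.decode(out[0], skip_special_tokens=True))
      ```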
  10. 20 Dec, 2022 1 commit
    • Fix tiny typo (#20841) · ae3cbbca
      fzyzcjy authored
      * Fix typo
      
      * Update README.md
      
      * Update run_mlm_flax_stream.py
      
      * Update README.md
      ae3cbbca
  11. 15 Dec, 2022 1 commit
  12. 21 Nov, 2022 2 commits
  13. 14 Nov, 2022 1 commit
  14. 09 Nov, 2022 1 commit
  15. 01 Nov, 2022 1 commit
  16. 21 Oct, 2022 1 commit
  17. 19 Oct, 2022 1 commit
    • Adding the state-of-the-art contrastive search decoding methods for the codebase of generation_utils.py (#19477) · 71786b10
      GMFTBY authored
      
      * add: the contrastive search for generation_utils
      
      * add: testing scripts for contrastive search under examples/text-generation
      
      * improve code quality

      * revise the docstring; create the generation_contrastive_search.py script;
      
      * revise the examples/pytorch/text-generation/run_generation_contrastive_search.py to the auto-APIs format
      
      * revise the necessary documents
      
      * fix: revise the docstring of generation_contrastive_search.py
      
      * Fix the code indentation
      
      * fix: revise the nits and examples in contrastive_search docstring.
      
      * fix the copyright
      
      * delete generation_contrastive_search.py
      
      * revise the logic in contrastive_search
      
      * update the integration test and the docstring
      
      * run the tests over
      
      * add the slow decorator to the contrastive_search integration test

      * add more tests
      
      * do the style, quality, consistency checks
      71786b10
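
      Contrastive search is switched on by combining penalty_alpha with a small top_k in .generate(); no sampling flag is needed. A minimal sketch (the checkpoint, prompt, and the 0.6/4 settings are illustrative assumptions, not taken from this commit):

      ```python
      from transformers import AutoModelForCausalLM, AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      inputs = tokenizer("DeepMind Company is", return_tensors="pt")
      out = model.generate(
          **inputs,
          penalty_alpha=0.6,  # weight of the degeneration penalty
          top_k=4,            # size of the candidate pool that gets re-ranked
          max_new_tokens=64,
      )
      print(tokenizer.decode(out[0], skip_special_tokens=True))
      ```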
  18. 10 Oct, 2022 1 commit
    • Add TF whisper (#19378) · e3f028f3
      amyeroberts authored
      
      
      * simplify loop
      
      * add feature extractor
      
      * add model
      
      * start conversion
      
      * add dropout
      
      * initial commit of test files
      
      * conversion for all models
      
      * update processor for correct padding
      
      * update feature extraction
      
      * update integration test logits match
      
      * fmt: off for the logits
      
      * on the fly mel bank
      
      * small nit
      
      * update test
      
      * update tokenizer
      
      * nit feature extraction
      
      * update
      
      * update tokenizer test
      
      * add logits processor and update tokenizer to get suppress tokens
      
      * style
      
      * clean convert
      
      * revert to original modeling tf utils
      
      * Update
      
      * update
      
      * nit
      
      * clean convert file
      
      * update tests and nits
      
      * quality
      
      * slow generation test
      
      * ffn_dim to allow customization
      
      * update readme
      
      * add to toctreee
      
      * start fixing integration tests
      
      * update tests and code
      
      * fix feature extractor
      
      * fix config tests common
      
      * update code to fix tests
      
      * fix feature exctractor
      
      * nit feature extraction
      
      * update test for new feature extractor
      
      * style
      
      * add abstract

      * large logits with custom decoder input ids

      * wrap around is_torch_available
      
      * fix feature extractor
      
      * correct logits for whisper small.en
      
      * nit
      
      * fix encoder_attention_mask
      
      * some fixes
      
      * remove unnecessary inputs
      
      * nits
      
      * add normalizer file
      
      * update test tokenization
      
      * fix attention mask not defined
      
      * fix generate
      
      * remove useless encoder attention mask
      
      * update test modeling whisper
      
      * update config to add second non-suppress tokens

      * nits on feature extractor
      
      * nit for test tokenizers
      
      * update tests
      
      * update tests
      
      * update tokenization test
      
      * fixup
      
      * invalidated hf token. Clean convert openai to whisper
      
      * fix logit tests
      
      * fixup
      
      * Add model to README
      
      * Fix doc tests
      
      * clean merge
      
      * revert toc_tree changes
      
      * remove useless LogitProcessor
      
      * Update whisper .mdx
      
      * update config file doc
      
      * update configuration docstring
      
      * update test tokenization
      
      * update test tokenization
      
      * update tokenization whisper
      Added copied from where needed
      
      * update feature extraction
      
      * nit test name
      
      * style
      
      * quality
      
      * remove get suppress tokens and update non_speech tokens global variables
      
      * Update src/transformers/models/whisper/feature_extraction_whisper.py
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * clean modeling whisper and test
      Removed the attention mask arguments that are deprecated
      
      * fix large test
      
      * Add multilingual audio test, and translate test
      
      * style
      
      * fix large multilingual test
      
      * nits
      
      * add copied from for attention layer
      
      * remove attention masks in doc
      
      * add english normalizer
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * update tokenization test
      
      * remove copied from in whisper attention : no bias in k_proj only
      
      * wrap around dependencies in english normalizer
      
      * style
      
      * correct import generation logits
      
      * for now, wrap feature extractor with torch
      
      * remove torch depencies for feature extraction and style
      
      * Update src/transformers/models/whisper/convert_openai_whisper_to_tfms.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update docs/source/en/model_doc/whisper.mdx
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * fixup
      
      * nit
      
      * update logits
      
      * style
      
      * nit
      
      * nits and fix final tests
      
      * add `is_more_itertools_available` to utils
      
      * quality
      
      * add begin suppress tokens, suppress tokens to generate args and config

      * clean SuppressTokensLogitsProcessor in generation logits
      
      * Nit naming
      
      * add supressTokensAtBegin
      
      * update tests, suppress tokens to None or correct values
      
      * nit and style
      
      * update RAG to fit test and generate_logit
      
      * add copy-pasted statement on English normalizer
      
      * add arguments to config_common_kwargs
      
      * Update src/transformers/generation_utils.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Update src/transformers/generation_logits_process.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * revert changes based on reviews
      
      * update doc and nits
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * more nits
      
      * last nits
      
      * update test configuration common
      
      * add BART name in decoder attention mask documentation
      
      * Update src/transformers/models/whisper/modeling_whisper.py
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      
      * style
      
      * nit
      
      * nit
      
      * add english.json file to git
      
      * nits on documentation
      
      * nit
      
      * nits
      
      * last styling
      
      * add main toctree file
      
      * remove sentence piece dependency
      
      * clean init file
      
      * fix tokenizer that has no dependencies on sentencepiece
      
      * update whisper init file, nit
      
      * remove english.json file
      
      * add get decoder prompt id
      
      * All weights loading
      
      * Remove hanging pdb
      
      * Fixup and tidy up
      
      * Use same copied from as PT model
      
      * Remove whitespace changes
      
      * Remove torch references
      
      * Tie embeddings
      
      * Remove logits processor input to generate
      
      * Update logit values
      
      * revert changes and add forced logit processor
      
      * nit
      
      * clean normalizer
      
      * remove protected
      
      * Add logit processors and update generation code & tests
      
      * Some tidy up
      
      * Update docstring
      
      * update
      
      * update based on review
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * Update src/transformers/models/whisper/configuration_whisper.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * Update to reflect changes on the PT model branch
      
      * Tidy up
      
      * Remove extra whitespace
      
      * Fix test - make input ids small enough we can append
      
      * Include upstream changes on main
      
      * PR comments - add batch tests, remove comments & defaults
      
      * Fix model output imports
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/generation_tf_logits_process.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update tests/models/whisper/test_modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update docstring example
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      
      * Remove changes to adjust_logits_during_generation function
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * Tidy up imports that don't require TF
      
      * Update tests - skip and no more skip
      
      * Update tests/generation/test_generation_tf_logits_process.py
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      
      * Update src/transformers/models/whisper/modeling_tf_whisper.py
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      
      * Add training flags
      
      * Add (skipped) XLA generation tests
      
      * Add embedding correctness test
      
      * Add constant ids for generation tests
      
      * Make logits finding a bit tidier
      
      * Remove unused args
      
      * xla generation enabled
      
      * Don't skip XLA tests anymore
      
      * Fix tests - add position ids to expected signature and update rag generation
      
      * Undo method reorder
      
      * Remove added whitespace
      
      * Remove copy-paste gradient checkpoint ref
      
      * Remove
      
      * Trigger CI - (issue with refs when pulling)
      Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: NielsRogge <niels.rogge1@gmail.com>
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      Co-authored-by: Matt <Rocketknight1@users.noreply.github.com>
      Co-authored-by: Joao Gante <joao@huggingface.co>
      e3f028f3
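
      A hedged sketch of transcribing a short clip with the TensorFlow port added here (the tiny.en checkpoint and the dummy LibriSpeech dataset are assumptions for illustration, not part of the commit):

      ```python
      from datasets import load_dataset
      from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

      processor = WhisperProcessor.from_pretrained("openai/whisper-tiny.en")
      model = TFWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny.en")

      ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
      inputs = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="tf")

      generated_ids = model.generate(input_features=inputs.input_features)
      print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
      ```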
  19. 30 Sep, 2022 1 commit
  20. 15 Sep, 2022 1 commit
  21. 05 Sep, 2022 1 commit
  22. 02 Sep, 2022 1 commit
  23. 19 Aug, 2022 1 commit
  24. 18 Aug, 2022 1 commit
  25. 12 Aug, 2022 1 commit
  26. 23 Jul, 2022 1 commit
  27. 28 Jun, 2022 1 commit
  28. 21 Jun, 2022 1 commit
  29. 10 Jun, 2022 1 commit
  30. 19 May, 2022 1 commit
  31. 12 May, 2022 1 commit
  32. 29 Apr, 2022 1 commit
  33. 25 Apr, 2022 2 commits
  34. 22 Apr, 2022 1 commit
  35. 13 Apr, 2022 1 commit
  36. 12 Apr, 2022 1 commit
  37. 11 Apr, 2022 1 commit