"ollama/llm/ext_server/server.cpp" did not exist on "22cb4ffc026b1fb71549031f174dc92f3751db56"
  1. 05 Jul, 2024 1 commit
  2. 14 Dec, 2023 1 commit
    • Proper build() methods for TF (#27794) · 050e0b44
      Matt authored
      * Add a convenience method for building in your own name scope
      
      * Second attempt at auto layer building
      
      * Revert "Second attempt at auto layer building"
      
      This reverts commit e03a3aaecf9ec41a805582b83cbdfe3290a631be.
      
      * Attempt #3
      
      * Revert "Attempt #3"
      
      This reverts commit b9df7a0857560d29b5abbed6127d9e9eca77cf47.
      
      * Add missing attributes that we're going to need later
      
      * Add some attributes we're going to need later
      
      * A fourth attempt! Feel the power flow through you!
      
      * Revert "A fourth attempt! Feel the power flow through you!"
      
      This reverts commit 6bf4aaf3875d6f28485f50187617a4c616c8aff7.
      
      * Add more values we'll need later
      
      * TF refactor that we'll need later
      
      * Revert "TF refactor that we'll need later"
      
      This reverts commit ca07202fb5b7b7436b893baa8d688b4f348ea7b9.
      
      * Revert "Revert "TF refactor that we'll need later""
      
      This reverts commit 1beb0f39f293ed9c27594575e1c849aadeb15c13.
      
      * make fixup
      
      * Attempt five!
      
      * Revert "Attempt five!"
      
      This reverts commit 3302207958dfd0374b0447a51c06eea51a506044.
      
      * Attempt six - this time don't add empty methods
      
      * Revert "Attempt six - this time don't add empty methods"
      
      This reverts commit 67d60129be75416b6beb8f47c7d38d77b18d79bb.
      
      * Attempt seven - better base model class detection!
      
      * Revert "Attempt seven - better base model class detection!"
      
      This reverts commit 5f14845e92ea0e87c598da933bfbfee10f553bc9.
      
      * Another attribute we'll need later
      
      * Try again with the missing attribute!
      
      * Revert "Try again with the missing attribute!"
      
      This reverts commit 760c6f30c5dffb3e04b0e73c34a77d1882a0fef7.
      
      * This is the attempt that will pierce the heavens!
      
      * Revert "This is the attempt that will pierce the heavens!"
      
      This reverts commit c868bb657de057aca7a5260350a3f831fc4dfee6.
      
      * Attempt seven - snag list is steadily decreasing
      
      * Revert "Attempt seven - snag list is steadily decreasing"
      
      This reverts commit 46fbd975deda64429bfb3e5fac4fc0370c00d316.
      
      * Attempt eight - will an empty snag list do it?
      
      * Revert "Attempt eight - will an empty snag list do it?"
      
      This reverts commit 7c8a3c2b083253649569e9877e02054ae5cec67b.
      
      * Fixes to Hubert issues that cause problems later
      
      * Trying again with Conv1D/SeparableConv fixes
      
      * Revert "Trying again with Conv1D/SeparableConv fixes"
      
      This reverts commit 55092bca952bc0f750aa1ffe246a640bf1e2036e.
      
      * Apply the build shape fixes to Wav2Vec2 as well
      
      * One more attempt!
      
      * Revert "One more attempt!"
      
      This reverts commit 5ac3e4cb01b9458cc93312873725f9444ae7261c.
      
      * Another attempt!
      
      * Revert "Another attempt!"
      
      This reverts commit ea16d890e019d7de8792a3b8e72f3b1c02adae50.
      
      * Let's see how many failures we get without the internal build method
      
      * Fix OpenAI
      
      * Fix MobileBERT
      
      * (Mostly) fix GroupVIT
      
      * Fix BLIP
      
      * One more BLIP fix
      
      * One more BLIP fix!
      
      * Fix Regnet
      
      * Finally fully fix GroupViT
      
      * Fix Data2Vec and add the new AdaptivePool
      
      * Fix Segformer
      
      * Fix Albert
      
      * Fix Deberta/DebertaV2
      
      * Fix XLM
      
      * Actually fix XLM
      
      * Fix Flaubert
      
      * Fix lxmert
      
      * Fix Resnet
      
      * Fix ConvBERT
      
      * Fix ESM
      
      * Fix Convnext / ConvnextV2
      
      * Fix SAM
      
      * Fix Efficientformer
      
      * Fix LayoutLMv3
      
      * Fix speech_to_text
      
      * Fix mpnet and mobilevit
      
      * Fix Swin
      
      * Fix CTRL
      
      * Fix CVT
      
      * Fix DPR
      
      * Fix Wav2Vec2
      
      * Fix T5
      
      * Fix Hubert
      
      * Fix GPT2
      
      * Fix Whisper
      
      * Fix DeiT
      
      * Fix the encoder-decoder / dual-encoder classes
      
      * make fix-copies
      
      * build in name scope
      
      * Fix summarization test
      
      * Fix tied weight names for BART + Blenderbot
      
      * Fix tied weight name building
      
      * Fix to TFESM weight building
      
      * Update TF SAM
      
      * Expand all the shapes out into Big Boy Shapes
      050e0b44
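      The "build in name scope" thread running through the commits above maps onto a
      standard Keras pattern: give each layer an explicit build(input_shape) that
      creates its sublayers' weights inside a tf.name_scope, so shapes are fixed up
      front instead of inferred on the first call. A minimal sketch of that pattern
      (the layer and names here are hypothetical, not transformers code):

      ```python
      import tensorflow as tf

      class TinyBlock(tf.keras.layers.Layer):
          """Toy layer using the explicit-build pattern sketched above."""

          def __init__(self, hidden_size, **kwargs):
              super().__init__(**kwargs)
              self.dense = tf.keras.layers.Dense(hidden_size, name="dense")

          def build(self, input_shape):
              if self.built:
                  return
              self.built = True
              # Building the sublayer inside its own name scope keeps weight names
              # stable even when build() is invoked manually rather than via call().
              with tf.name_scope(self.dense.name):
                  self.dense.build(input_shape)

      layer = TinyBlock(8)
      layer.build((None, 4))  # weights exist before any forward pass
      print([w.name for w in layer.weights])
      ```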
  3. 31 Oct, 2023 1 commit
  4. 02 Mar, 2023 1 commit
  5. 28 Feb, 2023 1 commit
    • 🔥Rework pipeline testing by removing `PipelineTestCaseMeta` 🚀 (#21516) · 871c31a6
      Yih-Dar authored
      * Add PipelineTesterMixin
      
      * remove class PipelineTestCaseMeta
      
      * move validate_test_components
      
      * Add for ViT
      
      * Add to SPECIAL_MODULE_TO_TEST_MAP
      
      * style and quality
      
      * Add feature-extraction
      
      * update
      
      * raise instead of skip
      
      * add tiny_model_summary.json
      
      * more explicit
      
      * skip tasks not in mapping
      
      * add availability check
      
      * Add Copyright
      
      * A way to disable irrelevant tests
      
      * update with main
      
      * remove disable_irrelevant_tests
      
      * skip tests
      
      * better skip message
      
      * better skip message
      
      * Add all pipeline task tests
      
      * revert
      
      * Import PipelineTesterMixin
      
      * subclass test classes with PipelineTesterMixin
      
      * Add pipeline_model_mapping
      
      * Fix import after adding pipeline_model_mapping
      
      * Fix style and quality after adding pipeline_model_mapping
      
      * Fix one more import after adding pipeline_model_mapping
      
      * Fix style and quality after adding pipeline_model_mapping
      
      * Fix test issues
      
      * Fix import requirements
      
      * Fix mapping for MobileViTModelTest
      
      * Update
      
      * Better skip message
      
      * pipeline_model_mapping could not be None
      
      * Remove some PipelineTesterMixin
      
      * Fix typo
      
      * revert tests_fetcher.py
      
      * update
      
      * rename
      
      * revert
      
      * Remove PipelineTestCaseMeta from ZeroShotAudioClassificationPipelineTests
      
      * style and quality
      
      * test fetcher for all pipeline/model tests
      
      ---------
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
      871c31a6
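      For context, the move from the PipelineTestCaseMeta metaclass to a mixin with a
      per-class pipeline_model_mapping can be sketched roughly as follows; only the
      names come from the commit titles above, the bodies are illustrative:

      ```python
      import unittest

      class PipelineTesterMixin:
          # Each model test class declares which pipeline tasks it serves,
          # replacing the implicit test generation the old metaclass performed.
          pipeline_model_mapping = None

          def run_pipeline_test(self, task):
              if not self.pipeline_model_mapping or task not in self.pipeline_model_mapping:
                  self.skipTest(f"no model mapped for pipeline task {task!r}")
              model_class = self.pipeline_model_mapping[task]
              self.assertTrue(callable(model_class))  # stand-in for the real checks

      class ViTModelTest(PipelineTesterMixin, unittest.TestCase):
          pipeline_model_mapping = {"image-classification": dict}  # placeholder class

          def test_pipeline_image_classification(self):
              self.run_pipeline_test("image-classification")
      ```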
  6. 30 Jan, 2023 1 commit
  7. 25 Jan, 2023 1 commit
  8. 13 Dec, 2022 1 commit
  9. 07 Dec, 2022 1 commit
  10. 15 Nov, 2022 1 commit
    • Add Switch transformers (#19323) · 163ac3d3
      Younes Belkada authored
      * first commit
      
      * add more comments
      
      * add router v1
      
      * clean up
      
      - remove `tf` modeling files
      
      * clean up
      
      - remove `tf` modeling files
      
      * clean up
      
      * v0 routers
      
      * added more router
      
      - Implemented `ExpertsChooseMaskedRouter`
      
      - added tests
      - 2 more routers to implement
      
      * last router
      
      * improved docstring
      
      - completed the docstring in `router.py`
      - added more args in the config
      
      * v0 sparse mlp
      
      * replace wrong naming
      
      * forward pass run
      
      * update MOE layer
      
      * small router update
      
      * fixup
      
      * consistency
      
      * remove scatter router
      
      * remove abstract layer
      
      * update test and model for integration testing
      
      * v1 conversion
      
      * update
      
      * hardcode hack
      
      * all keys match
      
      * add gin conversion, without additional libraries
      
      * update conversion script
      
      * delete router file
      
      * update tests wrt router deletion
      
      * fix router issues
      
      * update expert code
      
      * update, logits match, code needs refactoring
      
      * Refactor code
      Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
      
      * add generate tests
      Co-authored-by: younesbelkada <younesbelkada@gmail.com>
      
      * add support for router loss
      Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
      
      * fix forward error
      
      * refactor a bit
      
      * remove `FlaxSwitchTransformers` modules
      
      * more tests pass
      
      * Update code
      Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
      
      * fixup
      
      * fix tests
      
      * fix doc
      
      * fix doc + tokenization
      
      * fix tokenizer test
      
      * fix test
      
      * fix loss output
      
      * update code for backward pass
      
      * add loss support
      
      * update documentation
      
      * fix documentation, clean tokenizer
      
      * more doc fix, cleanup example_switch
      
      * fix failing test
      
      * fix test
      
      * fix test
      
      * fix loss issue
      
      * move layer
      
      * update doc and fix router capacity usage
      
      * fixup
      
      * add sparse mlp index for documentation on hub
      
      * fixup
      
      * test sparse mix architecture
      
      * Apply suggestions from code review
      
      * Update docs/source/en/model_doc/switch_transformers.mdx
      
      * fixup on update
      
      * fix tests
      
      * fix another test
      
      * attempt fix
      
      * Update src/transformers/models/switch_transformers/configuration_switch_transformers.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * Update src/transformers/models/switch_transformers/convert_switch_transformers_original_flax_checkpoint_to_pytorch.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * try
      
      * all tests pass
      
      * fix jitter noise
      
      * Apply suggestions from code review
      
      * doc tests pass
      
      * Update src/transformers/models/switch_transformers/modeling_switch_transformers.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * Update src/transformers/models/switch_transformers/modeling_switch_transformers.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * remove assert
      
      * change config order
      
      * fix readme japanese
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * remove parallelizable tests + add one liners
      
      * remove ONNX config
      
      * fix nits
      
      - add `T5Tokenizer` in auto mapping
      - remove `Switch Transformers` from ONNX supported models
      
      * remove `_get_router`
      
      * remove asserts
      
      * add check in test for `router_dtype`
      
      * add `SwitchTransformersConfig` in `run_pipeline_test`
      
      * Update tests/pipelines/test_pipelines_summarization.py
      
      * add huge model conversion script
      
      * fix slow tests
      
      - add better casting for `Linear8bitLt`
      - remove `torchscript` tests
      
      * add make dir
      
      * style on new script
      
      * fix nits
      
      - doctest
      - remove `_keys_to_ignore_on_load_unexpected`
      
      * Update src/transformers/models/switch_transformers/configuration_switch_transformers.py
      
      * add google as authors
      
      * fix year
      
      * remove last `assert` statements
      
      * standardize vertical spaces
      
      * fix failing import
      
      * fix another failing test
      
      * Remove strange `àuthorized_keys`
      
      * removing todo and padding that is never used
      Co-authored-by: Arthur Zucker <arthur.zucker@gmail.com>
      Co-authored-by: ybelkada <younes@huggingface.co>
      Co-authored-by: Younes Belkada <younesbelkada@users.noreply.github.com>
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Arthur Zucker <arthur@huggingface.co>
      163ac3d3
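      For orientation, the router these commits iterate on is a top-1 ("switch")
      router: a linear classifier over the hidden states picks one expert per
      token, with multiplicative jitter noise during training (compare the "fix
      jitter noise" commit). A hedged PyTorch sketch, not the actual
      modeling_switch_transformers.py code:

      ```python
      import torch
      import torch.nn as nn

      class Top1Router(nn.Module):
          def __init__(self, hidden_size, num_experts, jitter_noise=1e-2):
              super().__init__()
              self.classifier = nn.Linear(hidden_size, num_experts, bias=False)
              self.jitter_noise = jitter_noise

          def forward(self, hidden_states):
              if self.training and self.jitter_noise > 0:
                  # Multiplicative jitter regularizes the expert assignment.
                  noise = torch.empty_like(hidden_states).uniform_(
                      1.0 - self.jitter_noise, 1.0 + self.jitter_noise
                  )
                  hidden_states = hidden_states * noise
              router_logits = self.classifier(hidden_states)  # (batch, seq, experts)
              router_probs = torch.softmax(router_logits, dim=-1)
              expert_index = router_probs.argmax(dim=-1)      # top-1 expert per token
              return expert_index, router_probs, router_logits

      router = Top1Router(hidden_size=16, num_experts=4)
      expert_index, probs, logits = router(torch.randn(2, 5, 16))
      ```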
  11. 07 Oct, 2022 1 commit
    • Rework pipeline tests (#19366) · 9ac586b3
      Sylvain Gugger authored
      * Rework pipeline tests
      
      * Try to fix Flax tests
      
      * Try to put it before
      
      * Use a new decorator instead
      
      * Remove ignore marker since it doesn't work
      
      * Filter pipeline tests
      
      * Woopsie
      
      * Use the filtered list
      
      * Clean up and fake modif
      
      * Remove init
      
      * Revert fake modif
      9ac586b3
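      The "new decorator" mentioned above marks pipeline tests so the runner can
      filter them. One way such a decorator can work, sketched with a hypothetical
      environment switch (the actual mechanics in the test suite may differ):

      ```python
      import os
      import unittest

      def is_pipeline_test(test_case):
          """Skip the decorated test class unless pipeline tests are enabled."""
          if os.environ.get("RUN_PIPELINE_TESTS", "0") != "1":
              return unittest.skip("pipeline tests are disabled")(test_case)
          return test_case

      @is_pipeline_test
      class ExamplePipelineTests(unittest.TestCase):
          def test_small_model(self):
              self.assertTrue(True)
      ```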
  12. 13 Jun, 2022 1 commit
    • Add `LongT5` model (#16792) · a72f1c9f
      Daniel Stancl authored
      * Initial commit
      
      * Make some fixes
      
      * Make PT model full forward pass
      
      * Drop TF & Flax implementation, fix copies etc
      
      * Add Flax model and update some corresponding stuff
      
      * Drop some TF things
      
      * Update config and flax local attn
      
      * Add encoder_attention_type to config
      
      * .
      
      * Update docs
      
      * Do some cleansing
      
      * Fix some issues -> make style; add some docs
      
      * Fix position_bias + mask addition + Update tests
      
      * Fix repo consistency
      
      * Fix model consistency by removing flax operation over attn_mask
      
      * [WIP] Add PT TGlobal LongT5
      
      * .
      
      * [WIP] Add flax tglobal model
      
      * [WIP] Update flax model to use the right attention type in the encoder
      
      * Fix flax tglobal model forward pass
      
      * Make use of global_relative_attention_bias
      
      * Add test suites for TGlobal model
      
      * Fix minor bugs, clean code
      
      * Fix pt-flax equivalence though not convinced with correctness
      
      * Fix LocalAttn implementation to match the original impl. + update READMEs
      
      * Few updates
      
      * Update: [Flax] improve large model init and loading #16148
      
      * Add ckpt conversion script according to #16853 + handle torch device placement
      
      * Minor updates to conversion script.
      
      * Typo: AutoModelForSeq2SeqLM -> FlaxAutoModelForSeq2SeqLM
      
      * gpu support + dtype fix
      
      * Apply some suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      
      * * Remove (de)parallelize stuff
      * Edit shape comments
      * Update README.md
      * make fix-copies
      
      * Remove caching logic for local & tglobal attention
      
      * Apply another batch of suggestions from code review
      
      * Add missing checkpoints
      * Format converting scripts
      * Drop (de)parallelize links from longT5 mdx
      
      * Fix converting script + revert config file change
      
      * Revert "Remove caching logic for local & tglobal attention"
      
      This reverts commit 2a619828f6ddc3e65bd9bb1725a12b77fa883a46.
      
      * Stash caching logic in Flax model
      
      * Make side relative bias used always
      
      * Drop caching logic in PT model
      
      * Return side bias as it was
      
      * Drop all remaining model parallel logic
      
      * Remove clamp statements
      
      * Move test files to the proper place
      
      * Update docs with new version of hf-doc-builder
      
      * Fix test imports
      
      * Make some minor improvements
      
      * Add missing checkpoints to docs
      * Make TGlobal model compatible with torch.onnx.export
      * Replace some np.ndarray with jnp.ndarray
      
      * Fix TGlobal for ONNX conversion + update docs
      
      * fix _make_global_fixed_block_ids and masked neg value
      
      * update flax model
      
      * style and quality
      
      * fix imports
      
      * remove load_tf_weights_in_longt5 from init and fix copies
      
      * add slow test for TGlobal model
      
      * typo fix
      
      * Drop obsolete is_parallelizable and one warning
      
      * Update __init__ files to fix repo-consistency
      
      * fix pipeline test
      
      * Fix some device placements
      
      * [wip]: Update tests -- need to generate summaries to update expected_summary
      
      * Fix quality
      
      * Update LongT5 model card
      
      * Update (slow) summarization tests
      
      * make style
      
      * rename checkpoints
      
      * finish
      
      * fix flax tests
      Co-authored-by: phungvanduy <pvduy23@gmail.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
      Co-authored-by: patil-suraj <surajp815@gmail.com>
      a72f1c9f
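      The local attention referenced throughout this entry rests on grouping tokens
      into fixed-length blocks, which is what the `_make_global_fixed_block_ids`
      commit hints at. A shape-level sketch of the idea (illustrative, not the
      LongT5 implementation):

      ```python
      import numpy as np

      def make_fixed_block_ids(seq_len, block_len):
          # Token i belongs to block i // block_len.
          return np.arange(seq_len) // block_len

      ids = make_fixed_block_ids(seq_len=10, block_len=4)
      print(ids)  # [0 0 0 0 1 1 1 1 2 2]

      # Two tokens may attend to each other only if they share a block.
      local_mask = ids[:, None] == ids[None, :]
      ```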
  13. 12 May, 2022 1 commit
  14. 23 Feb, 2022 1 commit
  15. 29 Oct, 2021 1 commit
  16. 26 Aug, 2021 1 commit
  17. 14 Jun, 2021 1 commit
  18. 18 May, 2021 1 commit
  19. 05 Mar, 2021 1 commit
  20. 10 Feb, 2021 1 commit
    • remove adjust_logits_during_generation method (#10087) · c130e67d
      Suraj Patil authored
      * add forced logits processors
      
      * delete adjust_logits method
      
      * add forced_eos_token_id argument in config
      
      * add tests for forced logits processors
      
      * update gen utils tests
      
      * add forced option to tf generate
      
      * remove adjust_logits method from tf models
      
      * update adjust_logits for marian
      
      * delete _force_token_id_to_be_generated method
      
      * style
      
      * import warnings
      
      * pass max_length to _get_logits_processor
      
      * set forced_eos_token_id to None
      
      * set forced attributes in conf utils
      
      * typo
      
      * fix rag generate
      
      * add forced_eos_token_id in rag config
      
      * remove force_bos_token_to_be_generated from BartConfig
      
      * remove _force_token_ids_generation from FSMT
      
      * nit
      
      * fix negative constant
      
      * apply suggestions from code review
      c130e67d
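      The forced logits processors that replace adjust_logits_during_generation work
      by masking the final step's scores so only the forced token can be sampled. A
      hedged re-implementation of the idea, not the transformers class itself:

      ```python
      import torch

      class ForcedEOSProcessor:
          """Force eos_token_id to be generated at the last position."""

          def __init__(self, max_length, eos_token_id):
              self.max_length = max_length
              self.eos_token_id = eos_token_id

          def __call__(self, input_ids, scores):
              if input_ids.shape[-1] == self.max_length - 1:
                  scores = torch.full_like(scores, float("-inf"))
                  scores[:, self.eos_token_id] = 0.0
              return scores

      proc = ForcedEOSProcessor(max_length=5, eos_token_id=2)
      scores = proc(torch.ones(1, 4, dtype=torch.long), torch.randn(1, 10))
      print(scores.argmax(dim=-1))  # tensor([2]) -- EOS is forced
      ```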
  21. 11 Jan, 2021 1 commit
    • Enable TruncationStrategy override for pipelines (#9432) · d20e9c72
      Nicolas Patry authored
      * Enable TruncationStrategy override for pipelines
      
      * Update isort.
      
      * Fixing test
      
      * Fixing text_generation pipeline.
      
      * Using same DummyTok as other PR for easier merge later.
      
      * Some more import guards.
      
      * Remove bogus file.
      
      * Do not pass `generate_kwargs` to `_parse_and_tokenize`.
      @patrickvonplaten
      
      * Removed DummyTok.
      
      * Doc quality.
      d20e9c72
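      The override added here lets callers pass a truncation strategy down through
      _parse_and_tokenize instead of the pipeline hard-coding one. A generic,
      self-contained sketch of that plumbing (the pipeline and tokenizer are
      stand-ins, not the transformers API):

      ```python
      from enum import Enum

      class TruncationStrategy(Enum):
          DO_NOT_TRUNCATE = "do_not_truncate"
          LONGEST_FIRST = "longest_first"

      class ToyPipeline:
          def __init__(self, tokenizer):
              self.tokenizer = tokenizer

          def _parse_and_tokenize(self, inputs, truncation=TruncationStrategy.DO_NOT_TRUNCATE):
              # The strategy flows through to the tokenizer call, so each
              # invocation can override the pipeline's default.
              return self.tokenizer(inputs, truncation=truncation.value)

      def toy_tokenizer(text, truncation):
          tokens = text.split()
          return tokens[:3] if truncation == "longest_first" else tokens

      pipe = ToyPipeline(toy_tokenizer)
      print(pipe._parse_and_tokenize("a b c d e", truncation=TruncationStrategy.LONGEST_FIRST))
      ```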
  22. 07 Dec, 2020 1 commit
  23. 23 Oct, 2020 1 commit