1. 18 Aug, 2023 3 commits
    • new model: IDEFICS via HuggingFaceM4 (#24796) · 6c811a32
      Stas Bekman authored
      
      
      * rename
      
      * restore
      
      * mappings
      
      * unedited tests+docs
      
      * docs
      
      * fixes
      
      * fix auto-sync breakage
      
      * cleanup
      
      * wip
      
      * wip
      
      * add fetch_images
      
      * remove einops dependency
      
      * update
      
      * fix
      
      * fix
      
      * fix
      
      * fix
      
      * fix
      
      * re-add
      
      * add batching
      
      * rework
      
      * fix
      
      * improve
      
      * add Leo as I am extending his work
      
      * cleanup
      
      * fix
      
      * cleanup
      
      * slow-test
      
      * fix
      
      * fix
      
      * fixes
      
      * deal with warning
      
      * rename modified llama classes
      
      * rework fetch_images
      
      * alternative implementation
      
      * cleanup
      
      * strict version
      
      * cleanup
      
      * [`IDEFICS`] Fix idefics ci (#25056)
      
      * Fix IDEFICS CI
      
      * fix test file
      
      * fixup
      
      * some changes to make tests pass
      
      * fix
      
      * fixup
      
      * Update src/transformers/models/idefics/configuration_idefics.py
      Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
      
      ---------
      Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
      
      * remove compat checks
      
      * style
      
      * explain that Idefics is not for training from scratch
      
      * require pt>=2.0
      
      * fix idefics vision config (#25092)
      
      * fix idefics vision config
      
      * fixup
      
      * clean
      
      * Update src/transformers/models/idefics/configuration_idefics.py
      
      ---------
      Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
      
      * cleanup
      
      * style
      
      * cleanup
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * upcase
      
      * sequence of images
      
      * handle the case with no images
      
      * Update src/transformers/image_processing_utils.py
      Co-authored-by: Victor SANH <victorsanh@gmail.com>
      
      * support pure lm take 2
      
      * support tokenizer options
      
      * parameterize num_channels
      
      * fix upcase
      
      * s|IdeficsForCausalLM|IdeficsForVisionText2Text|g
      
      * manual to one line
      
      * addressing review
      
      * unbreak
      
      * remove clip dependency
      
      * fix test
      
      * consistency
      
      * PIL import
      
      * Idefics prefix
      
      * Idefics prefix
      
      * hack to make tests work
      
      * style
      
      * fix
      
      * fix
      
      * revert
      
      * try/finally
      
      * cleanup
      
      * clean up
      
      * move
      
      * [`IDEFICS`] Fix idefics config refactor (#25149)
      
      * refactor config
      
      * nuke init weights
      
      * more refactor
      
      * oops
      
      * remove visual question answering pipeline support
      
      * Update src/transformers/models/idefics/clip.py
      Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
      
      * Update src/transformers/models/idefics/modeling_idefics.py
      
      * cleanup
      
      * mv clip.py vision.py
      
      * tidyup
      
      ---------
      Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
      Co-authored-by: Stas Bekman <stas@stason.org>
      
      * fix
      
      * license
      
      * condition on pt
      
      * fix
      
      * style
      
      * fix
      
      * rm torchvision dependency, allow custom transforms
      
      * address review
      
      * rework device arg
      
      * add_eos_token
      
      * s/transforms/transform/
      
      * fix top level imports
      
      * fix return value
      
      * cleanup
      
      * cleanup
      
      * fix
      
      * style
      
      * license
      
      * license
      
      * Update src/transformers/models/idefics/image_processing_idefics.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * add a wrapper to freeze vision layers
      
      * tidyup
      
      * use the correct std/mean settings
      
      * parameterize values from config
      
      * add tests/models/idefics/test_image_processing_idefics.py
      
      * add test_processor_idefics.py
      
      * cleanup
      
      * cleanups
      
      * fix
      
      * fix
      
      * move to the right group
      
      * style
      
      * Apply suggestions from code review
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * add perceiver config
      
      * reset
      
      * missing arg docs
      
      * Apply suggestions from code review
      Co-authored-by: Leo Tronchon <leo.tronchon@gmail.com>
      
      * address review comments
      
      * inject automatic end of utterance tokens (#25218)
      
      * inject automatic end of utterance tokens
      
      * fix
      
      * fix
      
      * fix
      
      * rework to not use the config
      
      * not end_of_utterance_token at the end
      
      * Update src/transformers/models/idefics/processing_idefics.py
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      
      * address review
      
      * Apply suggestions from code review
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      
      * Update src/transformers/image_processing_utils.py
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
      
      * [`Idefics`] add image_embeddings option in generate-related methods (#25442)
      
      * add image_embeddings option in generate-related methods
      
      * style
      
      * rename image_embeddings and allow perceiver embeddings precomputation
      
      * compute embeddings within generate
      
      * make is_encoder_decoder=True the default in config
      
      * nested if else fix
      
      * better triple check
      
      * switch if elif order for pixel values / img embeds
      
      * update model_kwargs perceiver only at the end
      
      * use _prepare_model_inputs instead of encoder_decoder logic
      
      * fix comment typo
      
      * fix config default for is_encoder_decoder
      
      * style
      
      * add typehints
      
      * precompute in forward
      
      * doc builder
      
      * style
      
      * pop instead of get image hidden states
      
      * Trigger CI
      
      * Update src/transformers/models/idefics/modeling_idefics.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * Update src/transformers/models/idefics/modeling_idefics.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * fix * + indentation + style
      
      * simplify a bit the use_resampler logic using comments
      
      * update docstrings
      
      * Trigger CI
      
      ---------
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * fix rebase changes
      
      * unbreak #25237 - to be fixed in follow up PRs
      
      * is_composition = False
      
      * no longer needed
      
      ---------
      Co-authored-by: leot13 <leo.tronchon@gmail.com>
      Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
      Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
      Co-authored-by: Victor SANH <victorsanh@gmail.com>
      Co-authored-by: Joao Gante <joaofranciscocardosogante@gmail.com>
      Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
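
      As a rough illustration of the IDEFICS support added in #24796 above, here is a minimal inference sketch using the classes the commit introduces. It is not taken from the diff: the checkpoint id, the interleaved prompt format, and image fetching via URL are assumptions inferred from the commit messages (e.g. "add fetch_images", "sequence of images").

      ```python
      # Hedged sketch of IDEFICS inference; checkpoint id and prompt format are assumptions.
      import torch
      from transformers import AutoProcessor, IdeficsForVisionText2Text

      checkpoint = "HuggingFaceM4/idefics-9b"  # hypothetical checkpoint id
      processor = AutoProcessor.from_pretrained(checkpoint)
      model = IdeficsForVisionText2Text.from_pretrained(checkpoint, torch_dtype=torch.bfloat16)

      # Prompts interleave plain text and image URLs; the processor fetches the images.
      prompts = [
          [
              "User: What is in this image?",
              "https://upload.wikimedia.org/wikipedia/commons/8/86/Id%C3%A9fix.JPG",
              "\nAssistant:",
          ],
      ]
      inputs = processor(prompts, return_tensors="pt")
      generated_ids = model.generate(**inputs, max_new_tokens=30)
      print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
      ```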
    • [`split_special_tokens`] Add support for `split_special_tokens` argument to encode (#25081) · 30b3c46f
      Arthur authored
      * draft changes
      
      * update and add tests
      
      * styling for now
      
      * move test
      
      * path to usable model
      
      * update test
      
      * small update
      
      * update bert-based tokenizers
      
      * don't use kwargs for _tokenize

      * don't use kwargs for _tokenize
      
      * fix copies
      
      * update
      
      * update test for special tokenizers
      
      * fixup
      
      * skip two tests
      
      * remove pdb breakpoint()
      
      * wowo
      
      * rewrite custom tests
      
      * nits
      
      * revert change in target keys
      
      * fix markup lm
      
      * update documentation of the argument
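
      A small sketch of what the new `split_special_tokens` argument does (assumed behaviour, pieced together from the commit title and messages above; the checkpoint is only for illustration): when enabled, special-token strings that appear in the raw input are tokenized like ordinary text instead of being mapped to their single special-token ids.

      ```python
      from transformers import AutoTokenizer

      tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

      # Default: "[CLS]" in the raw text is recognized and kept as one special token.
      print(tokenizer.encode("[CLS] hello", add_special_tokens=False))

      # With split_special_tokens=True, "[CLS]" is split into ordinary word pieces.
      print(tokenizer.encode("[CLS] hello", add_special_tokens=False, split_special_tokens=True))
      ```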
    • Replaces calls to `.cuda` with `.to(torch_device)` in tests (#25571) · 9d7afd25
      Alex McKinney authored
      
      
      * Replaces calls to `.cuda` with `.to(torch_device)` in tests
      `torch.Tensor.cuda()` is a pre-0.4 way of changing a tensor's device; `.to(...)` is preferred for greater flexibility and error handling. Furthermore, this makes these tests consistent with the rest of the suite (which tends to use `.to(torch_device)`) and ensures the correct device backend is used when `torch_device` is neither `cpu` nor `cuda`.
      
      * addressing review comments
      
      * more formatting changes in Bloom test
      
      * `make style`
      
      * Update tests/models/bloom/test_modeling_bloom.py
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
      
      * fixes style failures
      
      ---------
      Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
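
      The pattern the commit above applies throughout the test suite, shown as a short sketch (not an excerpt from the diff); `torch_device` is the device string selected by `transformers.testing_utils`.

      ```python
      import torch
      from transformers.testing_utils import torch_device

      tensor = torch.ones(2, 2)

      # Before: hard-codes CUDA and fails on other backends
      # tensor = tensor.cuda()

      # After: moves the tensor to whichever device the test suite selected
      tensor = tensor.to(torch_device)
      ```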
  2. 17 Aug, 2023 5 commits
  3. 16 Aug, 2023 4 commits
  4. 14 Aug, 2023 1 commit
  5. 11 Aug, 2023 3 commits
  6. 09 Aug, 2023 1 commit
  7. 08 Aug, 2023 2 commits
  8. 07 Aug, 2023 2 commits
    • Add mask2former fp16 support (#25093) · 080a9711
      Pedro Lira authored
      * Add mask2former fp16 support
      
      * Clear consistency/quality issues
      
      * Fix consistency/quality (2)
      
      * Add integration test for mask2former (fp16 case)
      
      * Fix code quality
      
      * Add integration test for maskformer (fp16 case)
      
      * Add integration test for oneformer (fp16 case)
      
      * Remove slow decorator from fp16 tests
      
      * Fix lint
      
      * Remove usage of full inference and value checks for fp16
      
      * Temporarily comment slow for {mask, mask2, one}former
      
      * Add fp16 support to oneformer
      
      * Revert "Temporarily comment slow for {mask, mask2, one}former"
      
      This reverts commit e5371edabd301cf56079def0421a0a87df307cb0.
      
      * Remove dtype conversion noop
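
      As a rough illustration of the fp16 support added in #25093 above (a sketch under assumptions: the checkpoint id and the manual half-precision casting are illustrative, not copied from the PR's integration tests):

      ```python
      import requests
      import torch
      from PIL import Image
      from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

      checkpoint = "facebook/mask2former-swin-tiny-coco-instance"  # assumed checkpoint id
      processor = AutoImageProcessor.from_pretrained(checkpoint)
      model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint, torch_dtype=torch.float16).to("cuda")

      url = "http://images.cocodataset.org/val2017/000000039769.jpg"
      image = Image.open(requests.get(url, stream=True).raw)

      inputs = processor(images=image, return_tensors="pt")
      inputs = {k: v.to("cuda") for k, v in inputs.items()}
      inputs["pixel_values"] = inputs["pixel_values"].half()  # match the model's fp16 weights

      with torch.no_grad():
          outputs = model(**inputs)
      print(outputs.class_queries_logits.shape)
      ```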
    • Fix more offload edge cases (#25342) · c177606f
      Yih-Dar authored
      
      
      * fix
      
      * fix
      
      * fix
      
      ---------
      Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
  9. 04 Aug, 2023 1 commit
  10. 03 Aug, 2023 2 commits
  11. 02 Aug, 2023 4 commits
  12. 01 Aug, 2023 1 commit
  13. 31 Jul, 2023 2 commits
  14. 28 Jul, 2023 4 commits
  15. 27 Jul, 2023 3 commits
  16. 26 Jul, 2023 2 commits