- 13 Feb, 2023 14 commits
-
-
Joao Gante authored
-
Yih-Dar authored
* use fp16
* use fp16
* use fp16
* use fp16

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Warren Green authored
-
Yi Wang authored
-
dependabot[bot] authored
Bump ipython in /examples/research_projects/decision_transformer

Bumps [ipython](https://github.com/ipython/ipython) from 8.1.1 to 8.10.0.
- [Release notes](https://github.com/ipython/ipython/releases)
- [Commits](https://github.com/ipython/ipython/compare/8.1.1...8.10.0)

updated-dependencies:
- dependency-name: ipython
  dependency-type: direct:production

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
-
Billy Lee authored
* annotated TFVisionEncoderDecoder input type hints
* fixed failing tests
* make fix-copies
* failed test fix
* style fix
* revert

Co-authored-by: JuheonChu <chuj@dickinson.edu>
Co-authored-by: AdiaWu <wua@dickinson.edu>
Co-authored-by: Matt <rocketknight1@gmail.com>
-
Younes Belkada authored
* fix bnb slow test
* make fixup
-
Joao Gante authored
-
Dzmitry Pletnikau authored
-
Christopher Akiki authored
[MINOR] Fix link

I'm not sure this will also fix the currently broken link in the docs (specifically here: https://huggingface.co/docs/transformers/model_doc/time_series_transformer), where clicking on `kashif` attempts to link to the following non-existent URL: https://huggingface.co/docs/transformers/model_doc/%3Chttps://huggingface.co/kashif
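The `%3C` in the broken URL is the percent-encoding of `<`, which suggests an autolink's angle bracket leaked into a Markdown link destination and was then resolved as a relative path. A hedged before/after sketch (the actual source line is not shown in this log):

```markdown
<!-- broken (assumed): angle-bracket autolink nested inside a link destination -->
[kashif](<https://huggingface.co/kashif>)

<!-- fixed: plain inline link -->
[kashif](https://huggingface.co/kashif)
```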
-
Thomas Paviot authored
remove trailing word
-
Joao Gante authored
skip test
-
Maria Khalusova authored
* document question answering guide
* Added the list of supported models
* Apply suggestions from code review
* switched to AutoProcessor
* feedback addressed
* Apply suggestions from code review
* Update docs/source/en/tasks/document_question_answering.mdx
* more feedback addressed
* addressed comments about evaluation loss
* added appropriate image link
* make style
* typo fix
* resolving toc conflict
* fixed the image link

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: NielsRogge <48327001+NielsRogge@users.noreply.github.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
-
Joao Gante authored
-
- 12 Feb, 2023 1 commit
-
-
Sylvain Gugger authored
-
- 10 Feb, 2023 20 commits
-
-
Younes Belkada authored
add int8 support
-
Yih-Dar authored
* Remove unused decoder_layerdrop
* Update SPECIAL_CASES_TO_ALLOW for MT5Config
* Remove unused position_embedding_init_scale
* Remove unused decoder_max_relative_position
* Use unused decoder_max_relative_position
* Remove unused init_std
* Remove unused forgotten attributes
* Remove unused patch_norm
* Remove unused max_seq_len
* Update SPECIAL_CASES_TO_ALLOW for OneFormerConfig

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Han Wu authored
* Added timesformer configuration
* Create documentation_tests.txt
* Update documentation_tests.txt
* Delete documentation_tests.txt (deleting the "src/transformers/utils/documentation_tests.txt" file)
* Create documentation_tests.txt
* Delete documentation_tests.txt

Co-authored-by: JuheonChu <chuj@dickinson.edu>
-
amyeroberts authored
* Replace input_values_prrocessing with unpack_inputs
* Skip test failing with OOM
* Update tests
-
Shubhamai authored
* improving tests section
* documenting other env variables
-
Stas Bekman authored
* [deepspeed] deal with models w/o config.hidden_size
* typo
* typo
-
Yih-Dar authored
Byebye Blip-2 doctest

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
-
Sayak Paul authored
* add: task guide on image captioning.
* Empty commit to trigger CI
* Apply suggestions from code review
* address additional comments from the PR.
* fix: wording.
* Update docs/source/en/tasks/image_captioning.mdx

Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
-
Stas Bekman authored
[from_pretrained] extend `torch_dtype="auto"` to look up `config.torch_dtype` first, expand docs (#21524)

* [from_pretrained] expand on torch_dtype entry
* fold 4 into 1
* style
* support torch_dtype='config' plus tests
* style
* oops
* fold config into auto, fix bug
* fix check
* better log
* better log
* clean up
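A minimal pure-Python sketch of the lookup order this commit describes: with `torch_dtype="auto"`, the dtype saved in `config.torch_dtype` is consulted first. The function name and the fallback behavior here are illustrative assumptions, not the actual `from_pretrained` internals (which also fall back to the dtype of the checkpoint weights).

```python
# Hypothetical sketch of the torch_dtype="auto" resolution order; the real
# logic lives inside transformers' from_pretrained and is more involved.

def resolve_torch_dtype(requested, config_torch_dtype=None, default="float32"):
    """Pick a dtype: explicit value > config.torch_dtype (for "auto") > default."""
    if requested is None:
        return default                 # caller did not request anything
    if requested == "auto":
        if config_torch_dtype is not None:
            return config_torch_dtype  # honor the dtype saved in the config
        return default                 # fall back when the config has no hint
    return requested                   # an explicit dtype always wins

print(resolve_torch_dtype("auto", config_torch_dtype="float16"))      # float16
print(resolve_torch_dtype("bfloat16", config_torch_dtype="float16"))  # bfloat16
```

The point of the change is the middle branch: previously `"auto"` went straight to inspecting the weights; afterwards the config's recorded dtype takes precedence.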
-
Shubhamai authored
improving flax tests
-
steventk-g authored
Update run_mae.py
-
Patrick von Platen authored
* [Variant] Make sure variant files are not incorrectly deleted
* Apply suggestions from code review
* fix
-
Yueming Hao authored
* fix rsqrt
* fix typo
-
Jannis Vamvas authored
* Add X-MOD to Readme
* Add documentation for X-MOD
* Implement X-MOD
* Fix formatting of X-MOD docs
* Change signature of X-MOD forward methods to use lang_ids
* Minor changes
* Rebase with main and run make fix-copies
* Make suggested changes to docstrings
* Improve code readability
* Fix code style
* Conversion script: Remove asserts and type annotations
* Remove _TOKENIZER_FOR_DOC
* XMOD -> Xmod
* Update copyright note
* Fix doctests
* Fix docstring
* Add integration test for FillMaskPipeline
* Revert "Add integration test for FillMaskPipeline" (reverts commit 4381eb3b1d0f5d85785f89caba83928e6efa6d1f)
* Add end-to-end integration test for mask fill
* make style
* Rebase with main and make fix-copies

Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
-
GeneZC authored
* Fix stuff related to the causal_mask in CodeGen.
  1. Line 613: `_keys_to_ignore_on_load_missing = [r"h\.\d+\.attn\.masked_bias", r"h\.\d+\.attn\.bias"]` => `_keys_to_ignore_on_load_missing = [r"h\.\d+\.attn\.causal_mask"]`, to load correctly from CodeGen checkpoints without `causal_mask`.
  2. Line 152: `causal_mask = self.causal_mask[:, :, key_length - query_length : key_length, :key_length]` => `causal_mask = self.causal_mask[:, :, key_length - query_length : key_length, :key_length].bool()`, to alleviate a potential user warning: `UserWarning: where received a uint8 condition tensor. This behavior is deprecated and will be removed in a future version of PyTorch. Use a boolean condition instead.`
* Revert the .bool() and leave it to a future PR.
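An illustrative pure-Python analogue of the CodeGen causal-mask slicing discussed above (the real code operates on a registered uint8 torch buffer, not nested lists; the function names here are invented for the sketch):

```python
# Sketch only: mimics `self.causal_mask[..., kl - ql : kl, :kl].bool()`
# using nested lists of 0/1 ints in place of a uint8 torch buffer.

def causal_mask(max_positions):
    """Lower-triangular mask stored as 0/1 ints, like the uint8 buffer."""
    return [[1 if k <= q else 0 for k in range(max_positions)]
            for q in range(max_positions)]

def sliced_bool_mask(mask, query_length, key_length):
    """Take the rows for the current queries and convert 0/1 to True/False,
    which is what the `.bool()` call does to silence the uint8 warning."""
    rows = mask[key_length - query_length : key_length]
    return [[bool(v) for v in row[:key_length]] for row in rows]

m = causal_mask(4)
# Generating 1 new token with 3 keys visible: a single all-True row.
print(sliced_bool_mask(m, query_length=1, key_length=3))  # [[True, True, True]]
```

The slice picks out only the mask rows for the tokens currently being decoded, which is why `key_length - query_length` appears: cached keys shift the window forward.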
-
Quentin Meeus authored
* Remove CLI spams with Whisper FeatureExtractor

  The Whisper feature extractor representation includes the MEL filters, a list of lists that is rendered as ~16,000 lines. This needlessly spams the command line. I added a `__repr__` method that replaces this list with the string "<array of shape (80, 201)>".

* Remove mel_filters from to_dict output (credits to @ArthurZucker)
* remove unused import
* update feature extraction tests for the changes in to_dict
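A minimal sketch of the `__repr__` idea described above. The class and attribute handling are invented for illustration; the actual change lives in Whisper's `WhisperFeatureExtractor`, which summarizes its `mel_filters` attribute instead of printing every float.

```python
# Hypothetical sketch: summarize large array-like attributes in __repr__
# instead of dumping thousands of floats to the terminal.

class FeatureExtractorSketch:
    def __init__(self):
        # stands in for Whisper's MEL filter bank (80 x 201 floats)
        self.mel_filters = [[0.0] * 201 for _ in range(80)]
        self.sampling_rate = 16000

    def __repr__(self):
        parts = []
        for name, value in self.__dict__.items():
            if isinstance(value, list) and value and isinstance(value[0], list):
                shape = (len(value), len(value[0]))
                parts.append(f"{name}=<array of shape {shape}>")  # summarize
            else:
                parts.append(f"{name}={value!r}")                 # print as-is
        return f"{type(self).__name__}({', '.join(parts)})"

print(FeatureExtractorSketch())
# FeatureExtractorSketch(mel_filters=<array of shape (80, 201)>, sampling_rate=16000)
```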
-
Eugene Zapolsky authored
* adding note concerning use_node_local_storage
* overriding checkpoint.use_node_local_storage if save_on_each_node == True
* add more content
* add more content
* improve
* style

Co-authored-by: Stas Bekman <stas@stason.org>
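The note added here concerns DeepSpeed's node-local checkpoint storage option. A minimal, illustrative DeepSpeed config fragment enabling it might look like the following (a sketch based on DeepSpeed's documented config shape, not taken from this commit):

```json
{
  "checkpoint": {
    "use_node_local_storage": true
  }
}
```

Per the commit message, when `save_on_each_node == True` the integration now overrides `checkpoint.use_node_local_storage` to this value automatically, so users need not set it by hand.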
-
Katie Le authored
add with torch.no_grad() to Camembert integration test

Co-authored-by: Bibi <Bibi@katies-mac.local>
-
Younes Belkada authored
* v1 fix
* adapt from suggestions
* make style
* fix tests
* add gpu tests
* update docs
* fix other tests
* Apply suggestions from code review
* better fix
* make fixup
* better example
* revert changes
* proposal
* more elegant solution
* Update src/transformers/pipelines/automatic_speech_recognition.py

Co-authored-by: Nicolas Patry <patry.nicolas@protonmail.com>
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
-
Sylvain Gugger authored
-
- 09 Feb, 2023 5 commits
-
-
Katie Le authored
* added with torch.no_grad() to the integration tests and applied make style
* added with torch.no_grad() to xlm roberta forward pass

Co-authored-by: Bibi <Bibi@katies-mac.local>
-
Sylvain Gugger authored
* Enforce single model initialization
* Add OneFormer example for problem 3
* Do it the Stas way
* Actually rename the uses...
* Rewrite test
* Try to change the test this way
* Fix all init slow/fast tests
* Break connection
* Fix more tests
* Fix test for initialization
* Remove custom test
* Quality
* Fix last failing tests
* The end?
-
Sylvain Gugger authored
-
Sylvain Gugger authored
* Fix inclusion of non py files in package
* No need for the **
-
Sylvain Gugger authored
-