- 23 May, 2025 1 commit
fxmarty-amd authored
* fix arguments
* pacify pre-commit
Co-authored-by: Baber <baber@hey.com>
- 19 May, 2025 1 commit
Baber Abbasi authored
* add `sglang-generate`
* nit
* nit
* nit
* pacify pre-commit
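A hedged usage sketch for the new `sglang-generate` backend; the model name, `model_args` string, and task choice below are illustrative, not taken from the commit:

```python
# Hypothetical invocation of the new backend through the harness's Python API.
import lm_eval

results = lm_eval.simple_evaluate(
    model="sglang-generate",                             # the backend this commit adds
    model_args="pretrained=Qwen/Qwen2.5-0.5B-Instruct",  # assumed arg format and model
    tasks=["gsm8k"],
)
```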
- 15 May, 2025 1 commit
Filippo Momentè authored
* fix: pass device arg in model_args in vllm_causallms
* cast device arg to str in vLLM model args
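A minimal sketch of the idea behind this fix, not the actual diff: vLLM expects `device` as a plain string, so a `torch.device` arriving from upstream arguments is cast before being forwarded.

```python
import torch

def normalize_device(device) -> str:
    # str(torch.device("cuda", 0)) == "cuda:0"; strings pass through unchanged
    return device if isinstance(device, str) else str(device)

assert normalize_device(torch.device("cuda", 0)) == "cuda:0"
```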
- 10 May, 2025 1 commit
Sungjae Lee authored
- 09 May, 2025 1 commit
Baber Abbasi authored
- 06 May, 2025 1 commit
Alexandre Marques authored
- 16 Apr, 2025 1 commit
Baber Abbasi authored
* fix resolve_hf_chat_template version
* pre-commit
- 14 Apr, 2025 1 commit
Alexandre Marques authored
* Add support for chat templates defined outside of tokenizer_config.json, as supported by vLLM
* Update template name to avoid conflict with other variable
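An illustrative sketch of the feature, not the PR's code: a chat template stored in a standalone Jinja file is read from disk and passed explicitly, overriding whatever the tokenizer config ships. The model name and template path are assumptions.

```python
from pathlib import Path
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
external_template = Path("chat_template.jinja").read_text()  # hypothetical path

prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello"}],
    chat_template=external_template,  # takes precedence over the built-in template
    tokenize=False,
    add_generation_prompt=True,
)
```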
- 20 Mar, 2025 2 commits
Baber Abbasi authored
Yifei Zhang authored
- 11 Mar, 2025 1 commit
Baber Abbasi authored
- 27 Feb, 2025 1 commit
Baber Abbasi authored
* remove ray.remote resources
* remove kobest tag (registered as group)
- 21 Feb, 2025 1 commit
Lintang Sutawika authored
* changed source of eval_logger
* allow eval_logger to be set from args
* removed verbosity arg from non-main methods
* fix logging
* pre-commit
* set verbosity in eval logger
* replace utils.eval_logger
* fix logging in main
* add logging to docs
* add logging message
* nit
* add logging to docs
* refactor setup_logging to utils
Co-authored-by: Baber <baber@hey.com>
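A sketch of the centralized `setup_logging` this PR moves into utils; the exact signature in the harness may differ.

```python
import logging

def setup_logging(verbosity: str = "INFO") -> logging.Logger:
    # verbosity now flows in from CLI args instead of being read per-module
    logging.basicConfig(
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
        level=getattr(logging, verbosity.upper()),
    )
    return logging.getLogger("lm-eval")

eval_logger = setup_logging("DEBUG")
```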
- 17 Feb, 2025 1 commit
Baber Abbasi authored
* fix vllm
* fix data_parallel
* copy to multimodal
- 07 Feb, 2025 1 commit
Baber Abbasi authored
- 19 Jan, 2025 1 commit
Baber Abbasi authored
* update pre-commit
- 15 Jan, 2025 1 commit
Baber Abbasi authored
* add assistant prefix
* add arc_challenge from llama
* nit
* nit
* nit
* add assistant prefix
* add mmlu_llama
* nit
* nit
* Revert "nit" (reverts commit 6a97f8356237305e375212b966b30e8de59dd4bc)
* fix regex bug
* add assistant_prefix to vllm
* add `Question:`
* add mmlu_pro
* add fewshot assistant_prefix
* use `assistant_prefill`
* typehints
* nits
* nits
* add to docs
* add readme
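A hedged reduction of the `assistant_prefill` idea this commit lands on (the helper name is mine, not the harness's): render the chat prompt, then append a fixed assistant prefix that generation must continue from, e.g. forcing answers to start with `The best answer is`.

```python
def build_prompt(tokenizer, messages: list[dict], assistant_prefill: str = "") -> str:
    rendered = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    return rendered + assistant_prefill  # the model continues from this prefix
```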
- 16 Dec, 2024 1 commit
Baber Abbasi authored
* batch all rolling token windows
* nit
* copy to vllm
* fix max_length for `get_rolling_token_windows`
* bugfix
* bugfix
* add type hints
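A simplified sketch of the rolling-window idea (the harness's `get_rolling_token_windows` keeps more preceding context than this): a sequence longer than the model's window is scored in chunks, and this commit batches all chunks into one model call instead of looping.

```python
def rolling_windows(tokens: list[int], prefix_token: int, max_len: int):
    step = max_len - 1  # reserve one slot for the conditioning token
    for start in range(0, len(tokens), step):
        context = [prefix_token] if start == 0 else [tokens[start - 1]]
        yield context, tokens[start : start + step]

# every token appears in exactly one continuation:
windows = list(rolling_windows(list(range(10)), prefix_token=0, max_len=4))
```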
- 30 Nov, 2024 1 commit
Baber Abbasi authored
* make utility function to handle `until`
* fix text
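A sketch of what an `until` helper of this kind does: truncate generated text at the earliest occurrence of any stop sequence.

```python
def truncate_until(text: str, until: list[str]) -> str:
    for stop in until:
        idx = text.find(stop)
        if idx != -1:
            text = text[:idx]  # keep only the text before the stop string
    return text

assert truncate_until("42\n\nQuestion: next", ["\n\n", "Question:"]) == "42"
```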
- 15 Nov, 2024 1 commit
Oyvind Tafjord authored
- 30 Oct, 2024 1 commit
Chris Kerwell Gresla authored
* fix: use lora_request for data parallel vllm evals
* fix(docs): include type hint
* chore: lint, et pre-commit al
Co-authored-by: Chris Kerwell Gresla <chris@wafer.systems>
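A sketch of the shape of this fix (model and adapter path are placeholders): in data-parallel runs the `LoRARequest` must be forwarded into each worker's `generate()` call, not only in the single-process path.

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

llm = LLM(model="meta-llama/Llama-2-7b-hf", enable_lora=True)
lora = LoRARequest("my-adapter", 1, "/path/to/adapter")  # name, int id, local path

outputs = llm.generate(
    ["Hello"],
    SamplingParams(max_tokens=16),
    lora_request=lora,  # the argument this commit threads through DP workers
)
```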
- 22 Oct, 2024 1 commit
Leonid Sinev authored
* Replace generic exception classes with more specific ones
* rerun pre-commit to pass linter tests
* Revert "rerun pre-commit to pass linter tests" (reverts commit 67f88ccf144469853217704520e613196042d859)
* reduce repetitions in errors or so
* Replace generic exception class with a more specific one
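An illustrative before/after of the narrowing described, not the actual diff:

```python
def check_tasks(tasks):
    if not tasks:
        # before: raise Exception("no tasks specified")
        raise ValueError("Need to specify at least one task via --tasks.")
```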
- 04 Sep, 2024 1 commit
Baber Abbasi authored
* default chat template method fix
* move chat_template to TemplateLM
* remove hotfix
* handle openai `chat_template`
* Update lm_eval/api/model.py (Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>)
* add 'max_tokens' to gen_kwargs
* pre-commit
Co-authored-by: KonradSzafer <szafer.konrad@gmail.com>
Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>
- 30 Aug, 2024 1 commit
Baber Abbasi authored
* max_length - 1 (generation always >= 1)
* vllm: fix rolling prefix_token
* nit: add comment
* fixup! max_length should be handled for loglikelihoods
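A sketch of the invariant this commit encodes: generation always produces at least one token, so the prompt may occupy at most `max_length - 1` positions.

```python
def truncate_prompt(prompt_tokens: list[int], max_length: int) -> list[int]:
    return prompt_tokens[-(max_length - 1):]  # keep the rightmost tokens
```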
- 28 Aug, 2024 1 commit
Hailey Schoelkopf authored
* fix revision type
* allow for None-input loglikelihood reqs to be cached
* handle no remaining cache items
* pre-commit
* change cache_hook.add_partial(loglikelihood_rolling...) convention
Co-authored-by: Baber Abbasi <baber@eleuther.ai>
- 02 Jul, 2024 1 commit
Hailey Schoelkopf authored
- 28 Jun, 2024 1 commit
Baber Abbasi authored
* add chat template
* refactor token padding
* nit
* nit
* check on failing test
* check transformers version
* remove transformers pin
* add ids to test
* nit
* fixup
* fix bos bug
* nit
* fixup! fix bos bug
* increase tolerance for table test
* don't detokenize vllm logprobs
* Update lm_eval/models/utils.py (Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>)
* pre-commit run --all-files
Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>
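A hedged sketch of the "don't detokenize vllm logprobs" item above: loglikelihood scoring only needs the numbers, so string reconstruction can be skipped.

```python
from vllm import SamplingParams

params = SamplingParams(
    temperature=0.0,
    max_tokens=1,
    prompt_logprobs=1,  # logprob of each prompt token under the model
    detokenize=False,   # skip rebuilding strings from token ids
)
```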
- 13 Jun, 2024 1 commit
Hailey Schoelkopf authored
* Update vllm_causallms.py
* adjust
Co-authored-by: lintangsutawika <lintang@eleuther.ai>
- 11 Jun, 2024 1 commit
Hailey Schoelkopf authored
- 28 May, 2024 1 commit
Michael Goin authored
* Reorder vllm imports in vllm_causallms.py
* Update vllm_causallms.py
- 23 May, 2024 1 commit
Edward Gan authored
- 07 May, 2024 1 commit
Hailey Schoelkopf authored
- 02 May, 2024 1 commit
bcicc authored
* vllm lora support
* remove print
* version check, rename lora kwarg
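A sketch of the version gate mentioned; the threshold is illustrative, not taken from the diff.

```python
from importlib.metadata import version
from packaging.version import parse

if parse(version("vllm")) < parse("0.3.0"):  # assumed minimum for LoRA support
    raise ValueError("LoRA support requires a newer vllm; please upgrade.")
```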
- 20 Mar, 2024 1 commit
Hailey Schoelkopf authored
* make vllm use prefix_token_id; have prefix_token_id be optional method to define
* custom_prefix_token_id wasn't set if not passed
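A sketch of "prefix_token_id as an optional method": a property subclasses may override, with a custom id taking precedence when one was passed (class and attribute names here are illustrative).

```python
class LMSketch:
    eot_token_id: int = 2                      # stand-in end-of-text token id
    custom_prefix_token_id: int | None = None  # set from user args, if given

    @property
    def prefix_token_id(self) -> int:
        if self.custom_prefix_token_id is not None:
            return self.custom_prefix_token_id
        return self.eot_token_id  # default when no custom id was passed
```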
- 13 Mar, 2024 1 commit
achervyakov authored
* add manual tqdm disabling management
* add typing to all new args
* apply pre-commit changes
Co-authored-by: haileyschoelkopf <hailey@eleuther.ai>
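A sketch of manual tqdm management; the flag name mirrors the commit's intent, though the harness's actual parameter may be spelled differently.

```python
from tqdm import tqdm

def process(requests: list, disable_tqdm: bool = False) -> list:
    return [req for req in tqdm(requests, disable=disable_tqdm, desc="Running")]
```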
- 09 Mar, 2024 1 commit
Antoni Baum authored
* Add compatibility for vLLM's new Logprob object
* Fix
* Update lm_eval/models/vllm_causallms.py
* fix format?
* trailing whitespace
Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>
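A sketch of the compatibility shim described: newer vLLM returns `Logprob` objects (with a `.logprob` attribute) where older versions returned bare floats.

```python
def logprob_to_float(lp) -> float:
    return lp if isinstance(lp, float) else lp.logprob
```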
- 03 Mar, 2024 1 commit
Baber Abbasi authored
* use `@ray.remote` with distributed vLLM
* update versions
* bugfix
* unpin vllm
* fix pre-commit
* added version assertion error
* Revert "added version assertion error" (reverts commit 8041e9b78e95eea9f4f4d0dc260115ba8698e9cc)
* added version assertion for DP
* expand DP note
* add warning
* nit
* pin vllm
* fix typos
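A hedged sketch of the data-parallel pattern named here: one `@ray.remote` task per replica, each owning its own vLLM engine (the model name and sharding helper are placeholders).

```python
import ray

@ray.remote
def run_replica(model_name: str, prompts: list[str]) -> list[str]:
    from vllm import LLM, SamplingParams  # build the engine inside the worker
    llm = LLM(model=model_name)
    outs = llm.generate(prompts, SamplingParams(max_tokens=16))
    return [o.outputs[0].text for o in outs]

# shards = split_requests(requests, n_replicas)  # hypothetical helper
# results = ray.get([run_replica.remote("facebook/opt-125m", s) for s in shards])
```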
- 01 Mar, 2024 2 commits
Hailey Schoelkopf authored
* add undistribute + use more_itertools
* remove divide() util fn
* add more_itertools as dependency
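A round-trip sketch of what the added `undistribute` helper must do: `more_itertools.distribute` deals items across n groups round-robin, and interleaving the groups restores the original order.

```python
from more_itertools import distribute, interleave_longest

items = list(range(10))
groups = [list(g) for g in distribute(3, items)]  # [[0,3,6,9], [1,4,7], [2,5,8]]
assert list(interleave_longest(*groups)) == items
```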
Hailey Schoelkopf authored
- 27 Feb, 2024 1 commit
Baber Abbasi authored
* change `all_gather` to `gather`
* add TaskOutput utility class
* Add FilterResults class and refactor task handling
* Rename `key` to `filter_key` for clarity
* Add `print_writeout` function in utils.py
* Add function to calculate limit size
* Add doc_iterator method to Task class
* Refactor `doc_iterator` and cleanup in Task class
* remove superfluous bits
* change `all_gather` to `gather`
* bugfix
* bugfix
* fix `gather`
* Refactor `gather` loop
* Refactor aggregate metrics calculation
* Refactor and simplify aggregate metrics calculation; removed unused code
* Simplify metrics calculation and remove unused code
* simplify the metrics calculation in `utils.py` and `evaluator.py`
* Fix group metric
* change evaluate to hf_evaluate
* change evaluate to hf_evaluate
* add docs
* add docs
* nits
* make islice keyword-only
* nit
* add todo
* nit
* nit
* nit: swap order of samples_metrics tuple
* move instance sorting outside loop
* nit
* nit
* Add __repr__ for ConfigurableTask
* nit
* nit
* Revert "nit" (reverts commit dab8d9977a643752a17f840fd8cf7e4b107df28f)
* fix some logging
* nit
* fix `predict_only` bug (thanks to `@LSinev`!)
* change `print_tasks` to `prepare_print_tasks`
* nits
* move eval utils
* move eval utils
* nit
* add comment
* added tqdm descriptions
* Update lm_eval/evaluator_utils.py (Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>)
* fix mgsm bug
* nit
* fix `build_all_requests`
* pre-commit
* add ceil to limit
Co-authored-by: Hailey Schoelkopf <65563625+haileyschoelkopf@users.noreply.github.com>
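A sketch of the `doc_iterator`/limit idea from this refactor (names and sharding scheme are illustrative): iterate task docs lazily, with the limit applied via `itertools.islice` as a keyword-only argument.

```python
from itertools import islice
from typing import Iterable, Iterator

def doc_iterator(docs: Iterable, *, limit: int | None = None,
                 rank: int = 0, world_size: int = 1) -> Iterator:
    for i, doc in enumerate(islice(docs, limit)):  # islice(docs, None): no limit
        if i % world_size == rank:  # shard documents across ranks
            yield doc
```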