- 27 Dec, 2023 1 commit

lintangsutawika authored

- 18 Dec, 2023 1 commit

- 17 Dec, 2023 1 commit

Wis Kojohnjaratkul authored
* Add IFEval task
* Check and download nltk punkt if not already downloaded
* Update max_gen_toks to 2048 to support "900 words+" instructions
* Resolve pre-commit linting issues
* Reduce max_gen_toks to 1280 to conserve token usage
* Add warning message in `process_results` call for non-chat-finetuned models
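
A minimal sketch of the punkt guard described in the IFEval commit above, assuming only that nltk is installed; the code actually added in the PR may differ:

```python
import nltk

def ensure_punkt() -> None:
    """Download the punkt tokenizer data only if it is not already present."""
    try:
        # Raises LookupError when punkt is missing from the local nltk data path.
        nltk.data.find("tokenizers/punkt")
    except LookupError:
        nltk.download("punkt")

ensure_punkt()
```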

- 16 Dec, 2023 2 commits

Baber Abbasi authored
* fixed syntactic nits
* fix temperature and seed
* fix logprobs
* fixup merge

Baber Abbasi authored

- 15 Dec, 2023 12 commits

Vicki Boykis authored
* enabling OpenAI completions via gooseai
* openai-completions and pin openai
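
For context, a completions call against an OpenAI-compatible endpoint such as GooseAI's generally looks like the sketch below (openai 1.x client); the base URL, model name, and prompt are illustrative assumptions, not values taken from this commit:

```python
import os
from openai import OpenAI

# Point the standard OpenAI client at an OpenAI-compatible completions server.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.goose.ai/v1",  # assumed GooseAI-style endpoint
)

response = client.completions.create(
    model="gpt-neo-20b",          # illustrative model name
    prompt="Q: 2 + 2 = ?\nA:",
    max_tokens=16,
    temperature=0.0,
    logprobs=5,                   # legacy completions endpoints return token logprobs
)
print(response.choices[0].text)
```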

Baber Abbasi authored

Hailey Schoelkopf authored
* add ignoring of no-commit-to-branch
* fix method of skipping pre-commit step

Lenni Justen authored

Lenni Justen authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

MorishT authored
* [fix] loading dataset from hub fails when the dataset name includes '.', as the program assumes it is on the local filesystem
* add FLD benchmark
* Update task.py
* [update] add group 'fld'
* [update] rename fld -> fld_default, add explanation to the readme
* Update README.md
Co-authored-by: Lintang Sutawika <lintang@sutawika.com>
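
The failure mode fixed here is a Hub dataset id containing "." (the FLD datasets are an example) being mistaken for a local file path. A hedged sketch of the safer check, using the standard datasets API and a hypothetical helper name:

```python
import os
from datasets import load_dataset

def load_task_dataset(dataset_path: str, **kwargs):
    # Decide "local vs Hub" by checking the filesystem rather than looking for
    # a "." in the name, so dotted Hub ids still resolve correctly.
    if os.path.exists(dataset_path):
        # A real local file: let `datasets` pick the builder from the extension.
        ext = os.path.splitext(dataset_path)[1].lstrip(".")  # e.g. "json", "csv"
        return load_dataset(ext, data_files=dataset_path, **kwargs)
    # Not on disk: treat it as a Hugging Face Hub dataset id, dots and all.
    return load_dataset(dataset_path, **kwargs)
```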

Baber Abbasi authored

Lintang Sutawika authored

- 14 Dec, 2023 7 commits

NanoCode012 authored
* fix: passing max_length to vllm engine args
* feat: add `max_model_len`
* chore: lint
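
A minimal sketch of what passing `max_model_len` through to the vLLM engine looks like from the caller's side; the model name and limit are placeholders:

```python
from vllm import LLM, SamplingParams

# max_model_len caps the engine's context window (prompt + generated tokens).
llm = LLM(model="mistralai/Mistral-7B-v0.1", max_model_len=2048)

outputs = llm.generate(["The capital of France is"], SamplingParams(max_tokens=16))
print(outputs[0].outputs[0].text)
```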

Yuliang Li authored

Lintang Sutawika authored
* doc_to_decontamination_query can use function
* add option for doc_to_decontamination_query to follow doc_to_text
* added documentation for doc_to_decontamination_query
* adjust description
* format
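
Illustration only: from a task author's point of view, pointing doc_to_decontamination_query at a function that mirrors doc_to_text might look like the following; the document field names here are assumptions:

```python
def doc_to_text(doc: dict) -> str:
    # The prompt shown to the model (hypothetical field name).
    return f"Question: {doc['question']}\nAnswer:"

def doc_to_decontamination_query(doc: dict) -> str:
    # Reuse the same text for the decontamination check so it matches
    # exactly what the model is prompted with.
    return doc_to_text(doc)
```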

Lintang Sutawika authored
* Additional process for doc_to_choice
* doc_to_choice can also parse a string
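
One plausible reading of "parse a string" is literal evaluation of a list-like string into the list of answer choices; the sketch below shows that idea as an assumption, not the harness's exact logic:

```python
import ast

def parse_choices(doc_to_choice):
    # Already a list: use it as-is.
    if isinstance(doc_to_choice, list):
        return doc_to_choice
    # A string such as '["yes", "no"]' is parsed into a Python list.
    return ast.literal_eval(doc_to_choice)

print(parse_choices('["yes", "no"]'))  # -> ['yes', 'no']
```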

Hailey Schoelkopf authored
* modularize HFLM code
* pass through extra kwargs to AutoModel.from_pretrained call
* remove explicit model_kwargs
* rename gptq -> autogptq
* fix tokenizer pad token errors
* ensure model always respects device_map and autogptq's selected devices
* add a _get_config helper fn
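
A sketch of the kwargs pass-through described above, using standard transformers APIs; the helper and the example kwargs are illustrative, not the refactor's actual code:

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

def create_model(pretrained: str, **model_kwargs):
    # _get_config-style helper: fetch the config once and reuse it.
    config = AutoConfig.from_pretrained(pretrained)
    tokenizer = AutoTokenizer.from_pretrained(pretrained)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token  # avoid pad-token errors at batch time
    # Any extra kwargs (e.g. device_map="auto", torch_dtype="auto", revision=...)
    # are forwarded untouched to from_pretrained.
    model = AutoModelForCausalLM.from_pretrained(pretrained, **model_kwargs)
    return config, tokenizer, model

# Example: config, tok, model = create_model("gpt2", device_map="auto")
```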

Lintang Sutawika authored
fix: bug in BBH_cot_fewshot

momotori authored

- 13 Dec, 2023 16 commits

Baber Abbasi authored
* remove unlabeled test sets
* add note to readme

haileyschoelkopf authored

haileyschoelkopf authored

Lintang Sutawika authored

momotori authored

Baber Abbasi authored
* unpack group; add output_path to arg
* Add `vllm` to overview

momotori authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored

lintangsutawika authored