1. 30 Mar, 2025 1 commit
    • Adds MMLU CoT, gsm8k and arc_challenge for llama instruct (#2829) · 3816796e
      Alexandre Marques authored
      * llama-style MMLU CoT
      
      * Refactor MMLU CoT template YAML to simplify 'until' structure
      
      * Add GSM8K task configuration for LLaMA3 with few-shot examples
      
      * Fix missing newline at end of MMLU CoT YAML file
      
      * Add ARC-Challenge task configuration and processing utility
      
      * Add additional MMLU and ARC-Challenge task variants to README
      
      * Update README with notes on arc_challenge_llama dataset preprocessing
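      The bullets above describe task configurations in lm-evaluation-harness's YAML format. As a rough orientation, a minimal sketch of what a llama-style generative task such as the GSM8K variant can look like is below; the key names follow the harness's task-YAML schema, but the task name, prompt strings, stop sequences, few-shot count, and answer regex are illustrative placeholders rather than the actual contents of the added files.

      ```yaml
      # Hypothetical sketch of a llama-style generative task config (not the real file).
      task: gsm8k_llama                 # assumed task name
      dataset_path: gsm8k
      dataset_name: main
      output_type: generate_until
      doc_to_text: "Question: {{question}}\nAnswer:"
      doc_to_target: "{{answer}}"
      num_fewshot: 8                    # few-shot examples, as the commit message notes
      generation_kwargs:
        until:                          # flat list of stop strings (the simplified 'until' structure)
          - "Question:"
          - "<|eot_id|>"
        do_sample: false
      filter_list:
        - name: strict_match
          filter:
            - function: regex
              regex_pattern: 'The final answer is (\-?[0-9\.\,]+)'   # placeholder pattern
            - function: take_first
      metric_list:
        - metric: exact_match
      ```

      The ARC-Challenge "processing utility" mentioned above would typically be wired in through a `process_docs: !function utils.process_docs` entry in that task's YAML, with the Python helper living next to the config; the exact function name here is an assumption.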
  2. 25 Mar, 2025 1 commit
  3. 21 Mar, 2025 1 commit
  4. 20 Mar, 2025 2 commits
    • Fixes to mmlu_pro_llama (#2816) · 8028a42f
      Alexandre Marques authored
      * Update generation_kwargs in default template to include additional end tokens
      
      * Update filter_list in MMLU Pro configuration to use strict_match
      
      * Update _default_template_yaml
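      For reference, the "additional end tokens" change amounts to extending the stop-string list in the shared template's `generation_kwargs`. A hedged sketch follows, assuming the usual Llama 3 end tokens; the strings actually added in the PR may differ.

      ```yaml
      # Illustrative only: extra stop sequences in _default_template_yaml's generation_kwargs.
      generation_kwargs:
        until:
          - "<|eot_id|>"
          - "<|end_of_text|>"
        do_sample: false
        temperature: 0.0
        max_gen_toks: 1024
      ```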
    • Llama3 mmlu correction (#2797) · c73b43f4
      Alexandre Marques authored
      * Update continuation template YAML for MMLU task with new generation and filtering options
      
      * Refactor filter_list structure in continuation template YAML for improved readability
      
      * Add 'take_first' function to filter_list in continuation template YAML
      
      * Update filter_list in continuation template YAML to use 'strict_match' and modify filtering functions
      
      * Add 'do_sample' option to generation_kwargs in MMLU template YAML
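      The filtering changes above boil down to a named filter chain plus deterministic decoding. A minimal sketch follows, using lm-evaluation-harness's standard filter functions; the regex is a placeholder for whatever pattern the continuation template actually uses to pull out the answer letter.

      ```yaml
      # Sketch, not the actual template contents.
      generation_kwargs:
        do_sample: false                # the 'do_sample' option added to the MMLU template
      filter_list:
        - name: strict_match            # the filter chain referenced as strict_match
          filter:
            - function: regex
              regex_pattern: 'best answer is ([A-D])'   # placeholder answer-extraction pattern
            - function: take_first      # keep only the first extracted match
      ```

      `take_first` matters when the regex can match more than once in a long generation; keeping only the first capture makes scoring deterministic.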
  5. 17 Jan, 2025 1 commit
  6. 15 Jan, 2025 1 commit
    • assistant prefill (#2615) · 703fbffd
      Baber Abbasi authored
      * add assistant prefix
      
      * add arc_challenge from llama
      
      * nit
      
      * nit
      
      * nit
      
      * add assistant prefix
      
      * add mmlu_llama
      
      * nit
      
      * nit
      
      * Revert "nit"
      
      This reverts commit 6a97f8356237305e375212b966b30e8de59dd4bc.
      
      * fix regex bug
      
      * add assistant_prefix to vllm
      
      * add `Question:`
      
      * add mmlu_pro
      
      * add fewshot assistant_prefix
      
      * use `assistant_prefill`
      
      * typehints
      
      * nits
      
      * nits
      
      * add to docs
      
      * add readme
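      The recurring theme in this PR is seeding the assistant turn ("prefill") so chat-templated models continue an answer such as "The best answer is ..." instead of starting a free-form reply. The sketch below reuses the `assistant_prefill` name from the commit messages, but where the key actually lives in the task YAML (assumed here to be a top-level task field) and how it is threaded through the HF and vLLM backends is an assumption, not a statement of the PR's final interface.

      ```yaml
      # Conceptual sketch only; key placement is assumed and values are placeholders.
      task: mmlu_llama                          # one of the llama-style tasks added in this PR
      output_type: generate_until
      assistant_prefill: "The best answer is"   # text that seeds the assistant turn when a chat template is applied
      generation_kwargs:
        until:
          - "Question:"                         # stop before a new question-style continuation begins
          - "<|eot_id|>"
        do_sample: false
      ```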