1. 07 Aug, 2024 (5 commits)
  2. 06 Aug, 2024 (17 commits)
  3. 05 Aug, 2024 (10 commits)
  4. 03 Aug, 2024 (2 commits)
    • 621fb3c0 · Xueshen Liu authored
      MixtralFlashAttention2: put "plus 1" inside parentheses when calculating rotary_seq_len, allowing None position_ids input. (#31500)
      
      * Mixtral: remove the unnecessary "+ 1" when calculating rotary_seq_len, allowing position_ids=None (auto-generating position_ids when none are given could be unsafe)
      
      * fix typo: `[:-1]` to `[:, -1]`
      
      * to meet formatting requirements
      
      * to meet formatting requirements
      
      * remove whitespace
      
      * MixtralFlashAttention2: put "+ 1" inside the parentheses when calculating rotary_seq_len, allowing None position_ids input; fix format/style issues (see the sketch after this entry)
      
      * propagate to starcoder2, phi3, mixtral, and qwen2
      
      * update qwen2_moe
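      For context, here is a minimal sketch of the fixed expression, assuming the shape of the surrounding flash-attention code; the helper name `compute_rotary_seq_len` is hypothetical, and the before/after lines paraphrase the PR rather than quoting its exact diff.

      ```python
      from typing import Optional

      import torch

      def compute_rotary_seq_len(kv_seq_len: int, position_ids: Optional[torch.Tensor]) -> int:
          # Before the fix, the "+ 1" sat outside max() and position_ids[:, -1]
          # raised a TypeError whenever position_ids was None:
          #   rotary_seq_len = max(kv_seq_len, position_ids[:, -1].max().item()) + 1
          # After the fix, the "+ 1" moves inside the parentheses and a None
          # check falls back to kv_seq_len, so position_ids=None is allowed.
          return (
              max(kv_seq_len, position_ids[:, -1].max().item() + 1)
              if position_ids is not None
              else kv_seq_len
          )

      # The last position id of each batch row determines the rotary cache length.
      pos = torch.arange(16).unsqueeze(0)      # shape (1, 16); last id is 15
      print(compute_rotary_seq_len(16, pos))   # 16
      print(compute_rotary_seq_len(16, None))  # 16, no crash on None input
      ```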
    • 7c31d05b · Shaopeng Fu authored
      fix: (issue #32124) Exception raised when running `transformers/examples/flax/language-modeling/t5_tokenizer_model.py`. (#32157)
      
  5. 02 Aug, 2024 (3 commits)
  6. 01 Aug, 2024 (3 commits)