"tests/models/llama/test_tokenization_llama.py" did not exist on "3c7fbf35a6c9237e8bbceb5b4f315980ed10d8a0"
  20 Feb, 2024 · 1 commit
  19 Feb, 2024 · 4 commits
  16 Feb, 2024 · 7 commits
  15 Feb, 2024 · 3 commits
    • Patch to skip failing `test_save_load_low_cpu_mem_usage` tests (#29043) · 4156f517
      amyeroberts authored
      * Patch to skip currently failing tests
      
      * Whoops - wrong place
    • DeformableDetrModel support fp16 (#29013) · 5b6fa230
      Donggeun Yu authored
      
      
      * Update ms_deform_attn_cuda.cu
      
      * Update ms_deform_attn_cuda.cuh
      
      * Update modeling_deformable_detr.py
      
      * Update src/transformers/models/deformable_detr/modeling_deformable_detr.py
      Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
      
      * Update modeling_deformable_detr.py
      
      * python utils/check_copies.py --fix_and_overwrite
      
      * Fix dtype mismatch error
      
      * Update test_modeling_deformable_detr.py
      
      * Update test_modeling_deformable_detr.py
      
      * Update modeling_deformable_detr.py
      
      * Update modeling_deformable_detr.py
      
      ---------
      Co-authored-by: amyeroberts <22614925+amyeroberts@users.noreply.github.com>
    • Fix static generation when compiling! (#28937) · f3788b09
      Arthur authored
      
      
      * wow I was scared!
      
      * fix everything
      
      * nits
      
      * make it BC?
      
      * add todo
      
      * nits
      
      * is_tracing should still be used to pass tracing tests
      
      * nits
      
      * some nits to make sure generation works with static cache uncompiled
      
      * fix sdpa
      
      * fix FA2 for both static and dynamic in a better way?
      
      * style
      
      * fix-copies
      
      * fix fix copies
      
      * fix sequential beam search
      
      * style
      
      * use `keys_to_ignore`
      
      * nit
      
      * correct dtype inference when init
      
      * :( the fix for FA2 is still not optimal to investigate!
      
      * styling
      
      * nits
      
      * nit
      
      * this might work better
      
      * add comment
      
      * Update src/transformers/models/llama/modeling_llama.py
      
      * "position_ids" -> "cache_position"
      
      * style
      
      * nit
      
      * Remove changes that should not be propagated just yet
      
      * Apply suggestions from code review
      
      * Styling
      
      * make sure we raise an error for static cache with FA2 enabled
      
      * move to the bottom of the signature
      
      * style
      
      * Update src/transformers/models/llama/modeling_llama.py
      Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
      
      * Update src/transformers/models/llama/modeling_llama.py
      
      * nit in the name
      
      ---------
      Co-authored-by: Younes Belkada <49240599+younesbelkada@users.noreply.github.com>
  14 Feb, 2024 · 9 commits
  13 Feb, 2024 · 4 commits
  12 Feb, 2024 · 3 commits
  08 Feb, 2024 · 2 commits
  07 Feb, 2024 · 1 commit
  06 Feb, 2024 · 4 commits
  05 Feb, 2024 · 2 commits