  1. 24 May, 2022 1 commit
  2. 18 May, 2022 1 commit
  3. 17 May, 2022 3 commits
  4. 16 May, 2022 1 commit
  5. 09 May, 2022 1 commit
  6. 28 Apr, 2022 1 commit
  7. 30 Mar, 2022 1 commit
  8. 25 Mar, 2022 1 commit
  9. 11 Mar, 2022 1 commit
  10. 09 Mar, 2022 1 commit
  11. 07 Mar, 2022 1 commit
  12. 03 Mar, 2022 1 commit
  13. 02 Mar, 2022 1 commit
  14. 01 Mar, 2022 1 commit
  15. 19 Feb, 2022 1 commit
  16. 04 Feb, 2022 1 commit
  17. 03 Feb, 2022 1 commit
  18. 01 Feb, 2022 2 commits
  19. 31 Jan, 2022 1 commit
  20. 29 Jan, 2022 1 commit
  21. 27 Jan, 2022 1 commit
  22. 26 Jan, 2022 1 commit
  23. 25 Jan, 2022 1 commit
  24. 24 Jan, 2022 2 commits
  25. 22 Jan, 2022 1 commit
  26. 11 Jan, 2022 2 commits
  27. 10 Jan, 2022 1 commit
  28. 08 Jan, 2022 1 commit
  29. 07 Jan, 2022 1 commit
  30. 17 Dec, 2021 3 commits
  31. 10 Dec, 2021 1 commit
  32. 22 Nov, 2021 1 commit
  33. 11 Nov, 2021 1 commit
      persistent fused layer norm · a2fdcdf0
      Sangkug Lym authored
      fix the guard to fall back to the baseline fused layer norm kernel
      
      Persistent ln: move the guard for supported hidden sizes to the layer norm module
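The commit above describes moving a guard into the layer norm module: the persistent layer-norm kernel only handles certain hidden sizes, and other sizes must fall back to the baseline fused kernel. A minimal sketch of that dispatch logic, with a hypothetical size set and function name (the real supported-size list lives in the kernel implementation and is not shown in this log):

```python
# Hypothetical sketch of the guard described in the commit message:
# use the persistent layer-norm kernel only when the hidden size is
# supported, otherwise fall back to the baseline fused kernel.
# The size set below is illustrative, not the actual supported list.
PERSISTENT_LN_HIDDEN_SIZES = {1024, 1536, 2048, 3072, 4096}


def select_layernorm_kernel(hidden_size: int, persist_requested: bool) -> str:
    """Pick a layer-norm kernel for the given hidden size.

    Returns "persistent" only when the caller asked for the persistent
    kernel and the hidden size is in the supported set; every other
    case falls back to the baseline fused kernel.
    """
    if persist_requested and hidden_size in PERSISTENT_LN_HIDDEN_SIZES:
        return "persistent"
    return "baseline_fused"
```

Placing this check inside the layer norm module (rather than at each call site) means every construction path gets the same fallback behavior, which is the point of the fix.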