1. 04 Nov, 2022 2 commits
  2. 03 Nov, 2022 1 commit
  3. 02 Nov, 2022 1 commit
  4. 01 Nov, 2022 2 commits
  5. 31 Oct, 2022 10 commits
  6. 25 Oct, 2022 1 commit
  7. 24 Oct, 2022 4 commits
  8. 23 Oct, 2022 4 commits
  9. 22 Oct, 2022 1 commit
  10. 21 Oct, 2022 3 commits
  11. 17 Oct, 2022 4 commits
  12. 16 Oct, 2022 1 commit
  13. 14 Oct, 2022 2 commits
  14. 10 Oct, 2022 1 commit
  15. 06 Oct, 2022 2 commits
    • Merge pull request #55 from ajfadam/main · 8dd52b07
      Tri Dao authored
      remove numpy dependency
    • remove numpy dependency · 4e38df05
      Antoine Adam authored
      According to the `setup.py` file, the only dependencies are torch and einops, but `bert_padding.py` imports `numpy` solely to multiply the elements of a `torch.Size` object. This change allows FlashAttention to be used without numpy.
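      A minimal sketch of the idea behind this change (the helper name `size_product` and the example shape are illustrative, not the actual `bert_padding.py` diff): the product of a `torch.Size`'s elements can be computed with plain Python or with torch itself, so `numpy` is not required.

      ```python
      import torch

      def size_product(shape: torch.Size) -> int:
          # Hypothetical helper: multiply the elements of a torch.Size with
          # plain Python instead of numpy.prod, mirroring the intent of the change.
          out = 1
          for dim in shape:
              out *= dim
          return out

      x = torch.empty(2, 3, 4)
      assert size_product(x.shape) == 24   # pure-Python product of the dimensions
      assert x.numel() == 24               # torch built-in giving the same value
      ```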
  16. 05 Oct, 2022 1 commit