1. 21 Jul, 2023 1 commit
  2. 20 Jul, 2023 2 commits
  3. 19 Jul, 2023 4 commits
  4. 18 Jul, 2023 2 commits
  5. 17 Jul, 2023 2 commits
  6. 16 Jul, 2023 1 commit
  7. 15 Jul, 2023 3 commits
  8. 08 Jul, 2023 2 commits
    • Merge pull request #299 from proger/rotary-inference-mode · 72ad03ea
      Tri Dao authored
      rotary: update cos/sin cache when switching from inference mode
    • rotary: update cos/sin cache when switching from inference mode · 70ab266a
      Volodymyr Kyrylov authored
      This resolves RuntimeErrors after running evaluation in inference mode:
      
      ```
        File "/home/proger/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
          return forward_call(*args, **kwargs)
        File "/home/proger/.local/lib/python3.10/site-packages/flash_attn/modules/mha.py", line 492, in forward
          qkv = self.rotary_emb(qkv)
        File "/home/proger/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
          return forward_call(*args, **kwargs)
        File "/home/proger/.local/lib/python3.10/site-packages/flash_attn/layers/rotary.py", line 229, in forward
          return apply_rotary_emb_qkv_(
        File "/home/proger/.local/lib/python3.10/site-packages/torch/autograd/function.py", line 506, in apply
          return super().apply(*args, **kwargs)  # type: ignore[misc]
      RuntimeError: Inference tensors cannot be saved for backward. To work around you can make a clone to get a normal tensor and use it in autograd.
      ```
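      A minimal sketch of the kind of cache guard this fix calls for, assuming a simplified rotary module (`SimpleRotaryCache` and `_update_cache` are illustrative names, not the actual `flash_attn.layers.rotary` API): rebuild the cached cos/sin tensors whenever they were created under `torch.inference_mode()`, so a later training forward does not try to save inference tensors for backward.

      ```python
      import torch


      class SimpleRotaryCache(torch.nn.Module):
          """Illustrative rotary cos/sin cache; not the flash_attn implementation."""

          def __init__(self, dim: int, base: float = 10000.0):
              super().__init__()
              self.dim = dim
              self.base = base
              self._seq_len_cached = 0
              self._cos_cached = None
              self._sin_cached = None

          def _update_cache(self, seqlen: int, device, dtype):
              # Recompute when the cache is missing or too short -- and also when it
              # was built under torch.inference_mode(), because inference tensors
              # cannot be saved for backward in a later training step.
              if (
                  seqlen > self._seq_len_cached
                  or self._cos_cached is None
                  or self._cos_cached.is_inference()  # the extra check this commit is about
              ):
                  self._seq_len_cached = seqlen
                  inv_freq = 1.0 / self.base ** (
                      torch.arange(0, self.dim, 2, device=device, dtype=torch.float32) / self.dim
                  )
                  t = torch.arange(seqlen, device=device, dtype=torch.float32)
                  freqs = torch.outer(t, inv_freq)
                  self._cos_cached = freqs.cos().to(dtype)
                  self._sin_cached = freqs.sin().to(dtype)

          def forward(self, x: torch.Tensor) -> torch.Tensor:
              # x: (batch, seqlen, nheads, headdim); the rotation itself is omitted,
              # only the cache-refresh logic is shown.
              self._update_cache(x.shape[1], x.device, x.dtype)
              return x


      # Failure pattern the guard addresses: an eval pass under inference_mode
      # builds the cache as inference tensors; without the is_inference() check,
      # the next training forward would reuse them and hit the RuntimeError above.
      rot = SimpleRotaryCache(dim=64)
      x = torch.randn(2, 128, 8, 64, requires_grad=True)
      with torch.inference_mode():
          rot(x.detach())  # cache is created as inference tensors
      rot(x)               # training-mode call: cache is rebuilt by the guard
      ```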
  9. 06 Jul, 2023 1 commit
  10. 04 Jul, 2023 1 commit
  11. 03 Jul, 2023 4 commits
  12. 02 Jul, 2023 1 commit
  13. 02 Jun, 2023 2 commits
  14. 30 May, 2023 3 commits
  15. 27 May, 2023 2 commits
  16. 25 May, 2023 5 commits
  17. 19 May, 2023 3 commits
  18. 17 May, 2023 1 commit