1. 17 Jan, 2024 1 commit
    • Initial FSDP Support for QLoRA Finetuning (#970) · dcfb6f81
      Benjamin Warner authored
      
      
      This PR adds initial FSDP support for training QLoRA models. It enables basic FSDP and CPU offload support; low-memory training via FSDP's sync_module_states option is not yet supported.
      
      This PR builds on #840 (commit 8278fca) and the BNB FSDP work by @TimDettmers and @Titus-von-Koeller.
      
      An example of using this PR to finetune QLoRA models with FSDP can be found in the demo repo: AnswerDotAi/fsdp_qlora.
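      A condensed, hypothetical sketch of that workflow is shown below: it wraps a toy model built from bitsandbytes Linear4bit layers with FSDP, mixed precision, and optional CPU offload. The layer sizes, dtypes, and launch details are illustrative assumptions rather than part of this PR; see the demo repo for a complete recipe.

      ```python
      # Minimal sketch: FSDP + CPU offload over 4-bit quantized linear layers.
      # Launch with torchrun, e.g.: torchrun --nproc_per_node=2 this_script.py
      import os

      import torch
      import torch.distributed as dist
      import bitsandbytes as bnb
      from torch.distributed.fsdp import (
          CPUOffload,
          FullyShardedDataParallel as FSDP,
          MixedPrecision,
      )

      def main():
          dist.init_process_group("nccl")
          torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))

          # Toy stand-in for a QLoRA model: two 4-bit quantized linear layers.
          # quant_storage keeps the packed 4-bit weights in bfloat16 so FSDP
          # can flatten them together with the other bfloat16 parameters.
          model = torch.nn.Sequential(
              bnb.nn.Linear4bit(1024, 1024, compute_dtype=torch.bfloat16,
                                quant_type="nf4", quant_storage=torch.bfloat16),
              bnb.nn.Linear4bit(1024, 1024, compute_dtype=torch.bfloat16,
                                quant_type="nf4", quant_storage=torch.bfloat16),
          ).to(torch.bfloat16).cuda()  # moving to GPU triggers quantization

          fsdp_model = FSDP(
              model,
              mixed_precision=MixedPrecision(param_dtype=torch.bfloat16),
              cpu_offload=CPUOffload(offload_params=True),  # basic CPU offload
              use_orig_params=False,
          )

          out = fsdp_model(torch.randn(8, 1024, dtype=torch.bfloat16, device="cuda"))
          out.sum().backward()

      if __name__ == "__main__":
          main()
      ```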
      
      * Minimal changes for fp32 4bit storage from BNB commit 8278fca
      
      * Params4bit with selectable storage dtype (see the sketch after this list)
      
      * possible fix for double quantizing linear weight & quant storage dtype
      
      * minor fixes in Params4bit for peft tests
      
      * remove redundant
      
      * add float16
      
      * update test
      
      * Remove float16 quant cast as there are fp32, bf16, & fp16 quant kernels
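      As referenced above, here is a minimal sketch (assuming the post-PR bitsandbytes API) of selecting the Params4bit storage dtype through Linear4bit's quant_storage argument:

      ```python
      # Minimal sketch: selecting the 4-bit quantization storage dtype.
      import torch
      import bitsandbytes as bnb

      layer = bnb.nn.Linear4bit(
          64, 64,
          compute_dtype=torch.bfloat16,   # dtype used for the dequantized matmul
          quant_type="nf4",
          quant_storage=torch.bfloat16,   # dtype the packed 4-bit weights are stored in
      )
      layer = layer.cuda()  # quantization happens on the move to GPU

      # The packed weight (a Params4bit) now reports the storage dtype, which is
      # what lets FSDP flatten it alongside ordinary bfloat16 parameters.
      print(type(layer.weight).__name__)  # Params4bit
      print(layer.weight.dtype)           # torch.bfloat16
      ```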
      
      ---------
      Co-authored-by: Kerem Turgutlu <keremturgutlu@gmail.com>
  2. 08 Jan, 2024 1 commit
  3. 02 Nov, 2023 2 commits
  4. 04 Aug, 2023 1 commit
  5. 19 Jul, 2023 1 commit
  6. 12 Jul, 2023 1 commit
  7. 11 Jul, 2023 1 commit
  8. 10 Jul, 2023 3 commits
  9. 09 Jul, 2023 3 commits
  10. 08 Jul, 2023 2 commits
  11. 05 Jul, 2023 1 commit
  12. 04 Jul, 2023 2 commits
  13. 31 May, 2023 2 commits
  14. 24 May, 2023 1 commit
  15. 06 May, 2023 2 commits
  16. 02 May, 2023 7 commits
  17. 01 May, 2023 3 commits
  18. 30 Apr, 2023 1 commit
  19. 29 Apr, 2023 4 commits
  20. 27 Apr, 2023 1 commit