- 01 Feb, 2024 1 commit
  - Aarni Koskela authored: `out_order` is the global parametrization list, not the test fixture argument
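The fix above hinges on the difference between a module-level parametrization list and a pytest fixture argument. A minimal sketch of the pattern (names and values here are hypothetical, not the actual bitsandbytes test code):

```python
import pytest

# `out_order` is a module-level list that feeds @pytest.mark.parametrize.
# It is NOT a fixture: pytest never injects it by name into a test.
out_order = ["row", "col", "col32"]

@pytest.mark.parametrize("order", out_order)
def test_layout(order):
    # The parametrized value arrives through the `order` argument;
    # referencing `out_order` directly reads the global list instead.
    assert order in out_order
```

Confusing the two leads to tests that silently compare against the whole list rather than the single parametrized value.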
- 30 Jan, 2024 1 commit
  - Aarni Koskela authored: Adjust Ruff configuration
    * do not autofix always
    * be less strict around tests and benchmarks
    * adjust ignores for now
    * Ruff: autofix I and F401
    * Apply ruff autofixes
    * Fix RUF013 complaint
    * Fix mutable default in replace_linear
    * Don't use bare except
    * Wrap bitsandbytes.__main__ entrypoint in function; fix "sensible" typo
    * Fix ruff B008 (function call in arguments)
    * Add ruff noqas as suitable
    * Fix RUF005 (splat instead of concatenating)
    * Fix B018 (useless expression)
    * Add pre-commit configuration + GitHub Actions lint workflow
    * Fix unused `e` in bitsandbytes/__main__.py
    * fix merge conflict resolution error
    * run pre-commit hook
    Co-authored-by: Titus <9048635+Titus-von-Koeller@users.noreply.github.com>
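One of the Ruff fixes above addresses a mutable default argument in `replace_linear`. The signatures below are an illustrative sketch of that lint class, not the actual bitsandbytes code:

```python
# Pattern flagged by Ruff (B006): the default list is created once at
# function-definition time and then shared across every call.
def replace_linear_bad(modules_to_not_convert=[]):
    modules_to_not_convert.append("lm_head")
    return modules_to_not_convert

# Fixed pattern: use None as a sentinel and build a fresh list per call.
def replace_linear_good(modules_to_not_convert=None):
    if modules_to_not_convert is None:
        modules_to_not_convert = []
    modules_to_not_convert.append("lm_head")
    return modules_to_not_convert
```

With the buggy version, a second call returns a list that still contains the entry appended by the first call; the fixed version starts from an empty list each time.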
- 24 Jan, 2024 1 commit
  - Aarni Koskela authored:
    * implicitly skip any test that implicitly uses CUDA on a non-CUDA box
    * add a `requires_cuda` fixture
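A `requires_cuda` fixture like the one named above can be sketched as follows. This is an assumption about its shape, not the actual bitsandbytes fixture; `cuda_available` is a hypothetical helper standing in for `torch.cuda.is_available()` so the sketch imports without torch:

```python
import pytest

def cuda_available() -> bool:
    # Hypothetical stand-in for torch.cuda.is_available(), guarded so the
    # sketch still imports on machines without torch installed.
    try:
        import torch
        return bool(torch.cuda.is_available())
    except ImportError:
        return False

@pytest.fixture
def requires_cuda() -> bool:
    # Tests that request this fixture are skipped (not failed) on a
    # non-CUDA box.
    if not cuda_available():
        pytest.skip("this test requires a CUDA device")
    return True
```

A test then opts in simply by listing the fixture: `def test_kernel(requires_cuda): ...`.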
- 17 Jan, 2024 1 commit
  - Benjamin Warner authored: This PR adds initial FSDP support for training QLoRA models. It enables basic FSDP and CPU offload support; low-memory training via the FSDP `sync_module_states` option is not yet supported. The PR builds on #840 commit 8278fca and BNB FSDP by @TimDettmers and @Titus-von-Koeller. An example of using this PR to finetune QLoRA models with FSDP can be found in the demo repo AnswerDotAi/fsdp_qlora.
    * Minimal changes for fp32 4-bit storage from BNB commit 8278fca
    * Params4bit with selectable storage dtype
    * Possible fix for double quantizing linear weight & quant storage dtype
    * Minor fixes in Params4bit for peft tests
    * Remove redundant
    * Add float16
    * Update test
    * Remove float16 quant cast, as there are fp32, bf16, & fp16 quant kernels
    Co-authored-by: Kerem Turgutlu <keremturgutlu@gmail.com>
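The "Params4bit with selectable storage dtype" change above lets packed 4-bit weights live in a tensor of a chosen storage dtype. A pure-Python sketch of the underlying packing idea, with the storage word width standing in for the storage dtype; this is illustrative only, not bitsandbytes' implementation:

```python
def pack_4bit(codes, storage_bits=8):
    # Pack 4-bit quantization codes (ints in [0, 16)) into fixed-width
    # storage words; `storage_bits` plays the role of the storage dtype
    # (8 ~ uint8, 16 ~ fp16-sized storage, etc.).
    per_word = storage_bits // 4
    words = []
    for i in range(0, len(codes), per_word):
        word = 0
        for j, code in enumerate(codes[i:i + per_word]):
            assert 0 <= code < 16, "4-bit code out of range"
            word |= code << (4 * j)  # first code in the lowest nibble
        words.append(word)
    return words

def unpack_4bit(words, n, storage_bits=8):
    # Recover the first n codes from the packed words.
    per_word = storage_bits // 4
    codes = []
    for word in words:
        for j in range(per_word):
            codes.append((word >> (4 * j)) & 0xF)
    return codes[:n]
```

Widening the storage word halves the number of storage elements without changing the recovered codes, which is the essence of making the quant storage dtype selectable.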
- 08 Jan, 2024 1 commit
  - Tim Dettmers authored
- 02 Nov, 2023 2 commits
  - Ruslan Svirschevski authored
  - Ruslan Svirschevski authored
- 04 Aug, 2023 1 commit
  - Tim Dettmers authored
- 19 Jul, 2023 1 commit
  - Tim Dettmers authored
- 12 Jul, 2023 1 commit
  - Tim Dettmers authored
- 11 Jul, 2023 1 commit
  - Tim Dettmers authored
- 10 Jul, 2023 3 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
- 09 Jul, 2023 3 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
- 08 Jul, 2023 2 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
- 05 Jul, 2023 1 commit
  - Tim Dettmers authored
- 04 Jul, 2023 2 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
- 31 May, 2023 2 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
- 24 May, 2023 1 commit
  - Tim Dettmers authored
- 06 May, 2023 2 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
- 02 May, 2023 7 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
- 01 May, 2023 3 commits
  - Tim Dettmers authored
  - Tim Dettmers authored
  - Tim Dettmers authored
- 30 Apr, 2023 1 commit
  - Tim Dettmers authored
- 29 Apr, 2023 2 commits
  - Tim Dettmers authored
  - Tim Dettmers authored