Unverified Commit 4f98b144 authored by Younes Belkada, committed by GitHub

Docs / PEFT: Add PEFT API documentation (#31078)

* add peft references

* add peft references

* Update docs/source/en/peft.md

* Update docs/source/en/peft.md
parent 779bc360
...@@ -81,6 +81,8 @@ model = AutoModelForCausalLM.from_pretrained(model_id)
model.load_adapter(peft_model_id)
```
Check out the [API documentation](#transformers.integrations.PeftAdapterMixin) section below for more details.
## Load in 8bit or 4bit
The `bitsandbytes` integration supports 8bit and 4bit precision data types, which are useful for loading large models because they save memory (see the `bitsandbytes` integration [guide](./quantization#bitsandbytes-integration) to learn more). Add the `load_in_8bit` or `load_in_4bit` parameters to [`~PreTrainedModel.from_pretrained`] and set `device_map="auto"` to effectively distribute the model to your hardware:
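The quantized-loading snippet itself is not shown in this hunk; a minimal sketch of the pattern described above, assuming `ybelkada/opt-350m-lora` stands in for any PEFT adapter repository on the Hub:

```py
from transformers import AutoModelForCausalLM

# Placeholder adapter repository; substitute the adapter you want to load.
peft_model_id = "ybelkada/opt-350m-lora"

# load_in_8bit quantizes the base model weights with bitsandbytes,
# and device_map="auto" distributes the layers across available devices.
model = AutoModelForCausalLM.from_pretrained(peft_model_id, load_in_8bit=True, device_map="auto")
```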
...@@ -227,6 +229,19 @@ lora_config = LoraConfig(
model.add_adapter(lora_config)
```
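For context, a self-contained sketch of the pattern this hunk touches; the base model and the `LoraConfig` arguments below are illustrative placeholders, not the values used in the guide:

```py
from transformers import AutoModelForCausalLM
from peft import LoraConfig

model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# Illustrative hyperparameters; the guide's actual values are elided in the diff.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
)

# Attach a fresh, untrained LoRA adapter to the base model.
model.add_adapter(lora_config)
```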
## API docs
[[autodoc]] integrations.PeftAdapterMixin
- load_adapter
- add_adapter
- set_adapter
- disable_adapters
- enable_adapters
- active_adapters
- get_adapter_state_dict
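
As a rough orientation, the `PeftAdapterMixin` methods listed above compose like this (a sketch only; the base model and adapter repository names are placeholders):

```py
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")

# Load a trained adapter from the Hub under an explicit name
# ("ybelkada/opt-350m-lora" stands in for any PEFT adapter repository).
model.load_adapter("ybelkada/opt-350m-lora", adapter_name="lora_1")

# Inspect the currently active adapter(s) and switch explicitly.
print(model.active_adapters())
model.set_adapter("lora_1")

# Temporarily fall back to the base model, then re-enable the adapter.
model.disable_adapters()
model.enable_adapters()

# Export only the adapter weights, e.g. to save them separately.
adapter_state_dict = model.get_adapter_state_dict()
```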
<!--
TODO: (@younesbelkada @stevhliu)
...