"examples/int4-flux.1-dev-qencoder.py" did not exist on "37c494a74a267c551c947640476fb7eb248ec950"
readme.md 4.21 KB
Newer Older
1
# Model Conversion Tool

This tool converts model weights between different formats.

## Feature 1: Convert Quantized Models

This tool supports converting **FP32/FP16/BF16** model weights to **INT8** or **FP8**.
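
Every quantization run uses the same flag set; the sections below give concrete per-model invocations. As a quick reference, the general pattern looks roughly like this (paths and names are placeholders; set `--linear_dtype` to `torch.int8` or `torch.float8_e4m3fn`, and `--model_type` to one of `wan_dit`, `hunyuan_dit`, `qwen_image_dit`, `wan_t5`, `wan_clip`):

```bash
# Generic pattern (illustrative only; see the model-specific examples below for tested invocations)
python converter.py \
    --source /Path/To/source_weights \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name my_model_int8 \
    --linear_dtype torch.int8 \
    --model_type wan_dit \
    --quantized
```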

### Wan DiT

```bash
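# Quantize the Wan DiT linear weights to INT8 and save the result block by block as .safetensors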
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name wan_int8 \
    --linear_dtype torch.int8 \
    --model_type wan_dit \
    --quantized \
    --save_by_block
```

```bash
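# Same conversion with FP8 (torch.float8_e4m3fn) linear weights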
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name wan_fp8 \
    --linear_dtype torch.float8_e4m3fn \
    --model_type wan_dit \
    --quantized \
    --save_by_block
```

### Wan DiT + LoRA

```bash
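# Apply each LoRA in --lora_path (scaled by the matching --lora_alpha value) before INT8 quantization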
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-T2V-14B/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name wan_int8 \
    --linear_dtype torch.int8 \
    --model_type wan_dit \
    --lora_path /Path/To/LoRA1/ /Path/To/LoRA2/ \
    --lora_alpha 1.0 1.0 \
    --quantized \
    --save_by_block
```
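
The example above shows LoRA fusion only with INT8 output. Assuming the LoRA flags compose with `torch.float8_e4m3fn` in the same way, an FP8 variant would look like the sketch below (untested; paths and LoRA locations are placeholders):

```bash
# Untested sketch: LoRA fusion combined with FP8 quantization (assumed to compose like the INT8 example above)
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-T2V-14B/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name wan_fp8 \
    --linear_dtype torch.float8_e4m3fn \
    --model_type wan_dit \
    --lora_path /Path/To/LoRA1/ /Path/To/LoRA2/ \
    --lora_alpha 1.0 1.0 \
    --quantized \
    --save_by_block
```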

### Hunyuan DiT

```bash
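# Quantize the Hunyuan DiT linear weights to INT8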
python converter.py \
    --source /Path/To/hunyuan/lightx2v_format/i2v/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name hunyuan_int8 \
    --linear_dtype torch.int8 \
    --model_type hunyuan_dit \
    --quantized
```

```bash
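# Same conversion with FP8 (torch.float8_e4m3fn) linear weights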
python converter.py \
    --source /Path/To/hunyuan/lightx2v_format/i2v/ \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name hunyuan_fp8 \
    --linear_dtype torch.float8_e4m3fn \
    --model_type hunyuan_dit \
    --quantized
```

### Qwen-Image DiT

```bash
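# Quantize the Qwen-Image DiT linear weights to INT8 and save the result block by block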
python converter.py \
    --source /path/to/Qwen-Image-Edit/transformer \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name qwen_int8 \
    --linear_dtype torch.int8 \
    --model_type qwen_image_dit \
    --quantized \
    --save_by_block
```

```bash
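# Same conversion with FP8 (torch.float8_e4m3fn) linear weights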
python converter.py \
    --source /path/to/Qwen-Image-Edit/transformer \
    --output /Path/To/output \
    --output_ext .safetensors \
    --output_name qwen_fp8 \
    --linear_dtype torch.float8_e4m3fn \
    --model_type qwen_image_dit \
    --quantized \
    --save_by_block
```

### Wan T5EncoderModel

```bash
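# Quantize the T5 encoder linear layers to INT8; --non_linear_dtype keeps the remaining layers in bfloat16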
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/models_t5_umt5-xxl-enc-bf16.pth \
    --output /Path/To/output \
    --output_ext .pth \
    --output_name models_t5_umt5-xxl-enc-int8 \
    --linear_dtype torch.int8 \
    --non_linear_dtype torch.bfloat16 \
    --model_type wan_t5 \
    --quantized
```

```bash
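# Same conversion with FP8 (torch.float8_e4m3fn) linear layers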
python converter.py \
    --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/models_t5_umt5-xxl-enc-bf16.pth \
    --output /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/fp8 \
    --output_ext .pth \
    --output_name models_t5_umt5-xxl-enc-fp8 \
    --linear_dtype torch.float8_e4m3fn \
    --non_linear_dtype torch.bfloat16 \
    --model_type wan_t5 \
    --quantized
```


### Wan CLIPModel

```bash
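# Quantize the CLIP linear layers to INT8; --non_linear_dtype keeps the remaining layers in float16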
python converter.py \
  --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth \
  --output /Path/To/output \
  --output_ext .pth \
  --output_name clip-int8 \
  --linear_dtype torch.int8 \
  --non_linear_dtype torch.float16 \
  --model_type wan_clip \
  --quantized
```

```bash
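# Same conversion with FP8 (torch.float8_e4m3fn) linear layers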
python converter.py \
  --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P/models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth \
  --output ./output \
  --output_ext .pth \
  --output_name clip-fp8 \
  --linear_dtype torch.float8_e4m3fn \
  --non_linear_dtype torch.float16 \
  --model_type wan_clip \
  --quantized
```


## Feature 2: Format Conversion Between Diffusers and LightX2V

Supports conversion in both directions between the Diffusers and LightX2V weight layouts.

### LightX2V->Diffusers
```bash
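# --direction forward converts LightX2V-format weights to the Diffusers layout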
python converter.py \
       --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P \
       --output /Path/To/Wan2.1-I2V-14B-480P-Diffusers \
       --direction forward \
       --save_by_block
```

### Diffusers->LightX2V
```bash
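# --direction backward converts Diffusers-format weights back to the LightX2V layout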
python converter.py \
       --source /Path/To/Wan-AI/Wan2.1-I2V-14B-480P-Diffusers \
       --output /Path/To/Wan2.1-I2V-14B-480P \
       --direction backward \
       --save_by_block
```