Commit 78e87f24 authored by GoatWu

update docs

parent 407b8508
@@ -43,27 +43,29 @@ Using `Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-LightX2V` as an example, the s
```
Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-LightX2V/
├── distill_fp8/ # FP8 quantized version (DIT/T5/CLIP)
│ ├── block_xx.safetensors # DIT model FP8 quantized version
│ ├── models_t5_umt5-xxl-enc-fp8.pth # T5 encoder FP8 quantized version
│ ├── clip-fp8.pth # CLIP encoder FP8 quantized version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── distill_int8/ # INT8 quantized version (DIT/T5/CLIP)
│ ├── block_xx.safetensors # DIT model INT8 quantized version
│ ├── models_t5_umt5-xxl-enc-int8.pth # T5 encoder INT8 quantized version
│ ├── clip-int8.pth # CLIP encoder INT8 quantized version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── distill_models/ # Original precision version (DIT/T5/CLIP)
│ ├── distill_model.safetensors # DIT model original precision version
│ ├── models_t5_umt5-xxl-enc-bf16.pth # T5 encoder original precision version
│ ├── models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth # CLIP encoder original precision version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── loras/
│ ├── Wan21_I2V_14B_lightx2v_cfg_step_distill_lora_rank64.safetensors # Distillation model lora
```
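If only one of these precision variants is needed, it can be fetched on its own to save disk space. Below is a minimal sketch using `huggingface_hub.snapshot_download` with a pattern filter; the `repo_id` and `local_dir` values are assumptions for illustration, so substitute the repository and path you actually use.

```python
# Minimal sketch: download only one precision variant instead of the full
# repository. The repo_id and local_dir below are assumptions for
# illustration -- substitute the repository and target path you actually use.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="lightx2v/Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-Lightx2v",  # assumed repo id
    local_dir="./Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-LightX2V",
    allow_patterns=["distill_fp8/*"],  # fetch only the FP8 quantized files
)
```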
### 💾 Storage Recommendations
@@ -44,27 +44,29 @@ Wan2.1-I2V-14B-480P-LightX2V/
```
Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-LightX2V/
├── distill_fp8/ # FP8 quantized version (DIT/T5/CLIP)
│ ├── block_xx.safetensors # DIT model FP8 quantized version
│ ├── models_t5_umt5-xxl-enc-fp8.pth # T5 encoder FP8 quantized version
│ ├── clip-fp8.pth # CLIP encoder FP8 quantized version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── distill_int8/ # INT8 quantized version (DIT/T5/CLIP)
│ ├── block_xx.safetensors # DIT model INT8 quantized version
│ ├── models_t5_umt5-xxl-enc-int8.pth # T5 encoder INT8 quantized version
│ ├── clip-int8.pth # CLIP encoder INT8 quantized version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── distill_models/ # Original precision version (DIT/T5/CLIP)
│ ├── distill_model.safetensors # DIT model original precision version
│ ├── models_t5_umt5-xxl-enc-bf16.pth # T5 encoder original precision version
│ ├── models_clip_open-clip-xlm-roberta-large-vit-huge-14.pth # CLIP encoder original precision version
│ ├── Wan2.1_VAE.pth # VAE variational autoencoder
│ ├── taew2_1.pth # Lightweight VAE (optional)
│ └── config.json # Model configuration file
├── loras/
│ ├── Wan21_I2V_14B_lightx2v_cfg_step_distill_lora_rank64.safetensors # Distillation model lora
```
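Before pointing an inference run at one of these folders, it can help to confirm the expected files are actually present. The sketch below is a minimal check against the layout listed above; the variant choice and file names are taken from that listing and may need adjusting for your checkout.

```python
# Minimal sketch: verify that a chosen precision variant contains the
# files listed in the tree above. Paths mirror that listing; adjust the
# root or variant name if your checkout differs.
from pathlib import Path

model_root = Path("Wan2.1-I2V-14B-480P-StepDistill-CfgDistill-LightX2V")
variant = model_root / "distill_fp8"  # or "distill_int8" / "distill_models"

expected = [
    "models_t5_umt5-xxl-enc-fp8.pth",  # T5 encoder
    "clip-fp8.pth",                    # CLIP encoder
    "Wan2.1_VAE.pth",                  # VAE
    "config.json",                     # model configuration
]

missing = [name for name in expected if not (variant / name).exists()]
has_dit_blocks = any(variant.glob("block_*.safetensors"))  # DIT weights are stored per block

if missing or not has_dit_blocks:
    print(f"{variant} looks incomplete: missing {missing}, DIT blocks present: {has_dit_blocks}")
else:
    print(f"{variant} looks complete.")
```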
### 💾 Storage Recommendations
@@ -5,9 +5,9 @@ if __name__ == "__main__":
url = "http://localhost:8000/v1/tasks/"
message = {
"prompt": "镜头向左平移。",
"prompt": "Summer beach vacation style, a white cat wearing sunglasses sits on a surfboard. The fluffy-furred feline gazes directly at the camera with a relaxed expression. Blurred beach scenery forms the background featuring crystal-clear waters, distant green hills, and a blue sky dotted with white clouds. The cat assumes a naturally relaxed posture, as if savoring the sea breeze and warm sunlight. A close-up shot highlights the feline's intricate details and the refreshing atmosphere of the seaside.",
"negative_prompt": "镜头晃动,色调艳丽,过曝,静态,细节模糊不清,字幕,风格,作品,画作,画面,静止,整体发灰,最差质量,低质量,JPEG压缩残留,丑陋的,残缺的,多余的手指,画得不好的手部,画得不好的脸部,畸形的,毁容的,形态畸形的肢体,手指融合,静止不动的画面,杂乱的背景,三条腿,背景人很多,倒着走",
"image_path": "assets/inputs/imgs/test_1.png", # 图片地址
"image_path": "assets/inputs/imgs/img_0.jpg", # 图片地址
}
logger.info(f"message: {message}")