Commit 2c99b605 authored by helloyongyang

update doc

parent a1b0167c
@@ -18,9 +18,8 @@ git clone https://github.com/ModelTC/lightx2v.git lightx2v && cd lightx2v
 conda create -n lightx2v python=3.11 && conda activate lightx2v
 pip install -r requirements.txt
-# Install again separately to bypass the version conflict check
 # The Hunyuan model needs to run under this version of transformers. If you do not need to run the Hunyuan model, you can ignore this step.
-pip install transformers==4.45.2
+# pip install transformers==4.45.2
 # install flash-attention 2
 git clone https://github.com/Dao-AILab/flash-attention.git --recursive
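Since the pinned `pip install transformers==4.45.2` is now commented out by default, it can be useful to confirm what is actually installed before running the Hunyuan model. A minimal sketch, assuming the conda env created above is active:

```shell
# Print the installed transformers version, or a notice if it is absent.
# importlib.metadata reads package metadata without importing the package itself.
python - <<'PY'
from importlib.metadata import version, PackageNotFoundError
try:
    print(version("transformers"))
except PackageNotFoundError:
    print("transformers not installed")
PY
```

If you plan to run Hunyuan, re-run the pinned install and check that this prints `4.45.2`.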
@@ -34,7 +33,7 @@ cd flash-attention/hopper && python setup.py install
 ```shell
 # Modify the path in the script
-bash scripts/run_wan_t2v.sh
+bash scripts/wan/run_wan_t2v.sh
 ```
-In addition to the existing input arguments in the script, there are also some necessary parameters in the `${lightx2v_path}/configs/wan_t2v.json` file specified by `--config_json`. You can modify them as needed.
+In addition to the existing input arguments in the script, there are also some necessary parameters in the `wan_t2v.json` file specified by `--config_json`. You can modify them as needed.
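Before modifying the config file passed via `--config_json`, it can help to see which parameters it actually exposes. A minimal sketch; the `configs/wan_t2v.json` path is an assumption, adjust it to your checkout:

```shell
# List the top-level keys of the JSON config so you know what can be overridden.
python -c "import json; print(sorted(json.load(open('configs/wan_t2v.json'))))"
```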
@@ -48,10 +48,3 @@ Documentation
    Gradio Deployment <deploy_guides/deploy_gradio.md>
    ComfyUI Deployment <deploy_guides/deploy_comfyui.md>
    Local Windows Deployment <deploy_guides/deploy_local_windows.md>
-.. Indices and tables
-.. ==================
-.. * :ref:`genindex`
-.. * :ref:`modindex`
@@ -25,9 +25,8 @@ git clone https://github.com/ModelTC/lightx2v.git lightx2v && cd lightx2v
 conda create -n lightx2v python=3.11 && conda activate lightx2v
 pip install -r requirements.txt
-# Reinstall transformers separately to avoid pip's conflict check
 # The Hunyuan model must run with transformers 4.45.2; if you do not need to run the Hunyuan model, you can skip this step
-pip install transformers==4.45.2
+# pip install transformers==4.45.2
 # install flash-attention 2
 git clone https://github.com/Dao-AILab/flash-attention.git --recursive
@@ -41,7 +40,7 @@ cd flash-attention/hopper && python setup.py install
 ```shell
 # Modify the path in the script
-bash scripts/run_wan_t2v.sh
+bash scripts/wan/run_wan_t2v.sh
 ```
-In addition to the input arguments already in the script, the `${lightx2v_path}/configs/wan_t2v.json` file pointed to by `--config_json` also contains some required parameters; modify them as needed.
+In addition to the input arguments already in the script, the `wan_t2v.json` file pointed to by `--config_json` also contains some required parameters; modify them as needed.
@@ -48,10 +48,3 @@ LightX2V is a lightweight video generation inference framework designed to provide ...
    Gradio Deployment <deploy_guides/deploy_gradio.md>
    ComfyUI Deployment <deploy_guides/deploy_comfyui.md>
    Local Windows Deployment <deploy_guides/deploy_local_windows.md>
-.. Indices and tables
-.. ==================
-.. * :ref:`genindex`
-.. * :ref:`modindex`