# LightX2V: Light Video Generation Inference Framework

<div align="center">
  <picture>
    <img alt="LightX2V" src="assets/img_lightx2v.jpg" width=75%>
  </picture>
</div>

--------------------------------------------------------------------------------

## Supported Model List

[HunyuanVideo-T2V](https://huggingface.co/tencent/HunyuanVideo)

[HunyuanVideo-I2V](https://huggingface.co/tencent/HunyuanVideo-I2V)

[Wan2.1-T2V](https://huggingface.co/Wan-AI/Wan2.1-T2V-1.3B)

[Wan2.1-I2V](https://huggingface.co/Wan-AI/Wan2.1-I2V-14B-480P)

## Fast Start Up With Conda

```shell
# clone the repo and its submodules
git clone https://github.com/ModelTC/lightx2v.git lightx2v && cd lightx2v
git submodule update --init --recursive

# create a conda env and install the requirements
conda create -n lightx2v python=3.11 && conda activate lightx2v
pip install -r requirements.txt

# install again separately to bypass the version conflict check
pip install transformers==4.45.2

# install flash-attention 2
cd lightx2v/3rd/flash-attention && pip install --no-cache-dir -v -e .

# install flash-attention 3 (only needed on Hopper GPUs)
cd hopper && pip install --no-cache-dir -v -e .
cd ../../../..  # return to the repo root

# adjust the parameters in the running script to your setup, then launch it
bash scripts/run_hunyuan_t2v.sh
```
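If the steps above succeeded, a quick sanity check (a hypothetical helper, not part of this repo) can confirm that the key packages resolve from the new environment — note that flash-attention installs under the module name `flash_attn`:

```python
import importlib.util


def check_packages(pkgs):
    """Map each package name to whether it is importable in this env."""
    return {p: importlib.util.find_spec(p) is not None for p in pkgs}


# packages the install steps above should have provided
print(check_packages(["torch", "transformers", "flash_attn"]))
```

Any `False` in the output means the corresponding install step needs to be repeated.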

## Fast Start Up With Docker

```shell
docker pull lightx2v/lightx2v:latest
docker run -it --rm --name lightx2v --gpus all --ipc=host lightx2v/lightx2v:latest
```

## Contributing Guidelines

We have prepared a `pre-commit` hook to enforce consistent code formatting across the project. If your code complies with the standards, you should not see any errors; otherwise, you can clean up your code by following the steps below:

1. Install the required dependencies:

```shell
pip install ruff pre-commit
```

2. Then, run the following command before committing:

```shell
pre-commit run --all-files
```

3. Finally, please double-check that your code complies, as much as possible, with the following additional guidelines:
  - Avoid hard-coding local paths: Make sure your submissions do not include hard-coded local paths, as these paths are specific to individual development environments and can cause compatibility issues. Use relative paths or configuration files instead.
  - Clear error handling: Implement clear error-handling mechanisms in your code so that error messages can accurately indicate the location of the problem, possible causes, and suggested solutions, facilitating quick debugging.
  - Detailed comments and documentation: Add comments to complex code sections and provide comprehensive documentation to explain the functionality of the code, input-output requirements, and potential error scenarios.
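As a concrete illustration of the first two points, a helper like the hypothetical `load_prompt` below (not part of this repo) avoids hard-coded absolute paths and raises errors that name the missing file, the directory searched, and a suggested fix:

```python
from pathlib import Path


def load_prompt(config_dir: str, name: str) -> str:
    """Load a text asset without hard-coding a local path."""
    # use a configurable directory instead of e.g. /home/<user>/prompts
    path = Path(config_dir) / name
    if not path.is_file():
        # clear error handling: what is missing, where we looked, how to fix it
        raise FileNotFoundError(
            f"'{name}' not found in '{config_dir}'. "
            "Check the file name or pass a different config_dir."
        )
    return path.read_text(encoding="utf-8")
```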

Thank you for your contributions!