xuwx1 / LightX2V · Commits
Commit 8013b8a2, authored Apr 25, 2025 by helloyongyang
add --use-deprecated=legacy-resolver for pip install
parent c705464d

Showing 2 changed files with 2 additions and 4 deletions:

Dockerfile   +1  -3
README.md    +1  -1
Dockerfile

@@ -25,9 +25,7 @@ RUN update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.11 1
     && update-alternatives --install /usr/bin/python python /usr/bin/python3.11 1
 
 RUN pip config set global.index-url https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple \
-    && pip install packaging ninja vllm torch torchvision diffusers transformers==4.45.2 \
-    tokenizers accelerate safetensors opencv-python numpy imageio imageio-ffmpeg \
-    einops loguru sgl-kernel qtorch ftfy easydict
+    && pip install -r /workspace/lightx2v/requirements.txt --use-deprecated=legacy-resolver
 
 # install flash-attention 2
 RUN cd lightx2v/3rd/flash-attention && pip install --no-cache-dir -v -e .
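For context: --use-deprecated=legacy-resolver is a standard pip flag that falls back to the dependency resolver pip used before version 20.3; unlike the current resolver, it does not abort the install when pinned versions conflict, which is why it can be needed for a requirement set like this one. A minimal sketch of the same install step run outside Docker, assuming a checked-out lightx2v tree containing requirements.txt:

    # fall back to pip's legacy resolver for this single install command
    pip install -r requirements.txt --use-deprecated=legacy-resolver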
README.md

@@ -27,7 +27,7 @@ git submodule update --init --recursive
 
 # create conda env and install requirments
 conda create -n lightx2v python=3.11 && conda activate lightx2v
-pip install -r requirements.txt
+pip install -r requirements.txt --use-deprecated=legacy-resolver
 
 # install flash-attention 2
 cd lightx2v/3rd/flash-attention && pip install --no-cache-dir -v -e .
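The flag can also be made persistent instead of being passed on every command, mirroring the pip config set call the Dockerfile already uses for the index URL. A sketch, assuming the option maps onto pip's config key in the usual way (long option name under the global section):

    # persist the legacy-resolver fallback for all subsequent pip commands
    pip config set global.use-deprecated legacy-resolver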