renzhc / diffusers_dcu · Commits

Commit 9ff72433 (unverified)
Authored Dec 04, 2024 by fancy45daddy; committed Dec 04, 2024 via GitHub
Parent: c1926cef

add torch_xla support in pipeline_stable_audio.py (#10109)

Update pipeline_stable_audio.py
Changes: 1 changed file with 10 additions and 0 deletions

src/diffusers/pipelines/stable_audio/pipeline_stable_audio.py (+10, -0)
@@ -26,6 +26,7 @@ from ...models import AutoencoderOobleck, StableAudioDiTModel
 from ...models.embeddings import get_1d_rotary_pos_embed
 from ...schedulers import EDMDPMSolverMultistepScheduler
 from ...utils import (
+    is_torch_xla_available,
     logging,
     replace_example_docstring,
 )
@@ -33,6 +34,12 @@ from ...utils.torch_utils import randn_tensor
 from ..pipeline_utils import AudioPipelineOutput, DiffusionPipeline
 from .modeling_stable_audio import StableAudioProjectionModel
 
+if is_torch_xla_available():
+    import torch_xla.core.xla_model as xm
+
+    XLA_AVAILABLE = True
+else:
+    XLA_AVAILABLE = False
 
 logger = logging.get_logger(__name__)  # pylint: disable=invalid-name
@@ -726,6 +733,9 @@ class StableAudioPipeline(DiffusionPipeline):
                         step_idx = i // getattr(self.scheduler, "order", 1)
                         callback(step_idx, t, latents)
 
+            if XLA_AVAILABLE:
+                xm.mark_step()
+
         # 9. Post-processing
         if not output_type == "latent":
             audio = self.vae.decode(latents).sample
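The change follows the torch_xla pattern used in other diffusers pipelines: when torch_xla is available, the denoising loop calls xm.mark_step() at the end of each step so the lazily traced XLA graph is compiled and executed per step rather than accumulating across the whole loop. Below is a minimal usage sketch, not part of the commit, assuming an XLA/TPU runtime and the stabilityai/stable-audio-open-1.0 checkpoint; the model id, dtype, prompt, and generation parameters are illustrative only.

```python
# Minimal sketch of exercising the new XLA path (assumptions: TPU/XLA runtime,
# stabilityai/stable-audio-open-1.0 checkpoint, illustrative prompt/parameters).
import torch
import torch_xla.core.xla_model as xm

from diffusers import StableAudioPipeline

device = xm.xla_device()  # XLA device, e.g. a TPU core

pipe = StableAudioPipeline.from_pretrained(
    "stabilityai/stable-audio-open-1.0", torch_dtype=torch.bfloat16
)
pipe = pipe.to(device)

# With this commit, each denoising step ends with xm.mark_step(),
# so the XLA graph is materialized step by step on the device.
audio = pipe(
    prompt="A calm piano melody",
    num_inference_steps=100,
    audio_end_in_s=10.0,
).audios[0]
```

On installs without torch_xla, XLA_AVAILABLE stays False and the added guard is a no-op, so existing CPU/GPU behavior is unchanged.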