renzhc / diffusers_dcu · Commits

Unverified commit 01a56927, authored Nov 15, 2025 by David Bertoin, committed by GitHub Nov 15, 2025.

Rope in float32 for mps or npu compatibility (#12665)

rope in float32

parent a9e4883b
Showing 1 changed file with 6 additions and 1 deletion:

src/diffusers/models/transformers/transformer_prx.py (+6, -1)
```diff
@@ -275,7 +275,12 @@ class PRXEmbedND(nn.Module):
     def rope(self, pos: torch.Tensor, dim: int, theta: int) -> torch.Tensor:
         assert dim % 2 == 0
-        scale = torch.arange(0, dim, 2, dtype=torch.float64, device=pos.device) / dim
+        is_mps = pos.device.type == "mps"
+        is_npu = pos.device.type == "npu"
+        dtype = torch.float32 if (is_mps or is_npu) else torch.float64
+        scale = torch.arange(0, dim, 2, dtype=dtype, device=pos.device) / dim
         omega = 1.0 / (theta**scale)
         out = pos.unsqueeze(-1) * omega.unsqueeze(0)
         out = torch.stack([torch.cos(out), -torch.sin(out), torch.sin(out), torch.cos(out)], dim=-1)
```
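To illustrate the change, here is a minimal, CPU-runnable sketch of the patched `rope()` logic as a free function (`self` dropped; the trailing lines of the method are not shown in the hunk, so this returns the stacked tensor directly). The point of the patch is the dtype selection: the `mps` and `npu` backends do not support float64, so the rotary frequency table falls back to float32 there, while other devices keep the original float64 precision.

```python
import torch


def rope(pos: torch.Tensor, dim: int, theta: int) -> torch.Tensor:
    """Sketch of the patched RoPE frequency computation from transformer_prx.py."""
    assert dim % 2 == 0
    # mps (Apple silicon) and npu (Ascend) lack float64 support,
    # so compute the frequency scale in float32 on those devices.
    is_mps = pos.device.type == "mps"
    is_npu = pos.device.type == "npu"
    dtype = torch.float32 if (is_mps or is_npu) else torch.float64
    scale = torch.arange(0, dim, 2, dtype=dtype, device=pos.device) / dim
    omega = 1.0 / (theta**scale)  # per-pair rotation frequencies, shape (dim/2,)
    # Outer product of positions and frequencies: shape (len(pos), dim/2).
    out = pos.unsqueeze(-1) * omega.unsqueeze(0)
    # The four entries [cos, -sin, sin, cos] per (position, frequency) pair
    # are the elements of a 2x2 rotation matrix; shape (len(pos), dim/2, 4).
    out = torch.stack(
        [torch.cos(out), -torch.sin(out), torch.sin(out), torch.cos(out)], dim=-1
    )
    return out


positions = torch.arange(4)
table = rope(positions, dim=8, theta=10_000)
print(table.shape)   # torch.Size([4, 4, 4])
print(table.dtype)   # torch.float64 on CPU; float32 on mps/npu
```

On a CPU or CUDA device the behavior is unchanged from before the patch; only `mps` and `npu` take the float32 path.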