renzhc / diffusers_dcu · Commits

Commit 720dbfc9 (unverified)
Authored Dec 05, 2022 by Benjamin Lefaudeux, committed via GitHub on Dec 05, 2022
Compute embedding distances with torch.cdist (#1459)
small but mighty
parent 513fc681

Showing 1 changed file with 2 additions and 7 deletions.

src/diffusers/models/vae.py (+2, -7)
@@ -290,15 +290,10 @@ class VectorQuantizer(nn.Module):
         # reshape z -> (batch, height, width, channel) and flatten
         z = z.permute(0, 2, 3, 1).contiguous()
         z_flattened = z.view(-1, self.vq_embed_dim)
-        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z
 
-        d = (
-            torch.sum(z_flattened**2, dim=1, keepdim=True)
-            + torch.sum(self.embedding.weight**2, dim=1)
-            - 2 * torch.einsum("bd,dn->bn", z_flattened, self.embedding.weight.t())
-        )
+        # distances from z to embeddings e_j (z - e)^2 = z^2 + e^2 - 2 e * z
+        min_encoding_indices = torch.argmin(torch.cdist(z_flattened, self.embedding.weight), dim=1)
 
-        min_encoding_indices = torch.argmin(d, dim=1)
         z_q = self.embedding(min_encoding_indices).view(z.shape)
         perplexity = None
         min_encodings = None
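For context, the removed code built the full matrix of squared distances by hand via the expansion (z - e)^2 = z^2 + e^2 - 2 e * z (torch.sum terms plus a torch.einsum), whereas the new line lets torch.cdist compute the pairwise Euclidean distances directly; the argmin over codebook entries is unchanged because the square root is monotonic. Below is a minimal sketch, not part of the commit, that checks both formulations select the same nearest codebook indices; the tensor shapes and the stand-in codebook tensor are illustrative assumptions rather than values from vae.py.

import torch

torch.manual_seed(0)
z_flattened = torch.randn(8, 4)   # stands in for (batch*height*width, vq_embed_dim)
codebook = torch.randn(16, 4)     # stands in for self.embedding.weight

# Old formulation: squared distances via the binomial expansion.
d = (
    torch.sum(z_flattened**2, dim=1, keepdim=True)
    + torch.sum(codebook**2, dim=1)
    - 2 * torch.einsum("bd,dn->bn", z_flattened, codebook.t())
)
old_indices = torch.argmin(d, dim=1)

# New formulation: Euclidean distances from torch.cdist; argmin picks the same
# entries because sqrt preserves ordering.
new_indices = torch.argmin(torch.cdist(z_flattened, codebook), dim=1)

assert torch.equal(old_indices, new_indices)

The practical upside is that the distance computation collapses into a single library call instead of three hand-written terms, which is presumably what the commit message means by "small but mighty".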