"git@developer.sourcefind.cn:OpenDAS/ollama.git" did not exist on "7e5c8eee5c65bcf7c0d46d5fe3b084fd70d36015"
Commit 87a92f77 authored by dg845, committed by GitHub

Fix bug in ResnetBlock2D.forward where the LoRA scale gets overwritten (#6736)



Fix a bug in ResnetBlock2D.forward where, when USE_PEFT_BACKEND is not set and the scale_shift norm is used for the time embedding, the LoRA scale gets overwritten.
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
parent 0db766ba
@@ -384,9 +384,9 @@ class ResnetBlock2D(nn.Module):
                 raise ValueError(
                     f" `temb` should not be None when `time_embedding_norm` is {self.time_embedding_norm}"
                 )
-            scale, shift = torch.chunk(temb, 2, dim=1)
+            time_scale, time_shift = torch.chunk(temb, 2, dim=1)
             hidden_states = self.norm2(hidden_states)
-            hidden_states = hidden_states * (1 + scale) + shift
+            hidden_states = hidden_states * (1 + time_scale) + time_shift
         else:
             hidden_states = self.norm2(hidden_states)
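The rename matters because `forward` receives the LoRA scale as a parameter also named `scale`; unpacking `torch.chunk(temb, 2, dim=1)` into the same name shadows it, so any later use of the LoRA scale silently reads the time-embedding tensor instead. A minimal sketch of the shadowing (plain Python stand-ins, not the real diffusers method) under the assumption that the LoRA scale is used again after the scale/shift step:

```python
# Hypothetical simplified forward: `scale` is the LoRA scale parameter,
# `temb` stands in for the chunked time-embedding pair (time_scale, time_shift).

def forward_buggy(hidden_states, temb, scale=1.0):
    # BUG: reusing the name `scale` overwrites the LoRA scale argument.
    scale, shift = temb  # analogous to torch.chunk(temb, 2, dim=1)
    hidden_states = hidden_states * (1 + scale) + shift
    # Anything after this point that expects the LoRA scale gets the wrong value.
    return hidden_states, scale

def forward_fixed(hidden_states, temb, scale=1.0):
    # FIX: distinct names leave the LoRA scale parameter intact.
    time_scale, time_shift = temb
    hidden_states = hidden_states * (1 + time_scale) + time_shift
    return hidden_states, scale
```

Both variants compute the same activations; they differ only in what the name `scale` refers to afterwards, which is exactly what the patch corrects.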