Unverified Commit a451c0ed authored by Yaniv Galron, committed by GitHub

removing redundant requires_grad = False (#10628)



We already set the UNet to requires_grad=False at line 506, so the extra freezing loop before adding the adapters is redundant.
Co-authored-by: Aryan <aryan@huggingface.co>
parent 37c9697f
@@ -515,10 +515,6 @@ def main():
     elif accelerator.mixed_precision == "bf16":
         weight_dtype = torch.bfloat16
-    # Freeze the unet parameters before adding adapters
-    for param in unet.parameters():
-        param.requires_grad_(False)
     unet_lora_config = LoraConfig(
         r=args.rank,
         lora_alpha=args.rank,
...
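
For context, below is a minimal, self-contained sketch of the freeze-then-add-LoRA pattern the commit message relies on. It is an illustrative reconstruction, not the exact training script: the tiny UNet config and the rank value standing in for args.rank are assumptions made so the snippet runs standalone, while requires_grad_(False), LoraConfig, and add_adapter mirror the real diffusers/PEFT APIs used by the script.

from diffusers import UNet2DConditionModel
from peft import LoraConfig

# Tiny UNet so the sketch runs standalone (an assumption; the real script
# loads pretrained weights via UNet2DConditionModel.from_pretrained).
unet = UNet2DConditionModel(
    sample_size=8,
    block_out_channels=(32, 64),
    down_block_types=("DownBlock2D", "CrossAttnDownBlock2D"),
    up_block_types=("CrossAttnUpBlock2D", "UpBlock2D"),
    cross_attention_dim=32,
)

# The single freeze the commit message points at (line 506 of the script):
# this already sets requires_grad=False on every base parameter, so the
# per-parameter loop the diff removes was a no-op.
unet.requires_grad_(False)

# Adding the adapter afterwards creates fresh LoRA weights that are
# trainable by default; the frozen base weights stay frozen.
rank = 4  # stands in for args.rank
unet_lora_config = LoraConfig(
    r=rank,
    lora_alpha=rank,
    init_lora_weights="gaussian",
    target_modules=["to_k", "to_q", "to_v", "to_out.0"],
)
unet.add_adapter(unet_lora_config)

# Only the LoRA parameters remain trainable after the single freeze.
trainable = [name for name, p in unet.named_parameters() if p.requires_grad]
assert trainable and all("lora" in name for name in trainable)

Because add_adapter injects its LoRA weights with requires_grad=True on top of the already-frozen base, a second freezing loop between the freeze at line 506 and add_adapter changes nothing, which is why this commit drops it.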