    Refactor LoRA (#3778) · c2a28c34
    Will Berman authored
    
    
    * refactor to support patching LoRA into T5
    
    instantiate the lora linear layer on the same device as the regular linear layer
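
    A rough illustration of this point (not the exact diffusers implementation): the low-rank
    factors can be created directly on the device of the linear layer they accompany, avoiding
    a CPU allocation followed by a .to(...) move. The LoRALinearLayer name and signature below
    are assumptions for the sketch.

    ```python
    import torch
    import torch.nn as nn

    class LoRALinearLayer(nn.Module):
        """Minimal LoRA sketch: a low-rank update applied alongside a frozen linear layer."""

        def __init__(self, in_features, out_features, rank=4, device=None, dtype=None):
            super().__init__()
            # Create the low-rank factors directly on the target device/dtype
            # instead of on CPU followed by a later .to(...) call.
            self.down = nn.Linear(in_features, rank, bias=False, device=device, dtype=dtype)
            self.up = nn.Linear(rank, out_features, bias=False, device=device, dtype=dtype)
            nn.init.normal_(self.down.weight, std=1 / rank)
            nn.init.zeros_(self.up.weight)

        def forward(self, hidden_states):
            return self.up(self.down(hidden_states))

    # Instantiate on the same device as the linear layer being patched.
    regular_linear = nn.Linear(768, 768).to("cuda" if torch.cuda.is_available() else "cpu")
    lora = LoRALinearLayer(
        regular_linear.in_features,
        regular_linear.out_features,
        rank=4,
        device=regular_linear.weight.device,
    )
    ```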
    
    get lora rank from state dict
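
    A hedged sketch of how the rank can be read off the checkpoint instead of being passed in
    explicitly: a down-projection weight has shape (rank, in_features), so the rank is its first
    dimension. The key names used here are illustrative, not the exact keys diffusers uses.

    ```python
    import torch

    def infer_lora_rank(state_dict):
        """Infer the LoRA rank from the shape of any down-projection weight.

        Assumes down weights have shape (rank, in_features); key naming is illustrative.
        """
        for key, tensor in state_dict.items():
            if "lora.down.weight" in key or "lora_down.weight" in key:
                return tensor.shape[0]
        raise ValueError("No LoRA down-projection weight found in state dict.")

    # Example: a single down weight with rank 8.
    example = {"text_encoder.q_proj.lora.down.weight": torch.zeros(8, 768)}
    assert infer_lora_rank(example) == 8
    ```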
    
    tests
    
    fmt
    
    can create lora layer in float32 even when rest of model is float16
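
    A sketch of what keeping the LoRA parameters in float32 while the surrounding model runs in
    float16 could look like: the trainable low-rank branch stays numerically stable in full
    precision and its output is cast back to the base dtype. The upcast-then-downcast in the
    forward pass is an assumption for illustration.

    ```python
    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"
    # fp16 matmuls generally need a GPU; fall back to fp32 on CPU for the demo.
    base_dtype = torch.float16 if device == "cuda" else torch.float32

    base = nn.Linear(768, 768, device=device, dtype=base_dtype)                    # frozen base layer
    lora_down = nn.Linear(768, 4, bias=False, device=device, dtype=torch.float32)  # LoRA kept in fp32
    lora_up = nn.Linear(4, 768, bias=False, device=device, dtype=torch.float32)
    nn.init.zeros_(lora_up.weight)  # start as a no-op update

    def patched_forward(hidden_states):
        # Run the LoRA branch in float32, then cast back to the input dtype.
        lora_out = lora_up(lora_down(hidden_states.to(torch.float32)))
        return base(hidden_states) + lora_out.to(hidden_states.dtype)

    out = patched_forward(torch.randn(1, 77, 768, device=device, dtype=base_dtype))
    ```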
    
    fix loading model hook
    
    remove load_lora_weights_ and T5 dispatching
    
    remove Unet#attn_processors_state_dict
    
    docstrings
    
    * text encoder monkeypatch class method
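
    For context, "monkeypatching" the text encoder here means replacing the forward of selected
    linear submodules so their output also includes the LoRA branch. The sketch below shows the
    general pattern as a classmethod-style helper; the module-name filter, scaling, and attribute
    names are chosen for illustration and are not taken from the diffusers code.

    ```python
    import torch.nn as nn

    class LoraPatcher:
        """Illustrative helper: wrap the linear layers of a text encoder with a LoRA branch."""

        @classmethod
        def patch_text_encoder(cls, text_encoder, rank=4, scale=1.0):
            for name, module in text_encoder.named_modules():
                # Patch only the attention projections (illustrative name filter).
                if isinstance(module, nn.Linear) and any(
                    p in name for p in ("q_proj", "k_proj", "v_proj", "out_proj")
                ):
                    device = module.weight.device
                    down = nn.Linear(module.in_features, rank, bias=False, device=device)
                    up = nn.Linear(rank, module.out_features, bias=False, device=device)
                    nn.init.zeros_(up.weight)  # new LoRA starts as a no-op

                    original_forward = module.forward

                    def lora_forward(x, original_forward=original_forward, down=down, up=up):
                        return original_forward(x) + scale * up(down(x))

                    # Monkeypatch: swap the bound forward for the wrapped one and keep
                    # the LoRA factors reachable so weights can be loaded into them later.
                    module.forward = lora_forward
                    module.lora_down, module.lora_up = down, up
            return text_encoder
    ```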
    
    * fix test
    
    ---------
    Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
loaders.py