"examples/research_projects/distillation/train.py" did not exist on "31c23bd5ee26425a67f92fc170789656379252a6"
fix: load_best_model_at_end error when load_in_8bit is True (#23443)
Ref: https://github.com/huggingface/peft/issues/394. Loading a quantized checkpoint into a non-quantized Linear8bitLt module is not supported, so call module.cuda() before module.load_state_dict().
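A minimal sketch of the workaround described above, at the level of a single bitsandbytes layer rather than the actual Trainer code; the checkpoint path and layer dimensions are hypothetical, and it assumes bitsandbytes is installed and a CUDA device is available:

```python
import torch
import bitsandbytes as bnb

# Freshly constructed Linear8bitLt starts on CPU with unquantized weights.
module = bnb.nn.Linear8bitLt(1024, 1024, has_fp16_weights=False)

# Hypothetical path to a state dict saved from an already-quantized layer.
quantized_state_dict = torch.load("linear8bit_checkpoint.pt", map_location="cpu")

# Loading a quantized checkpoint into a still-unquantized Linear8bitLt fails;
# moving the module to CUDA first triggers the 8-bit quantization, after which
# load_state_dict succeeds.
module.cuda()
module.load_state_dict(quantized_state_dict)
```

The same ordering (move to CUDA, then load the state dict) is what the fix applies when the Trainer reloads the best checkpoint with load_best_model_at_end and the model was loaded with load_in_8bit=True.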