    add default layer losses and loss combiner · 419974bb
    Matthew Yu authored
    Summary:
    Pull Request resolved: https://github.com/facebookresearch/d2go/pull/421
    
    Add some reasonable defaults when running knowledge distillation
    * get_default_kd_image_classification_layer_losses => returns a cross-entropy loss between the output of the student's classification layer and the teacher's output (the setup used for ImageNet distillation)
    * DefaultLossCombiner => a simple callable that multiplies each loss by a user-supplied weight (a sketch of both defaults follows below)
    
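    Since the PR body doesn't show the actual interfaces, here is a minimal sketch of what these two defaults could look like. `LayerLossMetadata` and its fields (`name`, `layer0`, `layer1`) are illustrative assumptions for pairing a student layer with a teacher layer, not the real d2go API.

    ```python
    import torch
    import torch.nn as nn
    from typing import Dict, List, NamedTuple


    class LayerLossMetadata(NamedTuple):
        """Illustrative record pairing a student layer, a teacher layer, and a loss."""
        loss: nn.Module
        name: str
        layer0: str  # student layer name (assumed field, not the d2go API)
        layer1: str  # teacher layer name (assumed field, not the d2go API)


    def get_default_kd_image_classification_layer_losses() -> List[LayerLossMetadata]:
        """Default KD loss for image classification: cross entropy between the
        student's classification-layer output and the teacher's output."""
        return [
            LayerLossMetadata(
                loss=nn.CrossEntropyLoss(),
                name="kd_ce",
                layer0="classifier",  # illustrative student layer name
                layer1="classifier",  # illustrative teacher layer name
            )
        ]


    class DefaultLossCombiner:
        """Multiply each named loss by a user-provided weight."""

        def __init__(self, name_weight: Dict[str, float]):
            self.name_weight = name_weight

        def __call__(self, losses: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
            # Rescale every loss in the dict by its configured weight.
            return {k: v * self.name_weight[k] for k, v in losses.items()}
    ```

    With this shape, a trainer could gather per-layer losses into a dict and rescale them before summing, e.g. `DefaultLossCombiner({"kd_ce": 0.5})(losses)`.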
    Unsure whether these should go in `distillation.py` or somewhere separate (e.g., a defaults or classification module)
    
    Reviewed By: chihyaoma
    
    Differential Revision: D40330718
    
    fbshipit-source-id: 5887566d88e3a96d01aca133c51041126b2692cc
test_modeling_distillation.py