    Fix resnet missing layers. (#4254) · 493d7a2e
    Qianli Scott Zhu authored
    * Fix resnet missing layers.
    
    The official v1 model contains BN and ReLU layers between the input layer and
    the pooling layer (see the stem sketch after these notes).
    
    * Remove the BN and Relu for V2.
    
    After some discussion with the team and a look at the existing
    implementation, those two layers seem to be useful only in V1.
    In V2, the first unit of the block has a projection that
    applies the BN and ReLU for the shortcut (see the pre-activation
    sketch after these notes). Adding a comment to make this clear.
    
    * Expand the comment section.
    
    * Remove the pre-trained checkpoint since it's broken right now.
    
    Will restore it once we have a new checkpoint generated.
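    As a rough illustration of the v1 fix, here is a minimal Keras-style sketch
    of a ResNet v1 stem with BN and ReLU restored between the initial conv and
    the max pool. The stem hyperparameters (64 filters, 7x7 kernel, stride 2)
    follow the standard ResNet stem but are assumptions here, not code copied
    from the official models repository.

    ```python
    import tensorflow as tf

    def resnet_v1_stem(inputs):
        """Sketch of a v1 stem: conv -> BN -> ReLU -> max pool."""
        x = tf.keras.layers.Conv2D(64, 7, strides=2, padding='same',
                                   use_bias=False)(inputs)
        # The layers restored by this fix: BN and ReLU sit between the
        # initial conv and the pooling layer in v1.
        x = tf.keras.layers.BatchNormalization()(x)
        x = tf.keras.layers.ReLU()(x)
        return tf.keras.layers.MaxPooling2D(pool_size=3, strides=2,
                                            padding='same')(x)
    ```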
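    For contrast, a sketch of the v2 behavior described above: the stem skips
    BN/ReLU, and the first (pre-activation) unit applies BN and ReLU before its
    projection shortcut. Again, the function names and layer parameters are
    illustrative assumptions, not the official implementation.

    ```python
    import tensorflow as tf

    def resnet_v2_stem(inputs):
        """Sketch of a v2 stem: conv -> max pool, no BN/ReLU in between."""
        x = tf.keras.layers.Conv2D(64, 7, strides=2, padding='same',
                                   use_bias=False)(inputs)
        return tf.keras.layers.MaxPooling2D(pool_size=3, strides=2,
                                            padding='same')(x)

    def first_v2_unit(x, filters=64):
        """First pre-activation unit: BN + ReLU precede both branches,
        so the projection shortcut also sees the activated tensor."""
        preact = tf.keras.layers.BatchNormalization()(x)
        preact = tf.keras.layers.ReLU()(preact)
        shortcut = tf.keras.layers.Conv2D(filters, 1, strides=1)(preact)
        residual = tf.keras.layers.Conv2D(filters, 3, padding='same')(preact)
        return shortcut + residual
    ```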