- 19 May, 2020 1 commit
Francisco Massa authored
* Make copy of targets in GeneralizedRCNNTransform
* Fix flake8
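The copy guards against GeneralizedRCNNTransform mutating the caller's target dicts in place during resizing. Below is a minimal sketch of the defensive-copy pattern, not the verbatim torchvision code; the helper name copy_targets is hypothetical:

```python
import torch


def copy_targets(targets):
    # Shallow-copy each per-image target dict so later in-place edits
    # (e.g. resized "boxes") do not leak back into the caller's data.
    if targets is None:
        return None
    return [{k: v for k, v in t.items()} for t in targets]


# Example: modifying the copy leaves the caller's dict untouched.
targets = [{"boxes": torch.tensor([[0.0, 0.0, 10.0, 10.0]]),
            "labels": torch.tensor([1])}]
targets_copy = copy_targets(targets)
targets_copy[0]["boxes"] = targets_copy[0]["boxes"] * 2
assert torch.equal(targets[0]["boxes"],
                   torch.tensor([[0.0, 0.0, 10.0, 10.0]]))
```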
- 15 May, 2020 1 commit
Urwa Muaz authored
* freeze layers only if pretrained backbone is used
  If a pretrained backbone is not used and one intends to train the entire network from scratch, no layers should be frozen.
* function argument to control the trainable features
  Depending on the size of the dataset, one might want to control the number of tunable parameters in the backbone and use this argument during hyperparameter optimization for the dataset. It would be nice to have this function support that.
* ensure the tunable-layers argument is valid
* backbone freezing in fasterrcnn_resnet50_fpn
  Handle backbone freezing in the fasterrcnn_resnet50_fpn function rather than in the resnet_fpn_backbone function that it uses to get the backbone.
* remove layer freezing code
  The layer freezing code has been moved to the fasterrcnn_resnet50_fpn function that consumes resnet_fpn_backbone.
* correct linting errors
* correct linting errors
* move freezing logic to resnet_fpn_backbone
  Moved the layer freezing logic to resnet_fpn_backbone with an additional parameter (see the sketch after this log).
* remove layer freezing from fasterrcnn_resnet50_fpn
  The layer freezing logic has been moved to resnet_fpn_backbone; this function now only ensures that all layers are made trainable if pretrained models are not used.
* update example in resnet_fpn_backbone docs
* correct typo in variable name
* correct indentation
* add test case for layer freezing in Faster R-CNN
  This PR adds the ability to specify the number of trainable layers when initializing Faster R-CNN via the fasterrcnn_resnet50_fpn function; this commit adds a test case for that functionality.
* update layer freezing condition for clarity
  More information in the PR.
* remove linting errors
* remove linting errors
* remove linting errors
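The freezing logic that ended up in resnet_fpn_backbone follows the pattern below. This is a minimal sketch, assuming a torchvision-style ResNet whose stages are named conv1 and layer1 through layer4; the helper name freeze_backbone_layers is hypothetical, not the actual API.

```python
import torchvision


def freeze_backbone_layers(backbone, trainable_layers):
    # Unfreeze only the last `trainable_layers` ResNet stages:
    # 5 leaves the whole backbone trainable, 0 freezes it entirely.
    assert 0 <= trainable_layers <= 5, "trainable_layers must be in [0, 5]"
    layers_to_train = ["layer4", "layer3", "layer2", "layer1", "conv1"][:trainable_layers]
    for name, parameter in backbone.named_parameters():
        # Freeze every parameter that does not belong to a trainable stage.
        if all(not name.startswith(layer) for layer in layers_to_train):
            parameter.requires_grad_(False)


# Example: keep only layer3 and layer4 of a ResNet-50 trainable.
resnet = torchvision.models.resnet50(pretrained=False)
freeze_backbone_layers(resnet, trainable_layers=2)
```

In the released torchvision API this behavior surfaces as the trainable_layers argument of resnet_fpn_backbone and the trainable_backbone_layers argument of fasterrcnn_resnet50_fpn; as the commits above note, when no pretrained weights are requested the detection constructor makes all backbone layers trainable.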
- 16 Dec, 2019 1 commit
Francisco Massa authored