Add Clipped ReLU and ELU activations (#2285)
* wip: add apis for clipped_relu and elu, and layer implementation for clipped_relu
* add tensor_tools documentation
* add cpu implementations for new activations
* add elu layer
* use upperbound and lowerbound for clipped_relu
* fix clipped_relu gradient due to wrong variable naming
* fix elu_gradient due to wrong variable naming
* fix elu_gradient documentation
* add documentation
* WIP: add test_layer cases for clipped_relu and elu
For some reason that I can't see, ELU is failing...
* add clipped_relu and elu tests... CUDA elu layer still does not work
* fix spacing
* add custom cuda implementation for elu_gradient (this one works)
* Revert "add custom cuda implementation for elu_gradient (this one works)"
This reverts commit 446dd803964cc6ecca598ddf6688e5d89ca0b112.
* Revert "Revert "add custom cuda implementation for elu_gradient (this one works)""
This reverts commit 0b615f50081d0d90e71d502b6767fcb6ba62f28a.
* add comment about custom elu gradient implementation
* add gradient tests, restore cudnn elu gradient
* re-add custom elu gradient implementation
* update docs
* use own CUDA implementation for clipped_relu and elu
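For reference, the two activations this PR adds can be sketched as scalar functions. This is a minimal standalone illustration of the underlying math, not dlib's actual tensor-level API; the function names, default parameter values, and signatures here are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical scalar sketches of the activations added in this PR.
// dlib's real implementations operate on tensors (CPU and CUDA paths);
// these names and defaults are illustrative assumptions.

// Clipped ReLU: identity on (0, ceiling), clamped to 0 below and to
// ceiling above.
double clipped_relu(double x, double ceiling = 6.0)
{
    return std::min(std::max(x, 0.0), ceiling);
}

// Derivative of clipped ReLU: 1 inside the linear region, 0 wherever
// a clamp is active.
double clipped_relu_gradient(double x, double ceiling = 6.0)
{
    return (x > 0.0 && x < ceiling) ? 1.0 : 0.0;
}

// ELU: identity for positive inputs, alpha*(exp(x)-1) otherwise.
double elu(double x, double alpha = 1.0)
{
    return x > 0.0 ? x : alpha * (std::exp(x) - 1.0);
}

// Derivative of ELU: 1 for positive inputs, alpha*exp(x) otherwise
// (equivalently elu(x) + alpha, which lets the gradient be computed
// from the stored forward output).
double elu_gradient(double x, double alpha = 1.0)
{
    return x > 0.0 ? 1.0 : alpha * std::exp(x);
}
```

The `elu(x) + alpha` identity for the negative branch is why a backward pass can be written against the layer's output rather than its input, which is relevant to the custom CUDA gradient implementation discussed in the commits above.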
Co-authored-by: Davis E. King <davis@dlib.net>