- 27 Jun, 2017 1 commit
Davis King authored
reallocation and copying inside conv_'s backward pass. Doing this required adding an add_to_output boolean option to the methods of tensor_conv.
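The add_to_output idea above can be illustrated with a toy sketch. This is not dlib's actual tensor_conv API (dlib is C++); it is a hypothetical NumPy stand-in showing why a flag that switches between overwriting and accumulating lets a backward pass skip the temporary-buffer-and-copy step.

```python
import numpy as np

def backward_data(grad_out, filt, data_grad, add_to_output=False):
    """Toy 1-D stand-in for a convolution's backward-data pass.

    When add_to_output is False the result overwrites data_grad; when
    True it is accumulated into data_grad, so a caller combining
    gradients from several paths avoids allocating and summing a
    temporary tensor.
    """
    result = np.zeros(len(data_grad))
    # distribute each output gradient back over the filter taps
    for i, g in enumerate(grad_out):
        for j, w in enumerate(filt):
            result[i + j] += g * w
    if add_to_output:
        data_grad += result      # accumulate into existing gradients
    else:
        data_grad[:] = result    # plain assignment
```

With a length-2 `grad_out` and a length-3 filter, calling the function twice on the same buffer, first with `add_to_output=False` and then with `True`, doubles the stored gradient, which is exactly the accumulate-without-temporaries behavior the commit describes.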
- 22 Jun, 2017 1 commit
OranjeeGeneral authored
refactored interface to reduce complexity, so the conv and convt layers' forward passes have to call setup explicitly now and there is only one ()-operator
- 21 Apr, 2017 1 commit
Davis King authored
- 02 Apr, 2017 1 commit
Davis King authored
rather than the entire tensor.
- 16 Mar, 2017 1 commit
Joachim authored
fixed backward pass in cont layer to accumulate gradients; this will pass the layer test now. Also removed compile warnings and changed some comments
- 13 Mar, 2017 1 commit
Joachim authored
- 19 Feb, 2017 1 commit
Davis King authored
- 06 Feb, 2017 1 commit
Dennis Francis authored
feature_addition: Mean squared loss layer for multiple outputs (#404)
* Added loss_mean_squared_multioutput layer to support multiple outputs.
* Also added a corresponding test case to test a single-variable regression with multiple outputs.
* Added error checks on the truth argument: added assert statements to check that the truth argument in compute_loss_value_and_gradient() contains matrices of the correct dimension relative to the output tensor's size. Also added the requirements on the truth argument to the abstract documentation.
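A mean squared loss over multiple outputs, with the shape check the commit describes, can be sketched like this. This is an illustrative NumPy version, not dlib's C++ implementation, and the normalization chosen here (mean over all elements) is an assumption for the sketch rather than dlib's documented scaling.

```python
import numpy as np

def mse_multioutput_loss_and_gradient(output, truth):
    """Mean squared loss and its gradient for multi-output regression.

    output and truth are (num_samples, num_outputs) arrays; the shapes
    must agree, mirroring the dimension checks mentioned above.
    """
    assert output.shape == truth.shape, \
        "truth must match the output tensor's dimensions"
    diff = output - truth
    loss = np.mean(diff ** 2)           # scalar loss value
    grad = 2.0 * diff / diff.size       # d(loss)/d(output), same shape as output
    return loss, grad
```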
- 26 Nov, 2016 1 commit
Dennis Francis authored
- 25 Nov, 2016 1 commit
Dennis Francis authored
- 23 Nov, 2016 1 commit
Dennis Francis authored
Added mean squared loss layer "loss_mean_squared" to DNN as requested in https://github.com/davisking/dlib/issues/152. Also added a test case of a simple linear regression with one variable that uses this layer.
- 18 Nov, 2016 1 commit
Davis King authored
- 02 Nov, 2016 1 commit
Davis King authored
versions were calling into cuDNN, however, the cuDNN functions for doing this are horrifically slow, well over 100x slower than they should be, which is surprising since these functions are so trivial.
- 23 Oct, 2016 1 commit
Davis King authored
- 27 Aug, 2016 2 commits
Davis King authored
Davis King authored
alias tensors. Now any kind of tensor is supported.
- 12 Aug, 2016 1 commit
Davis King authored
cudnnGetConvolutionBackwardFilterAlgorithm() to pick invalid algorithms, resulting in cuDNN not working correctly.
- 06 Aug, 2016 1 commit
Davis King authored
- 11 Jun, 2016 1 commit
Davis King authored
automatically sizes the tensor.
- 01 Jun, 2016 1 commit
Davis King authored
- 27 May, 2016 2 commits
- 26 May, 2016 2 commits
Evgeniy Fominov authored
Fm authored
- 25 May, 2016 1 commit
Davis King authored
- 23 May, 2016 1 commit
Davis King authored
- 22 May, 2016 3 commits
Davis King authored
caused by num_computational_layers being wrong when tag layers were placed as the first layer. These visit functions being wrong also caused multi-GPU support to not work on such networks.
Davis King authored
Davis King authored
- 14 May, 2016 1 commit
Davis King authored
skip layers and add_prev style layers. In particular, in-place layers now overwrite the gradient information in their child layer only when they are operating in in-place mode; otherwise, they add their gradients to their child layers. It should also be noted that it's safe for in-place layers to overwrite gradients when in in-place mode, since their child layers are inaccessible while in-place layers operate in in-place mode. This prevents any other layer from trying to add to the child layer, thereby avoiding the possibility of layer interference. So the bug this change fixes is that, when not in in-place mode, the child layers are still accessible, but in-place layers were *still* overwriting child gradients.
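The gradient-passing contract described in that commit can be sketched in a few lines. This is a hypothetical NumPy illustration of the overwrite-vs-accumulate rule, not dlib's C++ layer interface: in in-place mode the child's gradient buffer may be overwritten because nothing else can reach it, while out of in-place mode other layers (for example the branches feeding an add_prev) may write to the same buffer, so contributions must be added.

```python
import numpy as np

def propagate_child_gradient(layer_grad, child_grad, in_place):
    """Pass a layer's gradient down to its child's gradient buffer.

    in_place=True : safe to overwrite, the child is inaccessible to
                    any other layer in this mode.
    in_place=False: other layers may also contribute to child_grad,
                    so this layer must add rather than overwrite.
    """
    if in_place:
        child_grad[:] = layer_grad   # overwrite: no interference possible
    else:
        child_grad += layer_grad     # accumulate alongside other layers
```

The bug being fixed corresponds to always taking the overwrite branch: any gradient another layer had already added to `child_grad` would be silently discarded.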
- 05 May, 2016 2 commits
Davis King authored
Davis King authored
interfaces. Also changed the default behavior when the stride isn't 1. Now the filters will be applied only to the "valid" part of the image.
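"Valid" filtering means the filter is applied only at positions where it fits entirely inside the input. The resulting output size follows the standard formula below; this is the textbook relationship, shown here as an illustration rather than code taken from dlib's source.

```python
def valid_conv_output_size(input_size, filter_size, stride):
    """Output length of a 1-D "valid" convolution: count the filter
    positions that lie fully inside the input, stepping by stride."""
    assert input_size >= filter_size, "filter must fit inside the input"
    return (input_size - filter_size) // stride + 1
```

For example, a 7-wide input with a 3-wide filter gives 5 outputs at stride 1 but only 3 at stride 2, since the last partially-overlapping positions are skipped rather than padded.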
- 04 May, 2016 2 commits
Davis King authored
to expose it in the final layer interface.
Davis King authored
- 25 Apr, 2016 1 commit
Davis King authored
- 24 Apr, 2016 1 commit
Davis King authored
- 17 Apr, 2016 1 commit
Davis King authored
- 16 Apr, 2016 2 commits
Davis King authored
- Made layer() recurse into repeat objects so that the index given to layer() does what you would expect.
- Added an operator<< for network objects that prints the network architecture.
Davis King authored
- 10 Apr, 2016 1 commit
Davis King authored