- 06 Feb, 2017 1 commit

Dennis Francis authored
  feature_addition: Mean squared loss layer for multiple outputs (#404)
  * Added a loss_mean_squared_multioutput layer to support multiple outputs.
  * Added a corresponding test case: a single-variable regression with multiple outputs.
  * Added error checks on the truth argument: assert statements now verify that the truth argument to compute_loss_value_and_gradient() contains matrices of the correct dimensions relative to the output tensor's size. Also added the requirements on the truth argument to the abstract documentation.

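The commit above describes a mean squared loss over vector-valued outputs, with dimension checks between each truth matrix and the output tensor. As a rough illustration only (plain C++, not dlib's actual implementation; all names here are made up), the loss and per-sample gradient such a layer computes look like:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of a multi-output mean squared loss: each sample's
// network output and truth label are vectors of equal length.  Returns the
// loss averaged over samples and fills `gradients` with dLoss/dOutput.
double multioutput_mse_loss(
    const std::vector<std::vector<double>>& outputs,   // one vector per sample
    const std::vector<std::vector<double>>& truths,    // matching truth vectors
    std::vector<std::vector<double>>& gradients)
{
    assert(outputs.size() == truths.size());
    gradients.assign(outputs.size(), {});

    double loss = 0;
    const double scale = 1.0/outputs.size();
    for (std::size_t i = 0; i < outputs.size(); ++i)
    {
        // Mirrors the commit's added checks: every truth entry must match
        // the dimensions of the corresponding output.
        assert(truths[i].size() == outputs[i].size());
        gradients[i].resize(outputs[i].size());
        for (std::size_t j = 0; j < outputs[i].size(); ++j)
        {
            const double err = outputs[i][j] - truths[i][j];
            loss += scale*err*err;
            gradients[i][j] = 2*scale*err;
        }
    }
    return loss;
}
```

The assert on `truths[i].size()` stands in for the dimension checks the commit adds to compute_loss_value_and_gradient().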
- 17 Dec, 2016 2 commits

Davis King authored

Davis King authored

- 26 Nov, 2016 1 commit

Dennis Francis authored

- 25 Nov, 2016 1 commit

Dennis Francis authored

- 23 Nov, 2016 1 commit

Dennis Francis authored
  Added a mean squared loss layer, "loss_mean_squared", to the DNN module, as requested in https://github.com/davisking/dlib/issues/152. Also added a test case of a simple one-variable linear regression that uses this layer.

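For reference, the quantity such a single-output layer computes — the squared error averaged over the batch, plus its gradient with respect to each network output — can be sketched in plain C++ (illustrative names and code, not dlib's API):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hedged sketch of a single-output mean squared loss: one scalar output
// per sample, one scalar truth label per sample.  Returns the average
// squared error and fills `gradients` with dLoss/dOutput per sample.
double mean_squared_loss(
    const std::vector<double>& outputs,
    const std::vector<double>& truths,
    std::vector<double>& gradients)
{
    assert(outputs.size() == truths.size() && !outputs.empty());
    gradients.resize(outputs.size());
    const double scale = 1.0/outputs.size();
    double loss = 0;
    for (std::size_t i = 0; i < outputs.size(); ++i)
    {
        const double err = outputs[i] - truths[i];
        loss += scale*err*err;
        gradients[i] = 2*scale*err;
    }
    return loss;
}
```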
- 06 Nov, 2016 2 commits

Davis King authored

Davis King authored
  training_label_type instead of a single label_type. This way, the label type used for training can be distinct from the type output by the network. This change breaks backwards compatibility with the previous API.

- 16 Oct, 2016 1 commit

Davis King authored

- 05 Sep, 2016 2 commits

Davis King authored

Davis King authored

- 04 Sep, 2016 1 commit

Davis King authored
  attendant objects. Also fixed a minor bug in the loss layer.

- 03 Sep, 2016 1 commit

Davis King authored

- 28 Aug, 2016 1 commit

Davis King authored

- 14 Aug, 2016 1 commit

Davis King authored
  rather than a compile-time constant. This also removes it from the input layer interface: since the DNN core infers its value at runtime, users who define their own input layers no longer need to specify it.

- 12 Jun, 2016 1 commit

Davis King authored

- 15 May, 2016 1 commit

Davis King authored
  interface.

- 16 Apr, 2016 1 commit

Davis King authored
  * Made layer() recurse into repeat objects so that the index given to layer() does what you would expect.
  * Added an operator<< for network objects that prints the network architecture.

- 08 Feb, 2016 1 commit

Davis King authored

- 30 Jan, 2016 1 commit

Davis King authored
  objects. Then made the relevant parts of the code use these functions.

- 13 Dec, 2015 1 commit

Davis King authored

- 21 Nov, 2015 1 commit

Davis King authored
  than adding them. This way, the gradient buffer can be used as scratch space during the loss computation.

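The assign-instead-of-add convention described above means a loss layer may treat the gradient buffer as uninitialized scratch memory, since it is the first writer of that buffer. A minimal sketch of the idea (plain C++ with hypothetical names, not dlib's actual code):

```cpp
#include <cstddef>
#include <vector>

// Illustrative sketch: because the loss layer writes the gradient buffer
// first, it can overwrite the buffer outright instead of summing into it,
// and may even use the same memory as scratch space mid-computation.
void write_output_gradient(
    const std::vector<double>& outputs,
    const std::vector<double>& truths,
    std::vector<double>& grad)  // may hold stale values; fully overwritten
{
    grad.resize(outputs.size());
    for (std::size_t i = 0; i < outputs.size(); ++i)
    {
        grad[i] = outputs[i];          // use grad as scratch: copy the output
        grad[i] -= truths[i];          // turn the scratch value into the error
        grad[i] *= 2.0/outputs.size(); // finish: gradient of mean squared error
    }
}
```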
- 20 Nov, 2015 1 commit

Davis King authored

- 18 Nov, 2015 1 commit

Davis King authored

- 08 Nov, 2015 1 commit

Davis King authored

- 03 Nov, 2015 1 commit

Davis King authored
  that contain a different number of samples than their input tensors.

- 18 Oct, 2015 1 commit

Davis King authored

- 15 Oct, 2015 1 commit

Davis King authored

- 28 Sep, 2015 1 commit

Davis King authored

- 26 Sep, 2015 1 commit

Davis King authored

- 25 Sep, 2015 1 commit

Davis King authored

- 23 Sep, 2015 4 commits

Davis King authored

Davis King authored

Davis King authored

Davis King authored