- 05 Sep, 2016 (1 commit) Davis King authored
- 04 Sep, 2016 (1 commit) Davis King authored: ...attendant objects. Also fixed a minor bug in the loss layer.
- 03 Sep, 2016 (1 commit) Davis King authored
- 28 Aug, 2016 (1 commit) Davis King authored
- 14 Aug, 2016 (1 commit) Davis King authored: ...rather than a compile-time constant. This also removes it from the input layer interface, since the DNN core infers its value at runtime; users who define their own input layers no longer need to specify it.
- 12 Jun, 2016 (1 commit) Davis King authored
- 15 May, 2016 (1 commit) Davis King authored: ...interface.
- 16 Apr, 2016 (1 commit) Davis King authored:
  - Made layer() recurse into repeat objects so that the index given to layer() does what you would expect.
  - Added an operator<< for network objects that prints the network architecture.
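As a rough illustration of the two conveniences described in the 16 Apr, 2016 commit, the sketch below builds a toy layer stack by template composition, gives it an architecture-printing operator<<, and an indexed layer() accessor. This is a simplified, self-contained analogue, not dlib's actual code; the layer names and members here are invented for the example.

```cpp
#include <cassert>
#include <iostream>
#include <sstream>
#include <string>

// Toy stand-ins for dlib-style layer templates: each layer wraps a
// subnetwork, so a whole network is one nested template type.
struct input_layer {
    static constexpr int depth = 0;
    void print(std::ostream& out) const { out << "input\n"; }
};

template <typename SUBNET>
struct relu_layer {
    SUBNET subnet;
    static constexpr int depth = SUBNET::depth + 1;
    void print(std::ostream& out) const { out << "relu\n"; subnet.print(out); }
};

template <typename SUBNET>
struct fc_layer {
    SUBNET subnet;
    static constexpr int depth = SUBNET::depth + 1;
    void print(std::ostream& out) const { out << "fc\n"; subnet.print(out); }
};

// Analogue of the commit's operator<< for network objects: prints the whole
// architecture, one layer per line. The dummy template parameter restricts
// the overload to types that have a depth member, so it does not collide
// with the standard operator<< overloads.
template <typename NET, int = NET::depth>
std::ostream& operator<<(std::ostream& out, const NET& net) {
    net.print(out);
    return out;
}

// Analogue of an indexed layer() accessor: walk I levels into the layer
// stack and return a reference to that layer.
template <int I, typename NET>
auto& layer(NET& net) {
    if constexpr (I == 0)
        return net;
    else
        return layer<I - 1>(net.subnet);
}
```

For example, with `fc_layer<relu_layer<fc_layer<input_layer>>> net;`, `std::cout << net;` prints `fc`, `relu`, `fc`, `input` on separate lines, and `layer<1>(net)` returns a reference to the relu layer.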
- 08 Feb, 2016 (1 commit) Davis King authored
- 30 Jan, 2016 (1 commit) Davis King authored: ...objects. Then made the relevant parts of the code use these functions.
- 13 Dec, 2015 (1 commit) Davis King authored
- 21 Nov, 2015 (1 commit) Davis King authored: ...than adding them. This way, the gradient buffer can be used as scratch space during the loss computation.
- 20 Nov, 2015 (1 commit) Davis King authored
- 18 Nov, 2015 (1 commit) Davis King authored
- 08 Nov, 2015 (1 commit) Davis King authored
- 03 Nov, 2015 (1 commit) Davis King authored: ...that contain a different number of samples than their input tensors.
- 18 Oct, 2015 (1 commit) Davis King authored
- 15 Oct, 2015 (1 commit) Davis King authored
- 28 Sep, 2015 (1 commit) Davis King authored
- 26 Sep, 2015 (1 commit) Davis King authored
- 25 Sep, 2015 (1 commit) Davis King authored
- 23 Sep, 2015 (4 commits) Davis King authored all four