Commit 51ebcfc7 authored by Davis King

Said backward() is allowed to reuse computations cached during the forward() computation.

parent 91f512ec
@@ -177,6 +177,9 @@ namespace dlib
         - SUBNET implements the SUBNET interface defined at the top of this file.
         - setup() has been called.
         - computed_output is the tensor resulting from calling forward(sub,computed_output).
+          Moreover, this was the most recent call to forward().  This means that
+          backward() is allowed to cache intermediate results computed during
+          forward() and use them for the backward computation.
         - have_same_dimensions(gradient_input, computed_output)
         - have_same_dimensions(sub.get_gradient_input(), sub.get_output()) == true
         - have_same_dimensions(params_grad, get_layer_params()) == true