OpenDAS / dlib
Commit b87ecad5 authored Dec 18, 2016 by Davis King
Improved example

parent fd132304
Showing 1 changed file with 7 additions and 3 deletions
examples/dnn_metric_learning_on_images_ex.cpp @ b87ecad5
...
@@ -288,8 +288,12 @@ int main(int argc, char** argv)
     dlib::rand rnd(time(0));
     load_mini_batch(5, 5, rnd, objs, images, labels);
 
+    // Normally you would use the non-batch-normalized version of the network to do
+    // testing, which is what we do here.
+    anet_type testing_net = net;
+
     // Run all the images through the network to get their vector embeddings.
-    std::vector<matrix<float,0,1>> embedded = net(images);
+    std::vector<matrix<float,0,1>> embedded = testing_net(images);
 
     // Now, check if the embedding puts images with the same labels near each other and
     // images with different labels far apart.
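The change above relies on dlib letting you convert a network built with bn_con (batch-normalization) layers into one built with affine layers by plain assignment; in the example, anet_type is the affine counterpart of the training network type. Below is a minimal, hypothetical sketch of that pattern; the layer stack is invented for illustration and is not the example's real architecture.

#include <dlib/dnn.h>
using namespace dlib;

// Hypothetical training network: uses bn_con (batch normalization) layers.
using train_net_type = loss_metric<fc_no_bias<32,
                           relu<bn_con<con<16,5,5,2,2,
                           input_rgb_image>>>>>;

// Matching testing network: identical structure, but affine in place of bn_con.
using test_net_type  = loss_metric<fc_no_bias<32,
                           relu<affine<con<16,5,5,2,2,
                           input_rgb_image>>>>>;

int main()
{
    train_net_type net;               // would normally be trained first
    // Assignment converts each bn_con layer's learned batch-norm statistics
    // into a fixed affine transform in the testing network.
    test_net_type testing_net = net;
}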
...
@@ -304,14 +308,14 @@ int main(int argc, char** argv)
                 // The loss_metric layer will cause images with the same label to be less
                 // than net.loss_details().get_distance_threshold() distance from each
                 // other.  So we can use that distance value as our testing threshold.
-                if (length(embedded[i]-embedded[j]) < net.loss_details().get_distance_threshold())
+                if (length(embedded[i]-embedded[j]) < testing_net.loss_details().get_distance_threshold())
                     ++num_right;
                 else
                     ++num_wrong;
             }
             else
             {
-                if (length(embedded[i]-embedded[j]) >= net.loss_details().get_distance_threshold())
+                if (length(embedded[i]-embedded[j]) >= testing_net.loss_details().get_distance_threshold())
                     ++num_right;
                 else
                     ++num_wrong;
...
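For the threshold test in the second hunk, here is a minimal sketch of how the learned distance threshold classifies same-label and different-label pairs. It assumes embedded, labels, and testing_net are built as in the example; the helper name count_pairs is made up for illustration.

#include <dlib/dnn.h>
#include <utility>
#include <vector>

// Count how many pairs the learned threshold classifies correctly:
// same-label pairs should be closer than the threshold, different-label
// pairs at least that far apart.
template <typename net_type>
std::pair<int,int> count_pairs(
    const std::vector<dlib::matrix<float,0,1>>& embedded,
    const std::vector<unsigned long>& labels,
    const net_type& testing_net
)
{
    const double thresh = testing_net.loss_details().get_distance_threshold();
    int num_right = 0, num_wrong = 0;
    for (size_t i = 0; i < embedded.size(); ++i)
    {
        for (size_t j = i+1; j < embedded.size(); ++j)
        {
            const double dist = dlib::length(embedded[i] - embedded[j]);
            const bool same = (labels[i] == labels[j]);
            if ((same && dist < thresh) || (!same && dist >= thresh))
                ++num_right;
            else
                ++num_wrong;
        }
    }
    return {num_right, num_wrong};
}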