ModelZoo / ResNet50_tensorflow

Commit dd933006

Update keypoint estimation to always use a safe k in top_k

Authored Oct 19, 2021 by A. Unique TensorFlower
Committed by TF Object Detection Team, Oct 19, 2021

PiperOrigin-RevId: 404311441
parent b13db259
Changes: 1 changed file with 14 additions and 1 deletion.
research/object_detection/meta_architectures/center_net_meta_arch.py (+14, -1)
...
@@ -301,8 +301,21 @@ def top_k_feature_map_locations(feature_map, max_pool_kernel_size=3, k=100,
                                                   perm=[0, 3, 1, 2])
       feature_map_peaks_transposed = tf.reshape(
           feature_map_peaks_transposed, [batch_size, num_channels, -1])
+      # safe_k will be used whenever there are fewer positions in the heatmap
+      # than the requested number of locations to score. In that case, all
+      # positions are returned in sorted order. To ensure consistent shapes for
+      # downstream ops the outputs are padded with zeros. Safe_k is also
+      # fine for TPU because TPUs require a fixed input size so the number of
+      # positions will also be fixed.
+      safe_k = tf.minimum(k, tf.shape(feature_map_peaks_transposed)[-1])
       scores, peak_flat_indices = tf.math.top_k(
-          feature_map_peaks_transposed, k=k)
+          feature_map_peaks_transposed, k=safe_k)
+      scores = tf.pad(scores, [(0, 0), (0, 0), (0, k - safe_k)])
+      peak_flat_indices = tf.pad(peak_flat_indices,
+                                 [(0, 0), (0, 0), (0, k - safe_k)])
+      scores = tf.ensure_shape(scores, (batch_size, num_channels, k))
+      peak_flat_indices = tf.ensure_shape(peak_flat_indices,
+                                          (batch_size, num_channels, k))
     # Convert the indices such that they represent the location in the full
     # (flattened) feature map of size [batch, height * width * channels].
     channel_idx = tf.range(num_channels)[tf.newaxis, :, tf.newaxis]
...
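
For context, the pattern introduced above can be exercised in isolation. The sketch below is illustrative only: it assumes TensorFlow 2.x, and the helper name safe_top_k, the rank-2 restriction, and the toy inputs are not part of the Object Detection API. It clamps k with tf.minimum, runs tf.math.top_k on the clamped value, then zero-pads and pins the static shape so downstream ops always see a fixed width of k.

import tensorflow as tf

def safe_top_k(values, k):
  """Top-k over the last axis that tolerates k larger than that axis.

  tf.math.top_k requires k <= the size of the last dimension. Here k is
  clamped first, and the results are zero-padded back to width k so that
  consumers always see a fixed [..., k] shape. Assumes rank-2 input for
  brevity; the commit itself handles rank-3 [batch, channels, positions].
  """
  safe_k = tf.minimum(k, tf.shape(values)[-1])
  scores, indices = tf.math.top_k(values, k=safe_k)
  # Zero-pad the last axis back out to the requested width k.
  paddings = [(0, 0), (0, k - safe_k)]
  scores = tf.pad(scores, paddings)
  indices = tf.pad(indices, paddings)
  # Pin the static last dimension to k, mirroring the tf.ensure_shape calls
  # in the commit.
  scores = tf.ensure_shape(scores, [None, k])
  indices = tf.ensure_shape(indices, [None, k])
  return scores, indices

# Toy heatmap with only 3 candidate positions per row but 5 locations
# requested; plain tf.math.top_k would fail, the clamped variant pads instead.
heatmap = tf.constant([[0.9, 0.1, 0.5],
                       [0.2, 0.8, 0.3]])
scores, indices = safe_top_k(heatmap, k=5)
print(scores.shape, indices.shape)  # (2, 5) (2, 5)

Zero-padding (rather than returning a smaller tensor) keeps the output shape independent of the input, which is what the commit message means by a "safe k": the same graph works when the heatmap has fewer positions than the requested number of locations, including on TPU where shapes must be static.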