Bw-bestperf / SAM · Commits

Commit 06bd20da
Authored Apr 10, 2023 by Eric Mintun

    Fix lint.

Parent: ca981bf3
Showing 2 changed files with 5 additions and 5 deletions.
segment_anything/modeling/image_encoder.py  +4 -4
segment_anything/utils/onnx.py              +1 -1
segment_anything/modeling/image_encoder.py

@@ -144,8 +144,8 @@ class Block(nn.Module):
             rel_pos_zero_init (bool): If True, zero initialize relative positional parameters.
             window_size (int): Window size for window attention blocks. If it equals 0, then
                 use global attention.
-            input_size (tuple(int, int) or None): Input resolution for calculating the relative positional
-                parameter size.
+            input_size (tuple(int, int) or None): Input resolution for calculating the relative
+                positional parameter size.
         """
         super().__init__()
         self.norm1 = norm_layer(dim)
@@ -201,8 +201,8 @@ class Attention(nn.Module):
             qkv_bias (bool): If True, add a learnable bias to query, key, value.
             rel_pos (bool): If True, add relative positional embeddings to the attention map.
             rel_pos_zero_init (bool): If True, zero initialize relative positional parameters.
-            input_size (tuple(int, int) or None): Input resolution for calculating the relative positional
-                parameter size.
+            input_size (tuple(int, int) or None): Input resolution for calculating the relative
+                positional parameter size.
         """
         super().__init__()
         self.num_heads = num_heads
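The docstrings touched here describe window attention: when window_size is nonzero, the block attends within local windows of that size instead of globally, and input_size is needed to size the relative positional parameters. As a minimal sketch of the partitioning step (this mirrors the shape logic of the repo's window_partition helper but is reimplemented here from the docstring, so treat the details as an assumption):

```python
import torch
import torch.nn.functional as F


def window_partition(x: torch.Tensor, window_size: int):
    """Split a (B, H, W, C) feature map into non-overlapping windows.

    Pads H and W up to multiples of window_size, then returns windows of
    shape (num_windows * B, window_size, window_size, C) plus the padded size.
    """
    B, H, W, C = x.shape
    pad_h = (window_size - H % window_size) % window_size
    pad_w = (window_size - W % window_size) % window_size
    if pad_h or pad_w:
        # F.pad pads from the last dim backward: (C_lo, C_hi, W_lo, W_hi, H_lo, H_hi)
        x = F.pad(x, (0, 0, 0, pad_w, 0, pad_h))
    Hp, Wp = H + pad_h, W + pad_w
    x = x.view(B, Hp // window_size, window_size, Wp // window_size, window_size, C)
    windows = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, window_size, window_size, C)
    return windows, (Hp, Wp)


# Example: a 10x10 map with window_size=7 pads to 14x14 and yields 4 windows.
windows, (Hp, Wp) = window_partition(torch.zeros(1, 10, 10, 8), 7)
```

Attention is then computed within each window independently, which keeps the cost linear in the number of windows rather than quadratic in the full spatial resolution.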
segment_anything/utils/onnx.py

@@ -82,7 +82,7 @@ class SamOnnxModel(nn.Module):
         )
         prepadded_size = self.resize_longest_image_size(orig_im_size, self.img_size).to(torch.int64)
-        masks = masks[..., : prepadded_size[0], : prepadded_size[1]]
+        masks = masks[..., : prepadded_size[0], : prepadded_size[1]]  # type: ignore
         orig_im_size = orig_im_size.to(torch.int64)
         h, w = orig_im_size[0], orig_im_size[1]
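For context on the line being changed: prepadded_size is the image's size inside the square padded model input, and the slice crops the predicted masks back to that region before resizing to the original image size. A self-contained sketch of that computation, with resize_longest_image_size reimplemented from its usage here (an assumption, not a verbatim copy of the helper in this file):

```python
import torch


def resize_longest_image_size(input_image_size: torch.Tensor, longest_side: int) -> torch.Tensor:
    """Scale (h, w) so the longer side equals longest_side, rounding to nearest int.

    Reimplemented from context as an assumption; not copied from the repo.
    """
    input_image_size = input_image_size.to(torch.float32)
    scale = longest_side / torch.max(input_image_size)
    return torch.floor(scale * input_image_size + 0.5).to(torch.int64)


# Example: a 480x640 image scaled into a 1024-square padded canvas occupies
# the top-left 768x1024 region, so the masks are cropped to that region.
orig_im_size = torch.tensor([480, 640])
prepadded_size = resize_longest_image_size(orig_im_size, 1024)
masks = torch.zeros(1, 1, 1024, 1024)
masks = masks[..., : prepadded_size[0], : prepadded_size[1]]
```

The `# type: ignore` added by this commit silences a static type-checker complaint about slicing with tensor indices; it does not change runtime behavior, which is consistent with the commit message "Fix lint."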