chenpangpang / transformers

Unverified commit d4d62846, authored Apr 25, 2023 by Younes Belkada, committed by GitHub on Apr 25, 2023.

[`SAM`] Add sam doc (#22984)

* add sam doc
* fixes
* multiple fixes

Parent: f0f5e28f
Showing 1 changed file with 25 additions and 0 deletions:

src/transformers/models/sam/modeling_sam.py (+25, -0)
src/transformers/models/sam/modeling_sam.py @ d4d62846
@@ -1270,6 +1270,31 @@ class SamModel(SamPreTrainedModel):
         return_dict=None,
         **kwargs,
     ) -> List[Dict[str, torch.Tensor]]:
+        r"""
+        Example:
+
+        ```python
+        >>> from PIL import Image
+        >>> import requests
+        >>> from transformers import AutoModel, AutoProcessor
+
+        >>> model = AutoModel.from_pretrained("facebook/sam-vit-base")
+        >>> processor = AutoProcessor.from_pretrained("facebook/sam-vit-base")
+
+        >>> img_url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/sam-car.png"
+        >>> raw_image = Image.open(requests.get(img_url, stream=True).raw).convert("RGB")
+        >>> input_points = [[[400, 650]]]  # 2D location of a window on the car
+        >>> inputs = processor(images=raw_image, input_points=input_points, return_tensors="pt")
+
+        >>> # Get segmentation mask
+        >>> outputs = model(**inputs)
+
+        >>> # Postprocess masks
+        >>> masks = processor.post_process_masks(
+        ...     outputs.pred_masks, inputs["original_sizes"], inputs["reshaped_input_sizes"]
+        ... )
+        ```
+        """
         output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions
         output_hidden_states = (
             output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
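The unchanged context lines below the added docstring show a convention used throughout `transformers` forward methods: a per-call argument falls back to the model config when left as `None`. A minimal self-contained sketch of that fallback pattern follows; the `DummyConfig` and `DummyModel` names are illustrative stand-ins, not part of `modeling_sam.py`:

```python
from dataclasses import dataclass


@dataclass
class DummyConfig:
    # Illustrative stand-in for a transformers PretrainedConfig.
    output_attentions: bool = False
    output_hidden_states: bool = True


class DummyModel:
    def __init__(self, config: DummyConfig):
        self.config = config

    def forward(self, output_attentions=None, output_hidden_states=None):
        # Same idiom as in SamModel.forward: an explicit argument wins,
        # and None means "use the config default".
        output_attentions = (
            output_attentions if output_attentions is not None else self.config.output_attentions
        )
        output_hidden_states = (
            output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
        )
        return output_attentions, output_hidden_states


model = DummyModel(DummyConfig())
print(model.forward())                        # config defaults: (False, True)
print(model.forward(output_attentions=True))  # explicit override: (True, True)
```

Because `False` is a valid explicit value, the check must be `is not None` rather than truthiness; otherwise a caller could never turn a config-enabled flag off for a single call.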