chenpangpang / transformers · Commits · 04028317
"web/app/git@developer.sourcefind.cn:orangecat/ollama.git" did not exist on "39e946f25668324d2be2382d20013b9ff2cbf618"
Commit 04028317 (Unverified), authored Jun 14, 2021 by Stas Bekman, committed by GitHub on Jun 14, 2021
consistent nn. and nn.functional: part 5 docs (#12161)
parent 88e84186
Showing 5 changed files with 9 additions and 9 deletions
docs/source/add_new_model.rst         +1 -1
docs/source/main_classes/trainer.rst  +2 -2
docs/source/migration.md              +2 -2
docs/source/quicktour.rst             +2 -2
docs/source/task_summary.rst          +2 -2
docs/source/add_new_model.rst

@@ -518,7 +518,7 @@ PyTorch, called ``SimpleModel`` as follows:
 .. code:: python

-    import torch.nn as nn
+    from torch import nn

     class SimpleModel(nn.Module):
         def __init__(self):
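After this change, the ``SimpleModel`` example builds entirely on the ``nn`` namespace bound by ``from torch import nn``. A minimal sketch of the pattern the hunk leaves behind (the layer shapes here are illustrative, not quoted from the file):

    from torch import nn

    class SimpleModel(nn.Module):
        def __init__(self):
            super().__init__()
            # toy layers; the real example in add_new_model.rst may use different shapes
            self.dense = nn.Linear(10, 10)
            self.intermediate = nn.Linear(10, 10)
            self.layer_norm = nn.LayerNorm(10)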
docs/source/main_classes/trainer.rst

@@ -59,7 +59,7 @@ classification:
 .. code-block:: python

-    import torch
+    from torch import nn
     from transformers import Trainer

     class MultilabelTrainer(Trainer):
@@ -67,7 +67,7 @@ classification:
             labels = inputs.pop("labels")
             outputs = model(**inputs)
             logits = outputs.logits
-            loss_fct = torch.nn.BCEWithLogitsLoss()
+            loss_fct = nn.BCEWithLogitsLoss()
             loss = loss_fct(logits.view(-1, self.model.config.num_labels),
                             labels.float().view(-1, self.model.config.num_labels))
             return (loss, outputs) if return_outputs else loss
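The two trainer.rst hunks only change how the loss class is spelled; ``from torch import nn`` binds the same module object as ``torch.nn``, so ``nn.BCEWithLogitsLoss`` behaves identically. A small self-contained check of that assumption (tensor shapes are illustrative):

    import torch
    from torch import nn

    # `from torch import nn` exposes the same module object as `torch.nn`,
    # so this commit is purely a spelling change in the docs
    assert nn is torch.nn

    logits = torch.randn(4, 3)                    # batch of 4 examples, 3 labels
    labels = torch.randint(0, 2, (4, 3)).float()  # multi-hot targets

    loss_fct = nn.BCEWithLogitsLoss()             # sigmoid + binary cross-entropy per label
    print(loss_fct(logits, labels).item())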
docs/source/migration.md
docs/source/quicktour.rst

@@ -265,8 +265,8 @@ Let's apply the SoftMax activation to get predictions.
 .. code-block::

     >>> ## PYTORCH CODE
-    >>> import torch.nn.functional as F
-    >>> pt_predictions = F.softmax(pt_outputs.logits, dim=-1)
+    >>> from torch import nn
+    >>> pt_predictions = nn.functional.softmax(pt_outputs.logits, dim=-1)
     >>> ## TENSORFLOW CODE
     >>> import tensorflow as tf
     >>> tf.nn.softmax(tf_outputs.logits, axis=-1)
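The quicktour hunk swaps the ``F`` alias for the fully spelled ``nn.functional``; both names refer to the same function, so the predictions are unchanged. A quick sketch with a stand-in logits tensor (the real example uses ``pt_outputs.logits`` from the model call above it):

    import torch
    from torch import nn

    logits = torch.randn(2, 2)   # stand-in for pt_outputs.logits

    pt_predictions = nn.functional.softmax(logits, dim=-1)

    # same function as the old `import torch.nn.functional as F; F.softmax(...)`
    assert torch.equal(pt_predictions, torch.nn.functional.softmax(logits, dim=-1))
    print(pt_predictions)        # each row sums to 1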
docs/source/task_summary.rst

@@ -451,7 +451,7 @@ of tokens.
     >>> ## PYTORCH CODE
     >>> from transformers import AutoModelWithLMHead, AutoTokenizer, top_k_top_p_filtering
     >>> import torch
-    >>> from torch.nn import functional as F
+    >>> from torch import nn

     >>> tokenizer = AutoTokenizer.from_pretrained("gpt2")
     >>> model = AutoModelWithLMHead.from_pretrained("gpt2")
@@ -467,7 +467,7 @@ of tokens.
     >>> filtered_next_token_logits = top_k_top_p_filtering(next_token_logits, top_k=50, top_p=1.0)

     >>> # sample
-    >>> probs = F.softmax(filtered_next_token_logits, dim=-1)
+    >>> probs = nn.functional.softmax(filtered_next_token_logits, dim=-1)
     >>> next_token = torch.multinomial(probs, num_samples=1)
     >>> generated = torch.cat([input_ids, next_token], dim=-1)
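Putting the two task_summary.rst hunks together, the sampling step now reads ``nn.functional.softmax`` instead of ``F.softmax``. A runnable sketch of just the filter-and-sample part, using the same ``top_k_top_p_filtering`` helper the docs import from ``transformers`` but random scores in place of GPT-2 logits so no model download is needed (the vocabulary size is illustrative):

    import torch
    from torch import nn
    from transformers import top_k_top_p_filtering

    next_token_logits = torch.randn(1, 50257)   # stand-in for GPT-2 next-token scores

    # keep only the 50 highest-scoring tokens; top_p=1.0 disables nucleus filtering
    filtered_next_token_logits = top_k_top_p_filtering(next_token_logits, top_k=50, top_p=1.0)

    # filtered positions are set to -inf, so softmax assigns them zero probability
    probs = nn.functional.softmax(filtered_next_token_logits, dim=-1)
    next_token = torch.multinomial(probs, num_samples=1)
    print(next_token)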