Unverified Commit dfa4c26b authored by Katarina Slama's avatar Katarina Slama Committed by GitHub

Typo and fix the input of labels to `cross_entropy` (#7841)

The current version caused some errors; these changes fixed them for me. Hope this is helpful!
parent a5a8eeb7
```diff
@@ -109,9 +109,9 @@ The following is equivalent to the previous example:
 .. code-block:: python

     from torch.nn import functional as F
-    labels = torch.tensor([1,0]).unsqueeze(0)
+    labels = torch.tensor([1,0])
     outputs = model(input_ids, attention_mask=attention_mask)
-    loss = F.cross_entropy(labels, outputs.logitd)
+    loss = F.cross_entropy(outputs.logits, labels)
     loss.backward()
     optimizer.step()
```
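A minimal standalone sketch (not part of the PR; the logits and labels below are made-up values) of why the argument order matters: `F.cross_entropy` expects `(input, target)`, where `input` is a float tensor of raw logits with shape `(batch, num_classes)` and `target` is an integer tensor of class indices with shape `(batch,)`.

```python
import torch
from torch.nn import functional as F

# Two examples, two classes: raw logits, shape (2, 2)
logits = torch.tensor([[2.0, 0.5], [0.1, 1.5]])
# One class index per example, shape (2,)
labels = torch.tensor([1, 0])

# Correct order: logits first, labels second
loss = F.cross_entropy(logits, labels)
print(loss.item())
```

Passing the arguments the other way around (as in the old line `F.cross_entropy(labels, ...)`) hands an integer tensor where a float logits tensor is expected, which is what triggered the errors this PR fixes.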