Commit a6f412da authored by Christopher Goh's avatar Christopher Goh

Fixed typo in migration guide

parent 4fc9f9ef
@@ -314,7 +314,7 @@ loss = outputs[0]
 # In pytorch-transformers you can also have access to the logits:
 loss, logits = outputs[:2]
-# And even the attention weigths if you configure the model to output them (and other outputs too, see the docstrings and documentation)
+# And even the attention weights if you configure the model to output them (and other outputs too, see the docstrings and documentation)
 model = BertForSequenceClassification.from_pretrained('bert-base-uncased', output_attentions=True)
 outputs = model(input_ids, labels=labels)
 loss, logits, attentions = outputs